Ohm’s law

Ohm’s law states that the current through a conductor between two points is directly proportional to the potential difference across the two points. Introducing the constant of proportionality, the resistance,[1] one arrives at the usual mathematical equation that describes this relationship:[2]

I = V/R

where I is the current through the conductor in units of amperes, V is the potential difference measured across the conductor in units of volts, and R is the resistance of the conductor in units of ohms. More specifically, Ohm’s law states that the R in this relation is constant, independent of the current.[3]
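The relation above can be sketched directly in code. A minimal illustration (the battery and resistor values are made-up numbers for the example):

```python
# Minimal sketch of Ohm's law, V = I * R: given any two of the
# three quantities, the third follows directly.

def current(voltage_v, resistance_ohm):
    """Current in amperes through a resistor (I = V / R)."""
    return voltage_v / resistance_ohm

def resistance(voltage_v, current_a):
    """Resistance in ohms inferred from a measured V and I (R = V / I)."""
    return voltage_v / current_a

# Example: a 9 V source across a 450-ohm resistor drives 20 mA.
print(current(9.0, 450.0))    # 0.02 (amperes)
print(resistance(9.0, 0.02))  # 450.0 (ohms)
```

Note that this linear relation holds only for ohmic materials, where R is constant over the operating range, as stated above.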





Riemann hypothesis

In mathematics, the Riemann hypothesis, proposed by Bernhard Riemann (1859), is a conjecture about the location of the nontrivial zeros of the Riemann zeta function, which states that all non-trivial zeros (as defined below) have real part 1/2. The name is also used for some closely related analogues, such as the Riemann hypothesis for curves over finite fields.

The Riemann hypothesis implies results about the distribution of prime numbers that are in some ways as good as possible. Along with suitable generalizations, it is considered by some mathematicians to be the most important unresolved problem in pure mathematics (Bombieri 2000). The Riemann hypothesis is part of Problem 8, along with the Goldbach conjecture, in Hilbert’s list of 23 unsolved problems, and is also one of the Clay Mathematics Institute Millennium Prize Problems.

The Riemann zeta function ζ(s) is defined for all complex numbers s ≠ 1 with a simple pole at s = 1. It has zeros at the negative even integers (i.e. at s = −2, −4, −6, …). These are called the trivial zeros. The Riemann hypothesis is concerned with the non-trivial zeros, and states that:
The real part of any non-trivial zero of the Riemann zeta function is 1/2.

Thus the non-trivial zeros should lie on the critical line, 1/2 + i t, where t is a real number and i is the imaginary unit.
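As an illustrative numerical check (not a proof of anything), ζ(s) for Re(s) > 0, s ≠ 1, can be computed from the alternating Dirichlet eta series, η(s) = Σ (−1)^(n−1) n^(−s), via ζ(s) = η(s)/(1 − 2^(1−s)). Evaluating near the first known nontrivial zero on the critical line, t ≈ 14.134725, gives a value close to zero. The truncation length is an assumption chosen only for this rough sketch; plain truncation of the series converges slowly.

```python
def zeta(s, terms=200_000):
    """Approximate the Riemann zeta function for Re(s) > 0, s != 1,
    via the alternating (Dirichlet eta) series:
        eta(s)  = sum_{n>=1} (-1)^(n-1) / n^s
        zeta(s) = eta(s) / (1 - 2^(1-s))
    Plain truncation; accuracy is modest but enough for a sanity check."""
    eta = sum((-1) ** (n - 1) * n ** (-s) for n in range(1, terms + 1))
    return eta / (1 - 2 ** (1 - s))

# Sanity check at a known value: zeta(2) = pi^2 / 6 = 1.6449...
print(zeta(2))

# Near the first nontrivial zero on the critical line,
# s = 1/2 + 14.134725...i, |zeta(s)| should be close to 0
# (the truncation error dominates what remains).
s0 = complex(0.5, 14.134725)
print(abs(zeta(s0)))
```

Checking any finite number of zeros this way says nothing about the hypothesis itself, which concerns all infinitely many non-trivial zeros.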







Entropy

Entropy is an extensive thermodynamic property measuring the thermal energy of a system per unit temperature that is unavailable for doing useful work. In thermodynamics, entropy has the dimension of energy divided by temperature, with the unit joules per kelvin (J/K) in the International System of Units. The term entropy was coined in 1865 by Rudolf Clausius from the Greek εντροπία [entropía], "a turning toward", from εν- [en-] (in) and τροπή [tropē] (turn, conversion).[2][note 2]

Following the laws of thermodynamics, the entropy of an isolated system never decreases, and in heat-transfer situations, heat energy is transferred from higher-temperature components to lower-temperature components. In thermally isolated systems, entropy runs in one direction only (the process is not reversible). One can measure the entropy of a system to determine the energy unavailable for work in a thermodynamic process, such as energy conversion, engines, or machines. Such processes and devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work, entropy accumulates in the system and is then dissipated in the form of waste heat.

Entropy is an abstract concept, but a tangible example helps. Imagine a closed system containing two separated masses at significantly different temperatures. Some of the thermal energy can be converted into mechanical energy by adding a small heat engine between them. If instead the two masses are brought into contact, the heat of the warmer mass flows freely to the cooler mass until both reach some uniform intermediate temperature. After this heat flow the entropy of the system has increased, and the entropy multiplied by the intermediate temperature equals the thermal energy of the system, so none of the thermal energy remains available for conversion into mechanical work. When the two masses were still separated at different temperatures, the energy was ordered, sorted into the two separate masses; after the heat flow, the overall energy is more disordered. Other definitions of entropy focus on this abstract notion of disorder.

In computing the value of a change of entropy, a simple relationship, illustrated by a Carnot-cycle isotherm, is that the change in entropy ΔS equals the heat Q reversibly added to or removed from the working substance of the engine divided by the constant temperature T, i.e. ΔS = Q/T.
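The two-mass example and the ΔS = Q/T relation can be made concrete with numbers. In this sketch the materials and temperatures are illustrative assumptions: two equal 1 kg masses of water (c ≈ 4186 J/(kg·K)) starting at 300 K and 400 K, which for equal masses and constant c equilibrate at the mean temperature.

```python
import math

# Illustrative assumptions: two 1 kg masses of water,
# c = 4186 J/(kg*K), starting at 300 K and 400 K.
m, c = 1.0, 4186.0
t_hot, t_cold = 400.0, 300.0

# Carnot-isotherm relation dS = Q / T: reversibly adding
# Q = 1000 J to a working substance held at 350 K.
q = 1000.0
ds_isothermal = q / 350.0
print(ds_isothermal)  # ~2.857 J/K

# Free (irreversible) equilibration of the two masses to the
# common temperature T_f = (T_hot + T_cold) / 2.
# Each mass contributes m * c * ln(T_f / T_initial).
t_f = (t_hot + t_cold) / 2
ds_total = m * c * math.log(t_f / t_hot) + m * c * math.log(t_f / t_cold)
print(ds_total)  # positive (~86 J/K): entropy of the whole system increases
```

The hot mass loses entropy and the cold mass gains more than that, so the total is positive, as the second law requires for this irreversible heat flow.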






