Entropy

Common symbols: S
SI unit: joules per kelvin (J⋅K⁻¹)
In SI base units: kg⋅m²⋅s⁻²⋅K⁻¹

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.[1]

Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time. As a result, isolated systems evolve toward thermodynamic equilibrium, where the entropy is highest. A consequence of the second law of thermodynamics is that certain processes are irreversible.

The thermodynamic concept was referred to by Scottish scientist and engineer William Rankine in 1850 with the names thermodynamic function and heat-potential.[2] In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation.[3]
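In modern notation, Clausius's quotient is usually written as follows, where δQ_rev is an infinitesimal amount of heat transferred reversibly to the system at absolute temperature T:

```latex
dS = \frac{\delta Q_{\text{rev}}}{T},
\qquad
\Delta S = S_B - S_A = \int_A^B \frac{\delta Q_{\text{rev}}}{T}
```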

Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics. He found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law whose proportionality constant, the Boltzmann constant, has become one of the defining universal constants of the modern International System of Units (SI).
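The "simple logarithmic law" in question is Boltzmann's entropy formula, relating the entropy S of a macrostate to the number W of microstates compatible with it:

```latex
S = k_{\mathrm{B}} \ln W,
\qquad
k_{\mathrm{B}} = 1.380\,649 \times 10^{-23}\ \mathrm{J\,K^{-1}}\ \text{(exact in the 2019 SI)}
```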

  1. ^ Wehrl, Alfred (1 April 1978). "General properties of entropy". Reviews of Modern Physics. 50 (2): 221–260. Bibcode:1978RvMP...50..221W. doi:10.1103/RevModPhys.50.221.
  2. ^ Truesdell, C. (1980). The Tragicomical History of Thermodynamics, 1822–1854. New York: Springer-Verlag. p. 215. ISBN 0387904034 – via Internet Archive.
  3. ^ Brush, S. G. (1976). The Kind of Motion We Call Heat: A History of the Kinetic Theory of Gases in the 19th Century, Book 2: Statistical Physics and Irreversible Processes. Amsterdam: Elsevier. pp. 576–577. ISBN 0-444-87009-1.

22 related entries for: Entropy information

Entropy (word count: 13,924)

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used...

Second law of thermodynamics (word count: 15,498)

The second law of thermodynamics establishes the concept of entropy as a physical property of a thermodynamic system. It predicts whether processes...

Entropy unit (word count: 71)

The entropy unit is a non-SI unit of thermodynamic entropy, usually denoted "e.u." or "eU" and equal to one calorie per kelvin per mole, or 4.184 joules...
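The conversion stated for the entropy unit (1 e.u. = 1 cal⋅K⁻¹⋅mol⁻¹ = 4.184 J⋅K⁻¹⋅mol⁻¹) is simple enough to capture in code; a minimal sketch, with function names chosen ad hoc for illustration:

```python
CAL_TO_JOULE = 4.184  # thermochemical calorie in joules, exact by definition

def eu_to_si(entropy_eu: float) -> float:
    """Convert entropy units, cal/(K*mol), to SI units, J/(K*mol)."""
    return entropy_eu * CAL_TO_JOULE

def si_to_eu(entropy_si: float) -> float:
    """Convert J/(K*mol) back to entropy units."""
    return entropy_si / CAL_TO_JOULE
```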

Entropy and life (word count: 8,459)

Research concerning the relationship between the thermodynamic quantity entropy and both the origin and evolution of life began around the turn of the...

Social entropy (word count: 195)

Social entropy is a sociological theory that evaluates social behaviours using a method based on the second law of thermodynamics. The equivalent of entropy...

Information theory (word count: 7,088)

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random...
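The uncertainty measure described in the Information theory entry is Shannon entropy, H(X) = −Σᵢ pᵢ log₂ pᵢ. A minimal self-contained sketch (base-2 logarithm, so the result is in bits):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p) in bits.

    Zero-probability outcomes contribute nothing (lim p->0 of p log p is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin strictly less.
fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # about 0.47 bits
```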

Heat death of the universe (word count: 3,347)

In this scenario the universe has no remaining free energy, and will therefore be unable to sustain processes that increase entropy. Heat death does not imply any particular absolute temperature; it only...

Boltzmann constant (word count: 2,776)

The Boltzmann constant appears in Planck's law of black-body radiation and in Boltzmann's entropy formula, and is used in calculating thermal noise in resistors...
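The thermal-noise use of the Boltzmann constant is the Johnson–Nyquist formula, v_rms = √(4·k_B·T·R·Δf). A sketch with illustrative numbers (the resistor, temperature, and bandwidth values are arbitrary examples):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K, exact in the 2019 SI

def johnson_noise_vrms(resistance_ohm, temperature_k, bandwidth_hz):
    """RMS thermal-noise voltage across a resistor: sqrt(4 * kB * T * R * df)."""
    return math.sqrt(4 * K_B * temperature_k * resistance_ohm * bandwidth_hz)

# A 10 kilo-ohm resistor at room temperature (300 K) over a 10 kHz bandwidth
# produces on the order of a microvolt of thermal noise.
v = johnson_noise_vrms(10e3, 300.0, 10e3)
```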

Entropy coding (word count: 475)

In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound declared...
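The lower bound in question is the source entropy: by Shannon's source-coding theorem, no lossless code can average fewer bits per symbol than the entropy of a memoryless source. A sketch comparing that bound with what a general-purpose compressor achieves (zlib is used here purely as an illustration; its Huffman stage is an entropy coder):

```python
import math
import random
import zlib
from collections import Counter

def entropy_bits_per_symbol(data: bytes) -> float:
    """Empirical (order-0) entropy of a byte string, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A memoryless source over four symbols with probabilities 1/2, 1/4, 1/8, 1/8
# has entropy exactly 1.75 bits/symbol, so no lossless code can average fewer
# than 1.75 * n / 8 bytes on its output. (The order-0 bound only applies to
# memoryless sources; compressors can beat it on data with structure.)
rng = random.Random(0)
data = bytes(rng.choices(b"abcd", weights=[4, 2, 1, 1], k=10_000))

bound_bytes = entropy_bits_per_symbol(data) * len(data) / 8
zlib_bytes = len(zlib.compress(data, 9))
```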

Maximum entropy (word count: 99)

Maximum entropy may refer to: maximum entropy thermodynamics, maximum entropy spectral estimation, the principle of maximum entropy, maximum entropy probability distributions, or maximum entropy classifiers...

Entropy as an arrow of time (word count: 5,020)

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one...

Hardware random number generator (word count: 3,204)

...relies on a physical process capable of producing entropy (in other words, the device always has access to a physical entropy source), unlike the pseudorandom number...

Black hole thermodynamics (word count: 3,990)

The second law of thermodynamics requires that black holes have entropy. If black holes carried no entropy, it would be possible to violate the second law by throwing...

Temperature (word count: 12,973)

...including the macroscopic entropy, though microscopically referable to the Gibbs statistical-mechanical definition of entropy for the canonical ensemble...

Third law of thermodynamics (word count: 2,975)

The third law of thermodynamics states that the entropy of a closed system at thermodynamic equilibrium approaches a constant value when its temperature...

Conditional entropy (word count: 2,071)

In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable Y...
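Conditional entropy satisfies the chain rule H(Y|X) = H(X,Y) − H(X), which makes it straightforward to compute from a joint distribution; a minimal sketch over a joint probability table (the example distributions are arbitrary):

```python
import math

def H(probs):
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(Y|X) = H(X,Y) - H(X), where joint[x][y] is a joint probability table."""
    h_joint = H([p for row in joint for p in row])
    h_x = H([sum(row) for row in joint])  # marginal of X
    return h_joint - h_x

# If Y is fully determined by X, H(Y|X) = 0;
# if Y is independent of X, H(Y|X) = H(Y).
deterministic = [[0.5, 0.0], [0.0, 0.5]]
independent = [[0.25, 0.25], [0.25, 0.25]]
```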

Principle of maximum entropy (word count: 4,227)

The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy,...
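As a sanity check of the principle: with no constraints beyond a fixed finite alphabet, the maximum-entropy distribution is the uniform one, with entropy log₂ n bits for n outcomes. A small numerical sketch verifying that no randomly drawn distribution exceeds it:

```python
import math
import random

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform_H = H([1 / n] * n)  # log2(4) = 2 bits, the maximum for 4 outcomes

# Draw 1000 random distributions on 4 outcomes; none beats the uniform one.
rng = random.Random(42)
for _ in range(1000):
    raw = [rng.random() for _ in range(n)]
    total = sum(raw)
    assert H([x / total for x in raw]) <= uniform_H + 1e-9
```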

Negentropy (word count: 1,106)

...used as a measure of distance to normality. The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book What...

Tsallis entropy (word count: 2,485)

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It is proportional to the expectation of the q-logarithm...
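Taking k = 1, the Tsallis entropy is S_q = (1 − Σᵢ pᵢ^q)/(q − 1), and in the limit q → 1 it reduces to the Boltzmann–Gibbs form −Σᵢ pᵢ ln pᵢ, which is easy to check numerically; a sketch:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i**q) / (q - 1), with k = 1 and q != 1."""
    return (1 - sum(p ** q for p in probs)) / (q - 1)

def gibbs_entropy(probs):
    """Boltzmann-Gibbs entropy -sum p_i * ln p_i (natural log, k = 1)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]  # an arbitrary example distribution
# As q -> 1, the Tsallis entropy approaches the Gibbs entropy.
approx = tsallis_entropy(p, 1.0001)
exact = gibbs_entropy(p)
```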

Introduction to entropy (word count: 5,274)

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and...

Entropy of activation (word count: 558)

In chemical kinetics, the entropy of activation of a reaction is one of the two parameters (along with the enthalpy of activation) which are typically...

Laws of thermodynamics (word count: 2,858)

The laws of thermodynamics define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium...

PDF Search Engine © AllGlobal.net