Entropy

For other uses of the term entropy, see Entropy (disambiguation)

The thermodynamic entropy S, often simply called the entropy in the context of thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work. It is also a measure of the disorder present in a system. The SI unit of entropy is J·K⁻¹ (joule per kelvin), which is the same unit as heat capacity.

Thermodynamic entropy is closely related to information entropy.

Thermodynamic definitions of entropy

The concept of entropy was originally introduced in 1865 by Rudolf Clausius. He defined the change in entropy of a thermodynamic system, during a reversible process in which an amount of heat ΔQ is applied at constant absolute temperature T, as

<math>\Delta S = \frac{\Delta Q}{T} \,\!</math>

Clausius gave the quantity S the name "entropy", from the Greek word τροπή, "transformation". Since this definition involves only differences in entropy, the entropy itself is only defined up to an arbitrary additive constant. Later, we will discuss an alternative definition that uniquely determines the additive constant.
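
As an illustration of this definition, consider the melting of ice at atmospheric pressure: the process occurs reversibly at a constant temperature, so ΔS = ΔQ/T applies directly. The following sketch uses an approximate textbook value for the latent heat of fusion (an assumption made for illustration, not a figure from this article):

<pre>
# Entropy change when one mole of ice melts reversibly at its melting point.
Q = 6.01e3        # heat absorbed on melting, J/mol (approximate latent heat of fusion)
T = 273.15        # melting temperature, K
delta_S = Q / T   # Clausius' definition at constant temperature
print(f"Delta S = {delta_S:.1f} J/(mol K)")   # roughly 22 J/(mol K)
</pre>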

Entropy change in heat engines

Clausius' identification of S as a significant quantity was motivated by the study of reversible and irreversible thermodynamic transformations. A thermodynamic transformation is a change in a system's thermodynamic properties, such as its temperature and volume. A transformation is reversible (also known as quasistatic) if the system is infinitesimally close to thermodynamic equilibrium at all times; otherwise, it is irreversible. To illustrate this, consider a gas enclosed in a piston chamber, whose volume may be changed by moving the piston. If we move the piston slowly enough, the density of the gas is always homogeneous, so the transformation is reversible. If we move the piston quickly, pressure waves are created, so the gas is not in equilibrium, and the transformation is irreversible.

A heat engine is a thermodynamic system that can undergo a sequence of transformations which ultimately return it to its original state. Such a sequence is called a cyclic process, or simply a cycle. During some transformations, the engine may exchange heat with heat reservoirs, which are systems so large that their temperatures do not change when exchanging heat with the engine. The net result of a cycle is (i) mechanical work done by the system (which can be positive or negative, the latter meaning that work is done on the engine), and (ii) heat transferred between the heat reservoirs. By the conservation of energy, the net heat lost by the reservoirs is equal to the work done by the engine.

If every transformation in the cycle is reversible, the cycle is reversible, and it can be run in reverse, so that the heat transfers occur in the opposite direction and the amount of work done switches sign. The simplest reversible cycle is a Carnot cycle, which exchanges heat with two heat reservoirs.

In thermodynamics, absolute temperature is defined in the following way. Suppose we have two heat reservoirs. If a Carnot cycle absorbs an amount of heat Q from the first reservoir and delivers an amount of heat Q′ to the second, then the respective reservoir temperatures T and T′ obey

<math>\frac{Q}{T} = \frac{Q'}{T'} \,\!</math>

Proof (of the result stated below, for a heat engine exchanging heats Q1, ..., QN with reservoirs at temperatures T1, ..., TN): Introduce an additional heat reservoir at an arbitrary temperature T0, as well as N Carnot cycles with the following property: the j-th such cycle operates between the T0 reservoir and the Tj reservoir, transferring heat Qj to the latter. From the above definition of temperature, the heat extracted from the T0 reservoir by the j-th cycle is

<math>Q_{0,j} = T_0 \frac{Q_j}{T_j} \,\!</math>

Now consider one cycle of the heat engine, accompanied by one cycle of each of the Carnot cycles. At the end of this process, each of the N reservoirs has zero net heat loss (since the heat extracted by the engine is replaced by the Carnot cycles), and the heat engine has done an amount of work equal to the heat extracted from the T0 reservoir,

<math>W = \sum_{j=1}^N Q_{0,j} = T_0 \sum_{j=1}^N \frac{Q_j}{T_j} \,\!</math>

If this quantity were positive, the process would be a perpetual motion machine of the second kind, which is forbidden by the second law of thermodynamics. Thus,

<math>\sum_{j=1}^N \frac{Q_j}{T_j} \le 0 \,\!</math>

Now repeat the argument with the cycle run in reverse: every heat changes sign, giving the opposite inequality. For reversible cycles, the two results together imply

<math>\sum_{j=1}^N \frac{Q_j}{T_j} = 0 \,\!</math> (reversible cycles)

Now consider a reversible cycle in which the engine exchanges heats Q1, Q2, ..., QN with a sequence of N heat reservoirs with temperatures T1, ..., TN. A positive Q means that heat flows from the reservoir to the engine, and a negative Q means that heat flows from the engine to the reservoir. We can show (see the proof above) that

<math>\sum_{i=1}^N \frac{Q_i}{T_i} = 0 \,\!</math>

Since the cycle is reversible, the engine is always infinitesimally close to equilibrium, so its temperature equals that of any reservoir with which it is in contact. In the limiting case of a reversible cycle consisting of a continuous sequence of transformations,

<math>\oint \frac{dQ}{T} = 0 \,\!</math> (reversible cycles)

where the integral is taken over the entire cycle, and T is the temperature of the system at each point in the cycle.
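
This relation can be checked numerically for the simplest reversible cycle. The sketch below assumes a Carnot cycle run by one mole of a monatomic ideal gas (the working substance, reservoir temperatures, and volumes are illustrative choices, not taken from the article); with the sign convention used above, the heats exchanged with the two reservoirs satisfy Q_h/T_h + Q_c/T_c = 0:

<pre>
import math

# Carnot cycle of one mole of a monatomic ideal gas (C_V = 3/2 R); illustrative values.
R = 8.314                   # gas constant, J/(mol K)
T_h, T_c = 500.0, 300.0     # hot and cold reservoir temperatures, K
V1, V2 = 1.0e-3, 3.0e-3     # volumes bounding the hot isothermal expansion, m^3

# Volumes after the adiabatic legs follow from T*V^(2/3) = constant for a monatomic gas.
V3 = V2 * (T_h / T_c) ** 1.5
V4 = V1 * (T_h / T_c) ** 1.5

Q_h = R * T_h * math.log(V2 / V1)   # heat absorbed from the hot reservoir (positive)
Q_c = R * T_c * math.log(V4 / V3)   # heat given to the cold reservoir (negative)

print(Q_h / T_h + Q_c / T_c)        # ~0, as required for a reversible cycle
</pre>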

Entropy as a state function

We can now deduce an important fact about the entropy change during any thermodynamic transformation, not just a cycle. First, consider a reversible transformation that brings a system from an equilibrium state A to another equilibrium state B. If we follow this with any reversible transformation which returns that system to state A, our above result says that the net entropy change is zero. This implies that the entropy change in the first transformation depends only on the initial and final states.

This allows us to define the entropy of any equilibrium state of a system. Choose a reference state R and call its entropy SR. The entropy of any equilibrium state X is

<math>S_X = S_R + \int_R^X \frac{dQ}{T} \,\!</math>

Since the integral is independent of the particular reversible transformation chosen, this equation is well-defined.
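
The path-independence of the integral can be seen explicitly in a simple case. The sketch below takes one mole of a monatomic ideal gas between the same two equilibrium states along two different reversible routes (the substance and the state values are assumptions made for illustration) and obtains the same entropy change either way:

<pre>
import math

# One mole of a monatomic ideal gas taken from (T1, V1) to (T2, V2)
# along two different reversible paths; illustrative numbers only.
R = 8.314
Cv = 1.5 * R                 # molar heat capacity at constant volume
T1, T2 = 300.0, 600.0        # K
V1, V2 = 1.0e-3, 2.0e-3      # m^3

# Path A: heat at constant volume from T1 to T2, then expand isothermally at T2.
dS_A = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Path B: expand isothermally at T1, then heat at constant volume to T2.
dS_B = R * math.log(V2 / V1) + Cv * math.log(T2 / T1)

print(dS_A, dS_B)            # identical: the entropy change depends only on the end states
</pre>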

Entropy change in irreversible transformations

We now consider irreversible transformations. It is straightforward to show that the entropy change during any transformation between two equilibrium states satisfies

<math>\Delta S \ge \int \frac{dQ}{T} \,\!</math>

where the equality holds if the transformation is reversible.

Notice that if dQ = 0, then ΔS ≥ 0. The second law of thermodynamics is sometimes stated as this result: the total entropy of a thermally isolated system can never decrease.
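
A standard illustration is the free expansion of an ideal gas into a vacuum inside rigid, thermally insulating walls: no heat flows and no work is done, yet the entropy rises. The sketch below evaluates ΔS along a reversible isothermal path joining the same initial and final states (the volumes are illustrative assumptions):

<pre>
import math

# Free expansion of one mole of ideal gas from V1 into a total volume V2 (dQ = 0).
R = 8.314
V1 = 1.0e-3      # initial volume, m^3 (illustrative)
V2 = 2.0e-3      # final volume, m^3

# Delta S is computed along a reversible isothermal path between the same end states.
delta_S = R * math.log(V2 / V1)
print(delta_S)    # about +5.8 J/K > 0, consistent with Delta S >= 0 when dQ = 0
</pre>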

Suppose a system is thermally isolated but remains in mechanical contact with the environment. If it is not in mechanical equilibrium with the environment, it will do work on the environment, or vice versa. For example, consider a gas enclosed in a piston chamber whose walls are perfect thermal insulators. If the pressure of the gas differs from the pressure applied to the piston, it will expand or contract, and work will be done. Our above result indicates that the entropy of the system will increase during this process (it could, in principle, remain constant, but this is unlikely). Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of stable equilibrium, since a transformation to any other equilibrium state would cause the entropy to decrease, which is forbidden. Once the system reaches this maximum-entropy state, no more work may be done.

Statistical definition of entropy: Boltzmann's principle

In 1877, Boltzmann realised that the entropy of a system may be related to the number of possible "microstates" (microscopic states) consistent with its thermodynamic properties. Consider, for example, an ideal gas in a container. A microstate is specified by the positions and momenta of each constituent atom. Consistency requires us to consider only those microstates for which (i) the positions of all the particles are located within the volume of the container, (ii) the kinetic energies of the atoms sum to the total energy of the gas, and so forth. Boltzmann then postulated that

<math>S = k \ln \Omega \,\!</math>

where k is known as Boltzmann's constant and Ω is the number of microstates that are consistent with the given macroscopic state. This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of their constituents. It relates a microscopic property of the system (Ω) to one of its thermodynamic properties (S).

Under Boltzmann's definition, the entropy is clearly a function of state. Furthermore, since Ω is a natural number (1, 2, 3, ...), the entropy must be non-negative; this is simply a property of the logarithm, since ln Ω ≥ 0 whenever Ω ≥ 1.

Entropy as a measure of disorder

We can view Ω as a measure of the disorder in a system. This is reasonable because what we think of as "ordered" systems tend to have very few configurational possibilities, and "disordered" systems have very many. Consider, for example, a set of 10 coins, each of which is either heads up or tails up. The most "ordered" macroscopic states are 10 heads or 10 tails; in either case, there is exactly one configuration that can produce the result. In contrast, the most "disordered" state consists of 5 heads and 5 tails, and there are 252 (10 choose 5) ways to produce this result.
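
The counting can be verified directly. The short sketch below tabulates Ω for every macrostate of the 10 coins and the corresponding Boltzmann entropy (using Boltzmann's constant for a set of coins is, of course, purely illustrative):

<pre>
from math import comb, log

k = 1.380649e-23    # Boltzmann's constant, J/K

# Number of microstates for each macrostate (number of heads) of 10 coins.
for heads in range(11):
    omega = comb(10, heads)     # ways of arranging that many heads among 10 coins
    S = k * log(omega)          # Boltzmann's principle, S = k ln(Omega)
    print(heads, omega, S)
# The 5-heads macrostate has Omega = 252, the largest count and hence the largest entropy.
</pre>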

Under the statistical definition of entropy, the second law of thermodynamics states that the disorder in an isolated system tends to increase. This can be understood using our coin example. Suppose that we start off with 10 heads, and re-flip one coin at random every minute. If we examine the system after a long time has passed, it is possible that we will still see 10 heads, or even 10 tails, but that is not very likely; it is far more probable that we will see approximately as many heads as tails.
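
This tendency can be observed in a short simulation (a sketch; the number of coins and flips are arbitrary choices): starting from the all-heads state and repeatedly re-flipping randomly chosen coins, the system almost always ends up near the half-heads macrostate, which has the largest number of microstates.

<pre>
import random

# Start with 10 coins, all heads (the most "ordered" macrostate)...
coins = [1] * 10                  # 1 = heads, 0 = tails

# ...then re-flip one randomly chosen coin many times.
for _ in range(10_000):
    i = random.randrange(len(coins))
    coins[i] = random.randint(0, 1)

print(sum(coins), "heads")        # typically close to 5 heads and 5 tails
</pre>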

Since its discovery, the idea that disorder tends to increase has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the result ΔS ≥ 0 applies only to isolated systems; notably, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. Nevertheless, it has been pointed out that the universe may be considered an isolated system, so that its total disorder should be constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source. Recent work, however, has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamical model to the universe in general. Although entropy does increase in an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Furthermore, complicating factors such as the impact of gravity, energy density of the vacuum (and thus a hypothesized "antigravity"), and macroscopic quantum effects under unusual conditions cannot be reconciled with current thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.

Counting of microstates

In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as "coarse graining". In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined. It is defined only up to an additive constant, just like the thermodynamic definition of entropy.
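
The role of the cell size can be made explicit with a toy count. The sketch below considers a single classical particle confined to a line of length L, with momentum lying in a range of width P, and counts phase-space cells of size δx·δp (a crude coarse-graining assumed purely for illustration); refining the cells changes the entropy only by an additive constant:

<pre>
from math import log

k = 1.380649e-23    # Boltzmann's constant, J/K

def entropy(L, P, dx, dp):
    """Coarse-grained entropy of one particle on a line of length L whose
    momentum lies in a range of width P, using cells of size dx*dp."""
    omega = (L / dx) * (P / dp)     # number of phase-space cells
    return k * log(omega)

S_coarse = entropy(1.0, 1.0, 1e-3, 1e-3)
S_fine   = entropy(1.0, 1.0, 1e-6, 1e-6)
print(S_fine - S_coarse)   # depends only on the cell sizes, not on L or P:
                           # coarse graining shifts S by an additive constant
</pre>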

This ambiguity can be resolved with quantum mechanics. The quantum state of a system can be expressed as a superposition of "basis" states, which can be chosen to be energy eigenstates (i.e. eigenstates of the quantum Hamiltonian). In quantum statistical mechanics, we take Ω to be the number of energy eigenstates consistent with the thermodynamic properties of the system. Ω can be defined since the energy eigenstates are generally countable.

This leads to Nernst's theorem, sometimes referred to as the third law of thermodynamics, which states that the entropy of a system at zero absolute temperature is a well-defined constant. This is due to the fact that a system at zero temperature exists in its ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and therefore have zero entropy at absolute zero (since ln(1) = 0).

Measuring entropy

In real experiments, it is quite difficult to measure the entropy of a system. The techniques for doing so are based on the thermodynamic definition of the entropy, and require extremely careful calorimetry.

For simplicity, we will examine a mechanical system, whose thermodynamic state may be specified by its volume V and pressure P. In order to measure the entropy of a specific state, we must first measure the heat capacity at constant volume and at constant pressure (denoted CV and CP respectively) for a succession of states between a reference state and the desired state. The heat capacities are related to the entropy S and the temperature T by

<math>C_X = T \left(\frac{\partial S}{\partial T}\right)_X \,\!</math>

where the X subscript refers to either constant volume or constant pressure. This may be integrated numerically to obtain a change in entropy:

<math>\Delta S = \int \frac{C_X}{T} dT \,\!</math>

We can thus obtain the entropy of any state (P,V) with respect to a reference state (P0,V0). The exact formula depends on our choice of intermediate states. For example, if the reference state has the same pressure as the final state,

<math>S(P,V) = S(P, V_0) + \int^{T(P,V)}_{T(P,V_0)} \frac{C_P(P,V(T,P))}{T} dT \,\!</math>

In addition, if the path between the reference and final states crosses any first-order phase transition, the latent heat associated with the transition must be taken into account, contributing an additional term L/T at the transition temperature.
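
The procedure can be sketched numerically as follows. The heat-capacity fits and the transition data below are invented illustrative values, not measurements; the sketch integrates C_P/T by the trapezoidal rule on either side of a first-order transition and adds the latent-heat contribution L/T at the transition temperature:

<pre>
# Invented calorimetric data (for illustration only): molar C_P in J/(mol K) as a
# function of T in K, below and above a first-order transition at T_t.
def cp_solid(T):  return 25.0 + 0.02 * T
def cp_liquid(T): return 60.0 + 0.01 * T
T_t, L_t = 300.0, 6.0e3        # transition temperature (K) and latent heat (J/mol)

def integral_cp_over_T(cp, T_lo, T_hi, n=1000):
    """Trapezoidal estimate of the integral of C_P(T)/T dT from T_lo to T_hi."""
    h = (T_hi - T_lo) / n
    total = 0.5 * (cp(T_lo) / T_lo + cp(T_hi) / T_hi)
    for i in range(1, n):
        T = T_lo + i * h
        total += cp(T) / T
    return total * h

# Delta S = integral of C_P/T on each branch, plus L_t/T_t at the transition.
dS  = integral_cp_over_T(cp_solid, 200.0, T_t)
dS += L_t / T_t
dS += integral_cp_over_T(cp_liquid, T_t, 350.0)
print(f"S(350 K) - S(200 K) = {dS:.1f} J/(mol K)")
</pre>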

The entropy of the reference state must be determined independently. Ideally, one chooses a reference state at an extremely high temperature, at which the system exists as a gas. The entropy in such a state would be that of a classical ideal gas plus contributions from molecular rotations and vibrations, which may be determined spectroscopically. Choosing a low-temperature reference state is sometimes problematic, since the entropy at low temperatures may behave in unexpected ways. For instance, a calculation of the entropy of ice using a low-temperature reference state, assuming zero entropy at absolute zero, falls short of the value obtained with a high-temperature reference state by 3.41 J/(mol·K). This is because the molecular crystal lattice of ice exhibits geometrical frustration, and thus possesses a non-vanishing "zero-point" entropy at arbitrarily low temperatures.
