In thermodynamics,
entropy (usual symbol S)
is a measure of the number of specific ways in which a thermodynamic system may be arranged,
commonly misunderstood as a measure of disorder. According to the second law of thermodynamics the entropy
of an isolated system never decreases; such a system
will spontaneously evolve toward thermodynamic equilibrium, the configuration
with maximum entropy. Systems
that are not isolated may decrease in entropy, provided they increase the
entropy of their environment by at least that same amount. Since entropy is a state
function, the change in the entropy of a system is the same for any
process that goes from a given initial state to a given final state, whether
the process is reversible or irreversible. However, irreversible
processes increase the combined entropy of the system and its environment.
The change in entropy (ΔS) of a system was
originally defined for a thermodynamically reversible process
as

ΔS = ∫ dQrev/T,

where T is the absolute temperature of the system,
dividing an incremental reversible transfer of heat into that system (dQrev). (If heat is transferred out of the system, the sign is
reversed, giving a decrease in the entropy of the system.) The above definition is
sometimes called the macroscopic definition of entropy because it can be used
without regard to any microscopic description of the contents of a system. The
concept of entropy has been found to be generally useful and has several other
formulations. Entropy was discovered when it was noticed to be a quantity that
behaves as a function of state, as a consequence of the
second law of thermodynamics.
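As a minimal numerical illustration of the macroscopic definition above (the function name and values below are illustrative assumptions, not taken from the article), the entropy change for heat absorbed reversibly at a constant absolute temperature is simply the heat divided by that temperature:

```python
# Minimal sketch: entropy change for a reversible transfer of heat into a
# system held at a constant absolute temperature T, using dS = dQ/T.
# The function name and the numbers below are illustrative assumptions.

def entropy_change_isothermal(q_joules, temperature_kelvin):
    """Return the entropy change in J/K for heat q_joules absorbed reversibly at a fixed temperature."""
    if temperature_kelvin <= 0:
        raise ValueError("absolute temperature must be positive")
    return q_joules / temperature_kelvin

print(entropy_change_isothermal(1000.0, 300.0))   # +3.33 J/K for heat flowing in
print(entropy_change_isothermal(-1000.0, 300.0))  # -3.33 J/K if the same heat flows out
```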
Entropy is an extensive property. It has
the dimension of energy divided
by temperature,
which has a unit of joules
per kelvin
(J K−1) in the International System of Units (or
kg m2 s−2 K−1 in terms of base
units). But the entropy of a pure substance is usually given as an intensive property —
either entropy per unit mass
(SI unit: J K−1 kg−1) or entropy per unit amount of substance (SI unit: J K−1 mol−1).
The absolute entropy (S rather than
ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics.
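As a small sketch of the intensive forms mentioned above (the molar entropy and molar mass below are illustrative, roughly water-like values), entropy per unit amount of substance converts to entropy per unit mass by dividing by the molar mass:

```python
# Sketch with assumed, roughly water-like values: convert a molar entropy
# (J K^-1 mol^-1) into a specific entropy per unit mass (J K^-1 kg^-1).
molar_entropy = 70.0   # J/(K mol), illustrative value for a pure substance
molar_mass = 0.018     # kg/mol, illustrative

specific_entropy = molar_entropy / molar_mass  # J/(K kg)
print(specific_entropy)  # about 3.9e3 J/(K kg)
```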
In the modern microscopic interpretation of
entropy in statistical mechanics, entropy is the amount of additional
information needed to specify the exact physical state of a system, given its
thermodynamic specification. Understanding the role of thermodynamic entropy in
various processes requires an understanding of how and why that information
changes as the system evolves from its initial to its final condition. It is
often said that entropy is an expression of the disorder, or randomness,
of a system, or of our lack of information about it. The second law is now
often seen as an expression of the fundamental postulate of statistical
mechanics through the modern definition of entropy.
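The microscopic interpretation can be made concrete with the Gibbs entropy formula S = -kB Σ pi ln pi over microstate probabilities, which reduces to Boltzmann's S = kB ln W for W equally likely microstates. A short sketch (the probability distributions are illustrative assumptions):

```python
# Sketch: Gibbs entropy S = -k_B * sum(p_i * ln p_i) over microstate
# probabilities; for W equally likely microstates this equals k_B * ln W.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    """Entropy in J/K for a probability distribution over microstates."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

# Four equally likely microstates: the two expressions agree.
print(gibbs_entropy([0.25] * 4), K_B * math.log(4))

# A sharply peaked distribution needs less additional information to pin down
# the exact microstate, so its entropy is lower.
print(gibbs_entropy([0.97, 0.01, 0.01, 0.01]))
```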
Definitions and descriptions
There are two related definitions of entropy: the
thermodynamic
definition and the statistical mechanics definition.
Historically, the classical thermodynamics definition developed first. In the classical thermodynamics viewpoint, the
system is composed of very large numbers of constituents (atoms, molecules) and
the state of the system is described by the average thermodynamic properties of
those constituents; the details of the system's constituents are not directly
considered, but their behavior is described by macroscopically averaged
properties, e.g. temperature, pressure, entropy, heat capacity. The early
classical definition of the properties of the system assumed equilibrium. The
classical thermodynamic definition of entropy has more recently been extended
into the area of non-equilibrium thermodynamics.
Later, the thermodynamic properties, including entropy, were given an alternative definition in terms of the
statistics of the motions of the microscopic constituents of a system — modeled
at first classically, e.g. Newtonian particles constituting a gas, and later
quantum-mechanically (photons, phonons, spins, etc.). The statistical mechanics description
of the behavior of a system is necessary because the definition of the properties of
a system using classical thermodynamics becomes an increasingly unreliable
method of predicting the final state of a system that is subject to some
process.
Function of state
There are many thermodynamic properties that are functions of state. This means that at a
particular thermodynamic state (which should not be confused with the
microscopic state of a system), these properties have a certain value. Often,
if two properties of the system are determined, then the state is determined
and the other properties' values can also be determined. For instance, a gas at
a particular temperature and pressure has its state fixed by those values, and
has a particular volume that is determined by those values. As another
instance, a system composed of a pure substance of a single phase
at a particular uniform temperature and pressure is determined (and is thus a
particular state) and is not only at a particular volume but also at a
particular entropy. The
fact that entropy is a function of state is one reason it is useful. In the
Carnot cycle, the working fluid returns to the same state it had at the start
of the cycle, hence the line integral of any state function, such as
entropy, over the cycle is zero.
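A short numerical sketch of this point (reservoir temperatures and heat values are illustrative assumptions): summing the working fluid's entropy changes over the four steps of a reversible Carnot cycle gives zero, as it must for a state function evaluated around a closed cycle.

```python
# Sketch with assumed numbers: entropy changes of the working fluid over one
# reversible Carnot cycle sum to zero, since the fluid returns to its initial state.
T_HOT, T_COLD = 500.0, 300.0     # reservoir temperatures in K (illustrative)
Q_HOT = 1000.0                   # heat absorbed isothermally at T_HOT, in J
Q_COLD = Q_HOT * T_COLD / T_HOT  # heat rejected at T_COLD in a reversible cycle

steps = [
    ("isothermal expansion",   +Q_HOT / T_HOT),
    ("adiabatic expansion",     0.0),             # no heat exchanged
    ("isothermal compression", -Q_COLD / T_COLD),
    ("adiabatic compression",   0.0),
]
print(sum(ds for _, ds in steps))  # 0.0 (up to rounding)
```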
Reversible process
Entropy is defined for a reversible process and for
a system that, at all times, can be treated as being at a uniform state and
thus at a uniform temperature. Reversibility is an ideal that some real
processes approximate and that is often presented in study exercises. For a
reversible process, entropy behaves as a conserved quantity and no change
occurs in total entropy. More specifically, total entropy is conserved in a
reversible process and not conserved in an irreversible process.
One has to be careful about system boundaries. For example, in the Carnot
cycle, while the heat flow from the hot reservoir to the cold reservoir
represents an increase in entropy, the work output, if reversibly and perfectly
stored in some energy storage mechanism, represents a decrease in entropy that
could be used to operate the heat engine in reverse and return to the previous
state; thus the total entropy change is still zero at all times if the
entire process is reversible. Any process that does not meet the requirements
of a reversible process must be treated as an irreversible process, which is
usually a complex task. An irreversible process increases entropy.
Heat transfer situations require two or more
non-isolated systems in thermal contact. In irreversible heat transfer, heat
energy is irreversibly transferred from the higher temperature
system to the lower temperature system, and the combined entropy of the systems
increases. Each system, by definition, must have its own absolute temperature
applicable throughout that system in order to calculate the
entropy transfer. Thus, when a system at higher temperature TH transfers heat dQ
to a system of lower temperature TC,
the former loses entropy dQ/TH
and the latter gains entropy dQ/TC.
Since TH > TC,
it follows that dQ/TH < dQ/TC,
whence there is a net gain in the combined entropy. The same requirement of a
well-defined absolute temperature for each system also applies when calculating
the entropy change of an isolated system that has no thermal contact with its
surroundings.
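A brief numerical sketch of this bookkeeping (the temperatures and heat value are illustrative assumptions):

```python
# Sketch with assumed values: heat dQ flows irreversibly from a hot system at
# T_H to a cold system at T_C; because T_H > T_C, the combined entropy rises.
T_H, T_C = 400.0, 300.0  # absolute temperatures in K (illustrative)
dQ = 100.0               # heat transferred, in J (illustrative)

dS_hot = -dQ / T_H       # entropy lost by the hotter system
dS_cold = +dQ / T_C      # entropy gained by the colder system
print(dS_hot, dS_cold, dS_hot + dS_cold)  # -0.25, +0.333, net +0.083 J/K
```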
Classical thermodynamics
The thermodynamic definition of entropy was
developed in the early 1850s by Rudolf
Clausius and essentially describes how to measure the entropy of an isolated
system in thermodynamic equilibrium with its parts.
Clausius created the term entropy as an extensive thermodynamic variable that
was shown to be useful in characterizing the Carnot cycle.
Heat transfer along the isotherm steps of the Carnot cycle was found to be
proportional to the temperature of a system (known as its absolute temperature). This relationship
was expressed in increments of entropy equal to the ratio of incremental heat
transfer divided by temperature, which was found to vary in the thermodynamic
cycle but eventually return to the same value at the end of every cycle. Thus
it was found to be a function of state, specifically a thermodynamic
state of the system. Clausius wrote that he "intentionally formed the word
Entropy as similar as possible to the word Energy", basing the term on the
Greek
ἡ τροπή tropē, "transformation".
While Clausius based his definition on a
reversible process, there are also irreversible processes that change entropy.
Following the second law of thermodynamics, entropy of
an isolated system always increases. The difference
between an isolated system and a closed system is that heat may not flow
to and from an isolated system, but heat flow to and from a closed system is
possible. Nevertheless, for both closed and isolated systems, and indeed, also
in open systems, irreversible thermodynamics processes may occur.
According to the Clausius equality,
for a reversible cyclic process:

∮ dQrev/T = 0.

This means the line integral ∫ dQrev/T is path-independent.
So we can define a state function S called entropy, which satisfies

dS = dQrev/T.
To find the entropy difference between any two
states of a system, the integral must be evaluated for some reversible path
between the initial and final states. Since entropy is a state function, the
entropy change of the system for an irreversible path will be the same as for a
reversible path between the same two states. However, the entropy change of the
surroundings will be different.
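Path-independence can be checked numerically for a simple case. In the sketch below (an ideal-gas example with assumed values, not drawn from the article), the entropy change between the same two states is computed along two different reversible paths and comes out identical:

```python
# Sketch, assuming 1 mol of a monatomic ideal gas (Cv = 3/2 R) and illustrative
# states: the entropy change between (T1, V1) and (T2, V2) is the same along
# two different reversible paths, as required for a state function.
import math

R = 8.314                 # gas constant, J/(mol K)
n, Cv = 1.0, 1.5 * R
T1, V1 = 300.0, 0.010     # initial state (K, m^3)
T2, V2 = 450.0, 0.025     # final state

# Path A: isothermal expansion at T1, then constant-volume heating to T2.
dS_A = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

# Path B: constant-volume heating to T2 first, then isothermal expansion at T2.
dS_B = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

print(dS_A, dS_B)  # identical values, about 12.7 J/K each
```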
We can only obtain the change of entropy by
integrating the above formula. To obtain the absolute value of the entropy, we
need the third law of thermodynamics, which states
that S = 0 at absolute zero for perfect crystals.
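In practice the absolute entropy is obtained by integrating measured heat capacities upward from near absolute zero. The sketch below uses an assumed Debye-like Cp ∝ T³ law purely for illustration:

```python
# Sketch with an assumed model: taking S = 0 at absolute zero (third law), the
# absolute entropy at T is the integral of Cp(T')/T' dT' from 0 to T.
def heat_capacity(t_kelvin):
    """Illustrative low-temperature heat capacity, Cp = a * T^3 in J/K."""
    a = 1.0e-4
    return a * t_kelvin ** 3

def absolute_entropy(t_final, steps=100_000):
    """Approximate the integral of Cp(T)/T dT from ~0 K to t_final with a simple Riemann sum."""
    dt = t_final / steps
    return sum(heat_capacity(i * dt) / (i * dt) * dt for i in range(1, steps + 1))

# For Cp = a*T^3 the integral has the closed form a*T^3/3, so the two numbers agree closely.
print(absolute_entropy(20.0), 1.0e-4 * 20.0 ** 3 / 3.0)
```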
From a macroscopic perspective, in classical thermodynamics the entropy is
interpreted as a state function of a thermodynamic system: that is, a property
depending only on the current state of the system, independent of how that
state came to be achieved. In any process where the system gives up energy ΔE,
and its entropy falls by ΔS, a quantity at least TR ΔS
of that energy must be given up to the system's surroundings as unusable heat (TR
is the temperature of the system's external surroundings). Otherwise the
process will not go forward. In classical thermodynamics, the entropy of a
system is defined only if it is in thermodynamic equilibrium.
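A worked example of the statement above, with illustrative numbers (not from the article):

```python
# Sketch with assumed values: a system gives up energy dE while its entropy
# falls by dS; at least T_R * dS of that energy must be discarded as heat to
# surroundings at temperature T_R, leaving at most dE - T_R*dS as useful work.
dE = 500.0   # energy given up by the system, J (illustrative)
dS = 1.0     # entropy decrease of the system, J/K (illustrative)
T_R = 300.0  # temperature of the surroundings, K (illustrative)

min_unusable_heat = T_R * dS
max_useful_work = dE - min_unusable_heat
print(min_unusable_heat, max_useful_work)  # 300.0 J of heat, at most 200.0 J of work
```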
Second law of thermodynamics
The second law of thermodynamics states that in
general the total entropy of any system will not decrease other than by
increasing the entropy of some other system. Hence, in a system isolated from
its environment, the entropy of that system will tend not to decrease. It
follows that heat will not flow from a colder body to a hotter body without the
application of work (the imposition of order) to the colder body. Secondly, it
is impossible for any device operating on a cycle to produce net work from a
single temperature reservoir; the production of net work requires flow of heat
from a hotter reservoir to a colder reservoir, or a single expanding reservoir
undergoing adiabatic cooling, which performs adiabatic
work. As a result, there is no possibility of a perpetual
motion system. It follows that a reduction in the increase of
entropy in a specified process, such as a chemical
reaction, means that it is energetically more efficient.
It follows from the second law of thermodynamics
that the entropy of a system that is not isolated may decrease. An air
conditioner, for example, may cool the air in a room, thus reducing
the entropy of the air of that system. The heat expelled from the room (the
system), which the air conditioner transports and discharges to the outside
air, will always make a bigger contribution to the entropy of the environment
than will the decrease of the entropy of the air of that system. Thus, the
total of entropy of the room plus the entropy of the environment increases, in
agreement with the second law of thermodynamics.
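The bookkeeping behind the air-conditioner example can be sketched numerically (the temperatures, heat removed, and coefficient of performance are all illustrative assumptions):

```python
# Sketch with assumed values: an air conditioner lowers the entropy of the room,
# but the heat it discharges outside raises the environment's entropy by more.
T_ROOM, T_OUTSIDE = 293.0, 308.0  # K (illustrative)
Q_REMOVED = 1000.0                # heat pulled out of the room, J (illustrative)
COP = 3.0                         # assumed coefficient of performance, below the Carnot limit

work_input = Q_REMOVED / COP
q_discharged = Q_REMOVED + work_input       # heat dumped into the outside air

dS_room = -Q_REMOVED / T_ROOM               # the room's entropy decreases
dS_environment = q_discharged / T_OUTSIDE   # the environment's entropy increases
print(dS_room + dS_environment)             # > 0, in agreement with the second law
```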
In mechanics, the second law in conjunction with
the fundamental thermodynamic relation
places limits on a system's ability to do useful work. The entropy change of a
system at temperature T absorbing an infinitesimal amount of heat δq
in a reversible way is given by δq/T. More explicitly, an amount of energy TR S
is not available to do useful work, where TR is the
temperature of the coldest accessible reservoir or heat sink external to the
system. For further discussion, see Exergy.
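A short sketch of the amount of work that remains available (illustrative values only): of heat absorbed at temperature T, the portion TR ΔS cannot be converted to useful work when the coldest accessible reservoir is at TR, which reproduces the familiar Carnot limit.

```python
# Sketch with assumed values: of heat delta_q absorbed reversibly at temperature T,
# the amount T_R * delta_S is unavailable for work, where T_R is the coldest
# accessible reservoir; the remainder equals delta_q * (1 - T_R / T).
delta_q = 1000.0  # heat absorbed, J (illustrative)
T = 500.0         # temperature at which it is absorbed, K (illustrative)
T_R = 300.0       # coldest accessible reservoir, K (illustrative)

delta_S = delta_q / T            # entropy carried with the heat, J/K
unavailable = T_R * delta_S      # energy not available for useful work, J
max_work = delta_q - unavailable
print(unavailable, max_work)     # 600.0 J unavailable, 400.0 J at most as work
```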
Statistical mechanics demonstrates that entropy
is governed by probability, thus allowing for a decrease in disorder even in an
isolated system. Although this is possible, such an event has only a very small
probability of occurring.
source = http://en.wikipedia.org/wiki/Entropy