Rudolf Clausius (1822–1888), originator of the concept of entropy
The analysis which led to the concept of entropy
began with the work of French mathematician Lazare Carnot
who in his 1803 paper Fundamental Principles of Equilibrium and Movement
proposed that in any machine the accelerations and shocks of the moving parts
represent losses of moment of activity. In other words, in any natural
process there exists an inherent tendency towards the dissipation of useful
energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of
Fire which posited that in all heat-engines, whenever "caloric"
(what is now known as heat)
falls through a temperature difference, work or motive power
can be produced from the actions of its fall from a hot to a cold body. He
drew an analogy with the way water falls in a water wheel.
This was an early insight into the second law of thermodynamics. Carnot based his views of heat partially on the early 18th century
"Newtonian hypothesis" that both heat and light were types of
indestructible forms of matter, which are attracted and repelled by other
matter, and partially on the contemporary views of Count Rumford
who showed (1798) that heat could be created by friction, as when cannon bores
are machined.
Carnot reasoned that if the body of the working substance, such as a body of
steam, is returned to its original state at the end of a complete engine
cycle, then "no change occurs in the condition of the working body".
The first law of thermodynamics, deduced from
the heat-friction experiments of James Joule
in 1843, expresses the concept of energy, and its conservation in all processes; the first
law, however, is unable to quantify the effects of friction
and dissipation.
In the 1850s and 1860s, German physicist Rudolf
Clausius objected to the supposition that no change occurs in the
working body, and gave this "change" a mathematical interpretation by
questioning the nature of the inherent loss of usable heat when work is done,
e.g. heat produced by friction.
Clausius described entropy as the transformation-content, i.e.
dissipative energy
use, of a thermodynamic system or working body
of chemical species during a change of state.
This was in contrast to earlier views, based on the theories of Isaac Newton,
that heat was an indestructible particle that had mass.
Later, scientists such as Ludwig
Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical
basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy
of an ensemble of ideal gas particles, in which he defined
entropy to be proportional to the logarithm of the number of microstates such a
gas could occupy. Henceforth, the essential problem in statistical
thermodynamics has been, according to Erwin Schrödinger, to determine the
distribution of a given amount of energy E over N identical systems. Carathéodory
linked entropy with a mathematical definition of irreversibility, in terms of
trajectories and integrability.
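In modern notation, Boltzmann's definition is usually written as

S = k_B ln W

where k_B is the Boltzmann constant and W is the number of microstates the gas
can occupy consistent with its macroscopic state.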
Applications
The fundamental thermodynamic relation
The entropy of a system depends on its internal
energy and the external parameters, such as the volume. In the thermodynamic
limit this fact leads to an equation relating the change in the internal energy
to changes in the entropy and the external parameters. This relation is known
as the fundamental thermodynamic relation. If the volume is the only external
parameter, this relation is:

dU = T dS − P dV
Since the internal energy is fixed when one
specifies the entropy and the volume, this relation is valid even if the change
from one state of thermal equilibrium to another with infinitesimally larger
entropy and volume happens in a non-quasistatic way (so during this change the
system may be very far out of thermal equilibrium and then the entropy,
pressure and temperature may not exist).
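For instance, since the relation above identifies T = (∂U/∂S)_V and
P = −(∂U/∂V)_S, equating the mixed second derivatives of U(S, V) yields one of
the Maxwell relations discussed below:

(∂T/∂V)_S = −(∂P/∂S)_V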
The fundamental thermodynamic relation implies many
thermodynamic identities that are valid in general, independent of the
microscopic details of the system. Important examples are the Maxwell
relations and the relations between heat capacities.

Entropy in chemical thermodynamics
Thermodynamic entropy is central in chemical thermodynamics, enabling changes
to be quantified and the outcome of reactions predicted. The second law of thermodynamics states that
entropy in an isolated system – the combination of a
subsystem under study and its surroundings – increases during all spontaneous
chemical and physical processes. The Clausius equation, ΔS = ∫ δq_rev/T,
introduces the measurement of entropy change, ΔS. Entropy
change describes the direction and quantifies the magnitude of simple changes
such as heat transfer between systems – always from hotter to cooler
spontaneously.
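A minimal numerical sketch of this directionality in Python, with illustrative
values (any Q > 0 with T_hot > T_cold gives the same sign):

# Entropy change when heat Q flows between two thermal reservoirs.
# Values are illustrative only.
Q = 1000.0       # heat transferred, in joules
T_hot = 500.0    # temperature of the hot reservoir, in kelvin
T_cold = 300.0   # temperature of the cold reservoir, in kelvin

dS_hot = -Q / T_hot    # hot reservoir loses entropy Q/T_hot
dS_cold = Q / T_cold   # cold reservoir gains entropy Q/T_cold
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:.3f} J/K")
print(f"dS_cold  = {dS_cold:.3f} J/K")
print(f"dS_total = {dS_total:.3f} J/K (> 0, so the transfer is spontaneous)")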
The thermodynamic entropy therefore has the
dimension of energy divided by temperature, and the unit joule per kelvin (J/K) in
the International System of Units (SI).
Thermodynamic entropy is an extensive property,
meaning that it scales with the size or extent of a system. In many processes
it is useful to specify the entropy as an intensive property
independent of the size, as a specific entropy characteristic of the type of
system studied. Specific entropy may be expressed relative to a unit of mass,
typically the kilogram (unit: J kg−1 K−1). Alternatively, in chemistry, it may
also be expressed per mole of substance, in which case it is called the molar
entropy, with a unit of J mol−1 K−1.
Thus, when one mole of substance at about 0 K is warmed by its surroundings
to 298 K, the sum of the incremental values of q_rev/T constitutes each
element's or compound's standard molar entropy, an indicator of the amount of
energy stored by a substance at 298 K.
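Assuming no phase transitions between 0 K and 298 K, this sum can be written
as the integral

S°(298 K) = ∫_0^298 (C_p / T) dT

where C_p is the molar heat capacity at constant pressure; each phase
transition along the way contributes a further ΔH_trans/T_trans term.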
Entropy change also measures the mixing of substances as a summation of their
relative quantities in the final mixture.
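For ideal mixing at a fixed temperature, this summation takes the form

ΔS_mix = −nR Σ_i x_i ln x_i

where n is the total amount of substance, R is the gas constant, and x_i is
the mole fraction of component i in the final mixture.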
Entropy is equally essential in predicting the
extent and direction of complex chemical reactions. For such applications, ΔS
must be incorporated in an expression that includes both the system and its
surroundings, ΔS_universe = ΔS_surroundings + ΔS_system. This expression
becomes, via some steps, the Gibbs free
energy equation for reactants and products in the system: ΔG
[the Gibbs free energy change of the system] = ΔH [the enthalpy change]
−T ΔS [the entropy change].
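As a small numerical sketch (the values below are illustrative, approximately
those for melting ice at 1 atm), a process is spontaneous at constant
temperature and pressure when ΔG < 0:

# Gibbs free energy change: dG = dH - T*dS
# Illustrative values, roughly those for ice -> water at 1 atm.
dH = 6010.0  # enthalpy change, J/mol (enthalpy of fusion of ice)
dS = 22.0    # entropy change, J/(mol K) (entropy of fusion)

for T in (263.0, 298.0):  # temperatures below and above the melting point, K
    dG = dH - T * dS
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:.0f} K: dG = {dG:+.0f} J/mol -> {verdict}")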
Entropy balance equation for open systems
During steady-state continuous
operation, an entropy balance applied to an open system accounts for system
entropy changes related to heat flow and mass flow across the system boundary.
In chemical engineering, the principles of
thermodynamics are commonly applied to "open systems", i.e. those in which
heat, work, and mass flow across the
system boundary. Flows of both heat (Q̇) and work, i.e. Ẇ_S (shaft work) and
P(dV/dt) (pressure-volume work), across the system boundaries, in general cause
changes in the entropy of the system. Transfer as heat entails entropy transfer
Q̇/T, where T is the absolute thermodynamic temperature of the system at the
point of the heat flow. If there are mass flows across the system
boundaries, they will also influence the total entropy of the system. This
account, in terms of heat and work, is valid only for cases in which the work
and heat transfers are by paths physically distinct from the paths of entry and
exit of matter from the system.
To derive a generalized entropy balance equation, we start with the general
balance equation for the change in any extensive quantity Θ in a thermodynamic
system, a quantity that may
be either conserved, such as energy, or non-conserved, such as entropy. The
basic generic balance expression states that dΘ/dt, i.e. the rate of change of
Θ in the system, equals the rate at which Θ enters the system at the boundaries,
minus the rate at which Θ leaves the system across the system boundaries, plus
the rate at which Θ is generated within the system. For an open thermodynamic
system in which heat and work are transferred by paths separate from the paths
for transfer of matter, using this generic balance equation, with respect to
the rate of change with time of the extensive quantity entropy S, the
entropy balance equation is:

dS/dt = Σ_k Ṁ_k Ŝ_k + Q̇/T + Ṡ_gen

where
Σ_k Ṁ_k Ŝ_k = the net rate of entropy flow due to the flows of mass into and
out of the system (where Ŝ = entropy per unit mass),
Q̇/T = the rate of entropy flow due to the flow of heat across the system
boundary, and
Ṡ_gen = the rate of entropy production within the system. This entropy
production arises from processes within the system, including chemical
reactions, internal matter diffusion, internal heat transfer, and frictional
effects such as viscosity occurring within the system from mechanical work
transfer to or from the system.
Note, also, that if there are multiple heat flows, the term Q̇/T will be
replaced by Σ_j Q̇_j/T_j, where Q̇_j is the heat flow and T_j is the
temperature at the jth heat flow port into the system.
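As a minimal sketch of how this balance is applied (the stream values below
are hypothetical), consider a steady-state device with one inlet, one outlet,
and a single heat port; at steady state dS/dt = 0, so the balance can be
solved for the entropy production rate:

# Steady-state entropy balance for an open system with one inlet, one outlet,
# and one heat port: 0 = m_dot*(s_in - s_out) + Q_dot/T + S_dot_gen
# All stream values are hypothetical, for illustration only.
m_dot = 2.0   # mass flow rate, kg/s (equal in and out at steady state)
s_in = 1.20   # specific entropy of the inlet stream, kJ/(kg K)
s_out = 1.35  # specific entropy of the outlet stream, kJ/(kg K)
Q_dot = 50.0  # heat flow into the system, kJ/s
T = 400.0     # boundary temperature at the heat port, K

# Solve the balance for the rate of entropy production:
S_dot_gen = m_dot * (s_out - s_in) - Q_dot / T

print(f"S_dot_gen = {S_dot_gen:.3f} kJ/(K s)")
# The second law requires S_dot_gen >= 0; a negative result would mean the
# assumed process is impossible.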