This article gives an overview of entropy in thermodynamics. For entropy in information theory, see information entropy. For connections between the two, see Entropy in thermodynamics and information theory. For other uses of the term, see Entropy (disambiguation), and further articles in Category:Entropy.
In chemistry, physics and thermodynamics, thermodynamic entropy, symbolized by S, is a state function whose differential change is given by dQ/T, where dQ is the amount of heat absorbed reversibly by a thermodynamic system at absolute temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1850s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term from the Greek τροπή, meaning "transformation". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.
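Written out (using δQ_rev for an infinitesimal amount of heat absorbed reversibly), the defining relation, and the finite entropy change between two states A and B along a reversible path, are:

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S = S_B - S_A = \int_A^B \frac{\delta Q_{\mathrm{rev}}}{T}

Because S is a state function, ΔS depends only on the endpoints A and B, even though the integral is evaluated along a reversible path.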
Ice melting example
The illustration for this article is a classic example in which entropy increases in a small 'universe', a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' (glass, ice, cold water). In this universe, some heat energy dQ from the warmer room surroundings (at 77 °F (298 K)) will spread out to the cooler system of ice and water at its constant temperature T of 32 °F (273 K), the melting temperature of ice. Thus, the entropy of the system, which is dQ/T, increases by dQ/273 K. (The heat dQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH for ice fusion.)
It is important to realize that the entropy of the surrounding room decreases by less than the entropy of the ice and water increases: the room temperature of 298 K is higher than 273 K, and therefore the entropy change dQ/298 K for the surroundings is smaller in magnitude than the entropy change dQ/273 K for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.
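As a rough numerical check, here is a short Python sketch of this bookkeeping (the enthalpy of fusion of ice, about 334 J/g, is a standard handbook value; the function name is ours):

    # Entropy bookkeeping for ice melting in a warm room.
    H_FUSION = 334.0   # J/g, enthalpy of fusion of ice (handbook value)
    T_SYSTEM = 273.0   # K, temperature of the melting ice
    T_ROOM = 298.0     # K, temperature of the surroundings (77 F)

    def entropy_changes(grams_melted):
        """Return (dS_system, dS_surroundings, dS_net) in J/K."""
        q = H_FUSION * grams_melted       # heat dQ absorbed by the ice
        ds_system = q / T_SYSTEM          # +dQ/273 K
        ds_surroundings = -q / T_ROOM     # -dQ/298 K, smaller in magnitude
        return ds_system, ds_surroundings, ds_system + ds_surroundings

    ds_sys, ds_surr, ds_net = entropy_changes(10.0)  # melt 10 g of ice
    print(ds_sys, ds_surr, ds_net)  # roughly +12.2, -11.2, +1.0 J/K

The net change is positive, as the argument above predicts.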
As the temperature of the cool water rises to that of the room and the room cools imperceptibly further, the sum of dQ/T over the continuous range of temperatures, taken at many increments from the initially cool to the finally warm water, can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
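The integral referred to here has a closed form. Assuming a constant specific heat c, for water of mass m warming from T_i to T_f we have dQ = m c dT, so:

    \Delta S = \int_{T_i}^{T_f} \frac{m\,c\,dT}{T} = m\,c\,\ln\frac{T_f}{T_i}

For example, 100 g of melt water warming from 273 K to 298 K with c ≈ 4.18 J/(g·K) gains ΔS ≈ 100 × 4.18 × ln(298/273) ≈ 36.6 J/K.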
Overview
In a thermodynamic system, a 'universe' consisting of 'surroundings' and 'system' and made up of quantities of matter, pressure differences, density differences, and temperature differences all tend to equalize over time. As shown in the preceding discussion of the illustration involving a warm room (surroundings) and a cold glass of ice and water (system), the difference in temperature begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water. Over time the temperature of the glass and its contents becomes equal to that of the room. The entropy of the room has decreased because some of its energy has been dispersed to the ice and water. However, as calculated in the discussion above, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. This is always true: the dispersal of energy from warmer to cooler always results in an increase in entropy. Thus, when the 'universe' of the room surroundings and the ice and water system has reached an equilibrium of equal temperature, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
Entropy is often described as "a measure of the disorder of a thermodynamic system" or of "how mixed up the system is". Such statements should be treated with suspicion, because the terms "disorder" and "mixed-upness" are not well defined. The "disorder" of the system as a whole can be formally defined (as discussed below) in a way that is consistent with the realities of entropy, but such a definition can easily lead to confusion. It is only if the word is used in this special sense that a system that is more "disordered" or more "mixed up" on a molecular scale is necessarily also "a system with a lower amount of energy available to do work" or "a system in a macroscopically more probable state".
The entropy of a thermodynamic system can be interpreted in two distinct, but compatible, ways:
From a macroscopic perspective, in classical thermodynamics the entropy is interpreted simply as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work, i.e. work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least T_R ΔS of that energy must be given up to the system's surroundings as unusable heat, where T_R is the temperature of the system's external surroundings. Otherwise the process will not go forward.
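Stated as an inequality, if W is the work extracted in such a process, the passage above amounts to the bound:

    W \le \Delta E - T_R\,\Delta S

so that the larger the entropy decrease ΔS demanded of the system, the less of the released energy ΔE is available as work.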
From a microscopic perspective, in statistical thermodynamics the entropy is envisioned as a measure of the number of microscopic configurations that are capable of yielding the observed macroscopic description of the thermodynamic system. A more "disordered" or "mixed up" system can thus be formally defined as one which has more microscopic states compatible with the macroscopic description; however, this definition is not standard and is thus prone to causing confusion. It can be shown that this definition of entropy, sometimes referred to as Boltzmann's postulate, reproduces all of the properties of the entropy of classical thermodynamics.
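Boltzmann's postulate gives this counting a precise form. If Ω is the number of microstates consistent with the observed macrostate and k_B is Boltzmann's constant, then:

    S = k_B \ln \Omega

Doubling the number of accessible microstates thus adds a fixed amount k_B ln 2 to the entropy, which is why entropy is additive even though microstate counts are multiplicative.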
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Unlike almost all other laws of physics, this associates thermodynamics with a definite arrow of time. However, for a universe of infinite size, which cannot be regarded as an isolated system, the second law does not apply.
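In symbols, for an isolated system:

    \frac{dS}{dt} \ge 0

with equality holding only for reversible processes.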
History
The short history of entropy begins with the work of mathematician Lazare Carnot who in his 1803 work Fundamental Principles of Equilibrium and Movement postulated that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare’s son Sadi Carnot published Reflections on the Motive Power of Fire in which he set forth the view that in all heat-engines "caloric", or what is now known as heat, moves from hot to cold and that "some caloric is always lost". This lost caloric was a precursory form of entropy loss as we now know it. Though formulated in terms of caloric, rather than entropy, this was an early insight into the second law of thermodynamics. In the 1850s, Rudolf Clausius began to give this "lost caloric" a mathematical interpretation by questioning the nature of the inherent loss of heat when work is done, e.g. heat produced by friction.[1] In 1865, Clausius gave this heat loss a name:[2]
I propose to name the quantity S the entropy of the system, after the Greek word τροπή [tropē], the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.
Later, scientists such as Ludwig Boltzmann, Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.
Thermodynamic definition
In the early 1850s, Rudolf Clausius began to put the concept of "energy turned to waste" on a differential footing. Essentially, he set forth the concept of the thermodynamic system and advanced the argument that in any irreversible process a small amount of heat energy dQ is incrementally dissipated across the system boundary.
Specifically, in 1850 Clausius published his first memoir, in which he presented a verbal argument as to why Carnot's theorem, proposing the equivalence of heat and work, i.e. Q = W, was not perfectly correct and as such would need amendment. In 1854, Clausius states: "In my memoir 'On the Moving Force of Heat, &c.', I have shown that the theorem of the equivalence of heat and work, and Carnot's theorem, are not mutually exclusive, but that, by a small modification of the latter, which does not affect its principle, they can be brought into accordance." This small modification of the latter is what developed into the second law of thermodynamics.
In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. "those which the atoms of the body exert upon each other", and exterior work, i.e. "those which arise from foreign influences [to] which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three categories into which the heat Q may be divided:
heat employed in increasing the heat actually existing in the body
heat employed in producing the interior work
heat employed in producing the exterior work
Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presents the first-ever mathematical formulation of entropy, although at this point in the development of his theories he calls it "equivalence-value". He states that "the second fundamental theorem in the mechanical theory of heat may thus be enunciated":[3]
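The formula itself appears to have been dropped from this copy of the text. In the 1854 memoir, Clausius assigns a transfer of heat dQ at temperature T the equivalence-value dQ/T, so that the sum of all equivalence-values over a cyclic process is:

    N = \int \frac{dQ}{T}

with N = 0 for a reversible cycle.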
At this point, however, Clausius had not yet attached the label "entropy" to the concept as we currently know it; that would come in the following two years.
In 1876, the American mathematical physicist Willard Gibbs, building on the work of those such as Clausius and Hermann von Helmholtz, put forward the view that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell [1871] and Max Planck [1903].
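In modern notation, this relationship is written, at constant temperature T:

    \Delta G = \Delta H - T\,\Delta S

where ΔG is the Gibbs free energy, ΔH the enthalpy change, and ΔS the entropy change.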
Units and symbols
Conjugate variables of thermodynamics: pressure and volume; temperature and entropy; chemical potential and particle number.
Entropy is a key physical variable in describing a thermodynamic system. The SI unit of entropy is the joule per kelvin (J·K⁻¹), which is the same as the unit of heat capacity, and entropy is said to be thermodynamically conjugate to temperature. The entropy depends only on the current state of the system, not on its detailed previous history, and so it is a state function of parameters like pressure and temperature, which describe the observable macroscopic properties of the system. Entropy is usually symbolized by the letter S.
There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE and its entropy falls by ΔS, a quantity at least T_R ΔS of that energy must be given up to the system's surroundings as unusable heat. Otherwise the process will not go forward. (T_R is the temperature of the system's external surroundings, which may not be the same as the system's current temperature T.)
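A small numeric sketch of this bound (the function name and all values are illustrative, chosen only for the example):

    # Maximum work obtainable when a system gives up energy dE while its
    # entropy falls by dS, with surroundings at temperature T_R.
    def max_work(dE, dS, T_R):
        """At least T_R*dS of the released energy must leave as unusable heat."""
        return dE - T_R * dS

    # Illustrative numbers: the system releases 1000 J while its entropy
    # drops by 2 J/K, with surroundings at 298 K.
    print(max_work(1000.0, 2.0, 298.0))  # at most 404.0 J available as work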