
Can someone explain in somewhat simple terms what entropy is? I've read some things about it, but it's a bit hard for me to grasp.

2006-07-13 22:09:38 · 10 answers · asked by blah 2 in Science & Mathematics Physics

10 answers

Entropy is, roughly, a measure of how disordered something is. For example, if you don't invest the energy to clean your room, over time it will become more and more "entropic".

In physics, the rule is that the universe as a whole tends to move towards higher entropy. If you invest energy to make something more ordered, such as putting atoms together to make a molecule, the chemical reactions involved release heat, which is random energy, leaving the universe as a whole less ordered (more entropic) overall.

2006-07-13 22:21:25 · answer #1 · answered by Aaron 2 · 1 0

Entropy is often described as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". Such statements should be approached with care, as the terms "disorder" and "mixedupedness" are not well defined. The "disorder" of the system as a whole can be formally defined (as discussed below) in a way that is consistent with the realities of entropy, but note that such a definition will almost always lead to confusion. It is only if the word is used in this special sense that a system that is more "disordered" or more "mixed up" on a molecular scale will necessarily also be "a system with a lower amount of energy available to do work" or "a system in a macroscopically more probable state".

2006-07-14 05:18:05 · answer #2 · answered by sunil 3 · 0 0

Entropy is what drives the universe. Without it, nothing would happen. Ever.

The idea came into being when it was realised that in physics energy is conserved. If that is the case, what is it that makes heat flow from a hot body to a cold one? There is no net change in energy, so what is the "driving force"?

The answer was to introduce the notion of entropy, a quantity that cannot decrease and so points in the direction of change (this makes it unlike almost all other laws of physics which are symmetric under a reversal of time).

Entropy - though useful and quantifiable - was not at all well understood until Boltzmann introduced the idea that it relates to the number of degrees of freedom of a system. These degrees of freedom are usually called microstates by physicists. Boltzmann realised you could in principle count the number of states a system could possibly occupy, and that, left to its own devices, a system will statistically go on to occupy more and more of them (think of it like this - I have a million boxes and a thousand bouncing balls, all in one box; the balls never lose energy, but at a later time they will not all be in one box).
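A minimal Python sketch of this balls-in-boxes picture (the hopping rule and step counts are illustrative assumptions, not part of the answer above): start with every ball in one box, let each ball hop to a random neighbouring box each step, and watch the number of occupied boxes grow and never fall back to one.

```python
import random

def spread(n_balls=1000, n_boxes=1_000_000, n_steps=50):
    """All balls start in box 0; each step, every ball hops to a random
    neighbouring box. Count how many distinct boxes are occupied."""
    positions = [0] * n_balls          # a thousand balls, all in one box
    for step in range(n_steps + 1):
        if step % 10 == 0:
            print(f"step {step:3d}: occupied boxes = {len(set(positions))}")
        positions = [(p + random.choice((-1, 1))) % n_boxes for p in positions]

spread()
```

Run it and the occupied-box count climbs step after step; statistically it essentially never returns to a single box, which is the directionality the answer describes.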

2006-07-14 05:31:40 · answer #3 · answered by Epidavros 4 · 0 0

In physics and thermodynamics, entropy, symbolized by S, is defined through the differential term "dQ/T", where dQ is the amount of heat absorbed reversibly by a thermodynamic system at a temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1850s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term from the Greek τροπή, meaning "transformation". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.
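In symbols, the definition quoted above is usually written as follows (a standard textbook rendering, not part of the original answer):

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}
```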

2006-07-14 05:28:00 · answer #4 · answered by Anonymous · 0 0

Entropy in basic terms, is a measure of the total number of possible ways you could arrange a system of particles or objects. Let's talk about the macroscopic scale because it's easier to visualize than the microscopic world. If you have an empty drinking glass, it is in one piece and it's pretty clear that there aren't a lot of ways to arrange one object with respect to itself. If the glass were broken, there would now be many pieces of glass which could be arranged in many different combinations. The more pieces of glass you have, the more possible ways there are to arrange the pieces.

It's easy to see why we simplify this to "disorder". A pile of broken glass is clearly much less ordered than an intact glass.

In more technical terms, it is S = k log (Ω)

where k is Boltzmann's constant and Ω is the total number of possible arrangements.
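A rough numerical illustration of S = k log(Ω) (the piece counts and the choice of Ω = n!, the number of distinct orderings of n shards, are toy assumptions made purely to show the scaling):

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K

def boltzmann_entropy(log_omega):
    """S = k * ln(Omega); work with ln(Omega) directly to avoid huge numbers."""
    return K_B * log_omega

# Treat Omega as the number of distinct orderings of n pieces: Omega = n!
for n_pieces in (1, 10, 100, 1000):
    log_omega = math.lgamma(n_pieces + 1)        # ln(n!)
    print(f"{n_pieces:5d} pieces: ln(Omega) = {log_omega:10.1f}, "
          f"S = {boltzmann_entropy(log_omega):.3e} J/K")
```

One intact glass gives ln(Ω) = 0 and zero entropy by this toy count, while more pieces mean many more possible arrangements and a larger S.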

2006-07-14 14:31:47 · answer #5 · answered by idiuss 2 · 0 0

The physical quantity which remains constant during a reversible adiabatic (isentropic) process.

2006-07-14 08:56:26 · answer #6 · answered by --> ( Charles ) <-- 4 · 0 0

Entropy is what is used to measure the mayhem young brats create at their homes!!!

2006-07-14 07:24:31 · answer #7 · answered by know it all guy 1 · 0 0

The amount of energy that is not available to do work during a certain process.

2006-07-14 05:13:53 · answer #8 · answered by Libby 2 · 0 0

It isn't what it used to be!

2006-07-14 19:56:18 · answer #9 · answered by Daniel T 4 · 0 0

In chemistry, physics and thermodynamics, thermodynamic entropy, symbolized by S, is a differential term "dQ/T", where dQ is the amount of heat absorbed reversibly by a thermodynamic system at a temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1850s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term based on the Greek τροπή meaning "transformation". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.



Ice melting example
The illustration for this article is a classic example in which entropy increases in a small 'universe', a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' (glass, ice, cold water). In this universe, some heat energy dQ from the warmer room surroundings (at 77 °F (298 K)) will spread out to the cooler system of ice and water at its constant temperature T of 32 °F (273 K), the melting temperature of ice. Thus, the entropy of the system, which is dQ/T, increases by dQ/273 K. (The heat dQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH for ice fusion.)

It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the ratio (entropy change) of dQ/298 K for the surroundings is smaller than the ratio (entropy change) of dQ/273 K for the ice+water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy was.

As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the dQ/T over the continuous range, “at many increments”, in the initially cool to finally warm water can be found by calculus. The entire miniature ‘universe’, i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that ‘universe’ than when the glass of ice + water was introduced and became a 'system' within it.
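A back-of-the-envelope check of the melting step in this example; the enthalpy-of-fusion figure of roughly 334 J per gram and the 10 g of ice are illustrative assumptions, not numbers from the text above:

```python
# Entropy bookkeeping for the ice-melting example (illustrative numbers).
H_FUSION = 334.0    # J per gram of ice melted (approximate)
MASS     = 10.0     # grams of ice, assumed
T_ICE    = 273.0    # K, melting temperature of ice (32 F)
T_ROOM   = 298.0    # K, room temperature (77 F)

q = H_FUSION * MASS                  # heat dQ that flows from room to ice, J

dS_system       =  q / T_ICE         # entropy gained by the ice + water
dS_surroundings = -q / T_ROOM        # entropy lost by the warmer room
dS_universe     = dS_system + dS_surroundings

print(f"dS(system)       = {dS_system:+7.2f} J/K")
print(f"dS(surroundings) = {dS_surroundings:+7.2f} J/K")
print(f"dS(universe)     = {dS_universe:+7.2f} J/K   (net increase, as claimed)")
```

Because the same heat q is divided by 273 K on the way in and by 298 K on the way out, the gain always exceeds the loss, so the net entropy of this little 'universe' goes up.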




Overview
In a thermodynamic system, a 'universe' consisting of 'surroundings' and 'system' and made up of quantities of matter, pressure differences, density differences, and temperature differences all tend to equalize over time. As shown in the preceding discussion of the illustration involving a warm room (surroundings) and a cold glass of ice and water (system), the difference in temperature begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water. Over time the temperature of the glass and its contents becomes equal to that of the room. The entropy of the room has decreased because some of its energy has been dispersed to the ice and water. However, as calculated in the discussion above, the entropy of the system of ice and water has increased more than the entropy of the surrounding room decreased. This is always true: the dispersal of energy from warmer to cooler always results in an increase in entropy. Thus, when the 'universe' of the room surroundings and the ice and water system has reached an equilibrium of equal temperature, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.

Entropy is often described as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". Such statements should be suspect immediately, because the terms "disorder" and "mixedupedness" are not well defined. The "disorder" of the system as a whole can be formally defined (as discussed below) in a way that is consistent with the realities of entropy, but note that such a definition will almost always lead to confusion. It is only if the word is used in this special sense that a system that is more "disordered" or more "mixed up" on a molecular scale will necessarily also be "a system with a lower amount of energy available to do work" or "a system in a macroscopically more probable state".

The entropy of a thermodynamic system can be interpreted in two distinct, but compatible, ways:

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted simply as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat (TR is the temperature of the system's external surroundings). Otherwise the process will not go forward.
From a microscopic perspective, in statistical thermodynamics the entropy is envisioned as a measure of the number of microscopic configurations that are capable of yielding the observed macroscopic description of the thermodynamic system. A more "disordered" or "mixed up" system can thus be formally defined as one which has more microscopic states compatible with the macroscopic description; however, this definition is not standard and thus prone to confusing people. It can be shown that this definition of entropy, sometimes referred to as Boltzmann's postulate, reproduces all of the properties of the entropy of classical thermodynamics.
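Put as formulas, the two perspectives above amount to the following (same symbols as in the text; the bound on useful work is the standard reading of the macroscopic statement, not a quotation from it):

```latex
% Macroscopic: of the energy \Delta E given up by the system, at least
% T_R \Delta S must be discarded as unusable heat, so
W_{\text{useful}} \le \Delta E - T_R\,\Delta S .

% Microscopic (Boltzmann's postulate): entropy counts the microstates
% \Omega compatible with the macroscopic description,
S = k_B \ln \Omega .
```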
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Unlike almost all other laws of physics, this associates thermodynamics with a definite arrow of time. However, for a universe of infinite size, which cannot be regarded as an isolated system, the second law does not apply.



History

The short history of entropy begins with the work of mathematician Lazare Carnot, who in his 1803 work Fundamental Principles of Equilibrium and Movement postulated that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, in which he set forth the view that in all heat-engines "caloric", or what is now known as heat, moves from hot to cold and that "some caloric is always lost". This lost caloric was an early precursor of the modern concept of entropy. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics. In the 1850s, Rudolf Clausius began to give this "lost caloric" a mathematical interpretation by questioning the nature of the inherent loss of heat when work is done, e.g. heat produced by friction.[1] In 1865, Clausius gave this heat loss a name:[2]

I propose to name the quantity S the entropy of the system, after the Greek word τροπή, the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.

Later, scientists such as Ludwig Boltzmann, Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.


Thermodynamic definition

In the early 1850s, Rudolf Clausius began to put the concept of "energy turned to waste" on a differential footing. Essentially, he set forth the concept of the thermodynamic system and advanced the argument that in any irreversible process a small amount of heat energy dQ is incrementally dissipated across the system boundary.

Specifically, in 1850 Clausius published his first memoir, in which he presented a verbal argument as to why Carnot's theorem, proposing the equivalence of heat and work, i.e. Q = W, was not perfectly correct and as such would need amendment. In 1854, Clausius stated: "In my memoir 'On the Moving Force of Heat, &c.', I have shown that the theorem of the equivalence of heat and work, and Carnot's theorem, are not mutually exclusive, but that, by a small modification of the latter, which does not affect its principle, they can be brought into accordance." This small modification of the latter is what developed into the second law of thermodynamics.

In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. "those which the atoms of the body exert upon each other", and exterior work, i.e. "those which arise from foreign influences to which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three types of heat by which Q may be divided:

heat employed in increasing the heat actually existing in the body
heat employed in producing the interior work
heat employed in producing the exterior work
Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presents the first-ever mathematical formulation of entropy, although at this point in the development of his theories he calls it "equivalence-value". He states that "the second fundamental theorem in the mechanical theory of heat may thus be enunciated:"[3]
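The statement itself appears to have been lost in the copy-and-paste. As a hedged reconstruction in modern notation, Clausius's 1854 "equivalence-values" are usually presented along the following lines (not a verbatim quotation of his theorem):

```latex
% Equivalence-value of producing the quantity of heat Q from work
% at the temperature T:
\frac{Q}{T}

% Equivalence-value of the passage of the quantity of heat Q from
% temperature T_1 to temperature T_2:
Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)
```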


This is the first-ever mathematical formulation of entropy; at this point, however, Clausius had not yet attached the label "entropy" to the concept as we currently know it; that name would come in 1865, as quoted above.

In 1876, the American scientist Willard Gibbs, building on the work of Clausius and Hermann von Helmholtz, advanced the view that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell [1871] and Max Planck [1903].
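In symbols, this is the familiar free-energy relation (with ΔH the total energy (enthalpy) change, T the absolute temperature, and ΔS the entropy change):

```latex
\Delta G = \Delta H - T\,\Delta S ,
\qquad
\Delta G < 0 \;\Rightarrow\; \text{the process can proceed spontaneously}
```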

Units and symbols
Conjugate variables of thermodynamics: pressure–volume, temperature–entropy, chemical potential–particle number.


Entropy is a key physical variable in describing a thermodynamic system. The SI unit of entropy is the joule per kelvin (J·K⁻¹), which is the same as the unit of heat capacity, and entropy is said to be thermodynamically conjugate to temperature. The entropy depends only on the current state of the system, not on its detailed previous history, and so it is a state function of the parameters, such as pressure and temperature, which describe the observable macroscopic properties of the system. Entropy is usually symbolized by the letter S.

There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat. Otherwise the process will not go forward. (TR is the temperature of the system's external surroundings, which may not be the same as the system's current temperature T.)

2006-07-17 04:20:37 · answer #10 · answered by vishal 3 · 0 0
