Entropy is defined in a chemistry textbook as:
A thermodynamic function associated with the number of different, equivalent energy states or spatial arrangements in which a system may be found. It is a thermodynamic state function, which means that once we specify the conditions for a system - that is, the temperature, pressure, and so on - the entropy is defined.
Essentially, entropy measures the disorder of a system.
A very random system will have high entropy, while a very ordered system will have low entropy.
Often we do not care about the exact numerical value of the entropy, but rather about the change in entropy associated with a given process.
In the universe as a whole, the total entropy is always increasing, even though entropy might be observed to decrease locally.
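That "number of equivalent states" picture is usually written as Boltzmann's formula, S = k_B ln W. Here is a minimal sketch in Python (the microstate counts below are just toy numbers, not anything from the definition above):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K


def boltzmann_entropy(W):
    """Entropy from the number of equally likely microstates W: S = k_B * ln(W)."""
    return k_B * log(W)


# Toy example: doubling the number of accessible states raises S by k_B * ln(2).
S1 = boltzmann_entropy(1_000_000)
S2 = boltzmann_entropy(2_000_000)
print(S2 - S1, k_B * log(2))  # the change in entropy is what usually matters
```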
2006-11-01 10:59:30 · answer #2 · answered by mrjeffy321
Gosh, entropy.
I struggled with this question for a good 4 years before I took a course that explained it to my satisfaction.
The deal is that, back in the 1700s and 1800s, when they were first figuring out the rules of thermodynamics, early physicists were discovering parameters that they could use to describe a system (like gas in a pressurized cylinder, or water getting heated up, or ...): energy, Gibbs free energy... stuff like that. Some of these parameters were more useful than others depending on which thermodynamic system they were trying to describe. Anyway, one of them was called "entropy". They discovered that if you're talking about isolated systems (nothing going in or out), the "entropy" will always increase.
Okay, great. What is it?
So for the longest time, no one knew. All they knew was that it went up over time in isolated systems.
So near the turn of the 20th century, a new field of physics was invented, called "statistical mechanics". Thermodynamics just described the behaviour of gasses and liquids as they were pressurized and heated and changed states and stuff, but did not EXPLAIN why everything worked the way it did; statistical mechanics sought to explain WHY gasses and liquids follow the rules of thermodynamics.
What they did was kind of cool. They said "hey, gasses are made out of little particles, and each one is like a little ball, obeying the rules of Newtonian mechanics (momentum is conserved, energy is conserved) as it bounces around." Which is cool. If they knew exactly what the momentum and position of each particle was at some specific time, they could predict the trajectories of all the particles for the rest of time: like pool balls on a pool table. The problem, of course, is that there are like 10^24 or more particles. That's WAY too many to keep track of. So instead of keeping track of each and every particle, "statistical mechanics" sought to keep track of averages: the average kinetic energy of each particle, the average momentum, the average mass, the average density... etc. And it turns out that looking at the setup in terms of averages still gives us some pretty useful predictive information.
For example: pressure. The pressure is the force exerted by a gas (or liquid or whatever) over a unit area. What causes it in the statistical mechanical picture? Well, let's say we have a gas exerting pressure on a wall. In statistical mechanics, we imagine that the gas is made up of little balls, each one with mass m and some average kinetic energy E = 1/2 m (U^2 + V^2 + W^2), where U, V, W are the average velocities in the X, Y, and Z directions respectively. Okay, well, from this, we can talk about our average ball having a momentum in the X direction. If this ball bounces off the wall, the momentum will change sign: thus the wall applied a force to it, and it applied a force to the wall. The amount of force clearly depends on the average kinetic energy, right? Okay. So. If we know the density of the gas, we can tell how many balls bounce off the wall per unit time, per unit area, ON AVERAGE. So we can relate the pressure on the wall to the density and energy of the gas! The higher the temp, the more pressure!
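Here's a minimal sketch of that picture (the particle count, mass, temperature, and volume are made-up toy values): draw random velocities for a box of "little balls", and the kinetic-theory relation P = (1/3)(N/V) m <v^2> lands right on the ideal-gas pressure N k_B T / V.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy numbers: N argon-like atoms in a 1-litre box at 300 K.
N = 1_000_000      # number of "little balls" we track (real gasses have ~10^24)
m = 6.6e-26        # mass of one particle, kg
T = 300.0          # temperature, K
V = 1e-3           # volume, m^3
k_B = 1.380649e-23 # Boltzmann constant, J/K

# Each velocity component is normally distributed with variance k_B*T/m
# (the Maxwell-Boltzmann distribution), so <v^2> averages to 3*k_B*T/m.
v = rng.normal(0.0, np.sqrt(k_B * T / m), size=(N, 3))
mean_v2 = np.mean(np.sum(v**2, axis=1))

# Kinetic theory: pressure = momentum delivered to the walls per unit area per unit time.
P_kinetic = (N / V) * m * mean_v2 / 3.0
P_ideal_gas = N * k_B * T / V  # ideal-gas law, for comparison

print(P_kinetic, P_ideal_gas)  # essentially equal (tiny absolute value, since N is only 10^6 here)
```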
Okay, so that's all well and good. Let's talk about ENTROPY.
So there was a question asked: "what is the probability that, at some specific time, our system will be in some given, specific arrangement?" Let's talk in really simple terms here. Suppose we have arrows, and 50% of the time they're pointing up and 50% of the time they're pointing down. Let's label up with the letter u, and down with the letter d.
So let's say that we have a system of 3 arrows. What is the probability that the system will be in the configuration
u, d, u?
Well, how many different configurations are there? Let's count them:
u u u
u u d
u d u
u d d
d u u
d u d
d d u
d d d
EIGHT! There are 8.
So the probability that the system will be configured like (u d u) is one in eight (1/8).
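You can let the computer do that counting (a little Python sketch; the u/d labels are the same as above):

```python
from itertools import product

# Every way 3 arrows can point, each one independently "u" or "d"
configs = list(product("ud", repeat=3))
print(len(configs))                          # 8

target = ("u", "d", "u")
print(configs.count(target) / len(configs))  # 1/8 = 0.125
```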
Obviously, if I added one more arrow, the chance of the system being in any one configuration (u d u u, for instance) will decrease... the more arrows, the less probable any single configuration is.
But we can still discuss general trends. For instance, let's look at the list of all 16 configurations of 4 arrows:
u u u d
u u d d
u d u d
u d d d
d u u d
d u d d
d d u d
d d d d
u u u u
u u d u
u d u u
u d d u
d u u u
d u d u
d d u u
d d d u
And we can ask questions like: "what are the odds that there will be exactly one "down" among the two arrows on the left and exactly one "up" among the two on the right?" (4/16), or "what are the odds that there will be two "up"s or two "down"s adjacent to one another?" (14/16). The point I'm trying to make right now is that asking more general questions about the appearance of some feature in the system gives you probabilities that are not really small. And if we start talking about bigger systems, these kinds of general questions are the only ones that won't give really small probabilities.
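Same idea by brute force (a sketch; "left" and "right" here mean the first two and last two arrows):

```python
from itertools import product

configs = list(product("ud", repeat=4))  # all 16 arrangements of 4 arrows
total = len(configs)

# Exactly one "d" among the left two arrows AND exactly one "u" among the right two
q1 = sum(1 for c in configs
         if c[:2].count("d") == 1 and c[2:].count("u") == 1)

# Two "u"s or two "d"s adjacent somewhere in the row
q2 = sum(1 for c in configs
         if any(c[i] == c[i + 1] for i in range(3)))

print(q1, "/", total)  # 4 / 16
print(q2, "/", total)  # 14 / 16
```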
(I know it doesn't seem like I'm going anywhere, but bear with me. The explanation's almost done.)
Okay, so, if we start talking about really, really large systems...
u u u u u u u u u u u u u u u u u d d d d d d d d d d d d d d d d
say... and I try to ask a question like "what are the odds that there will be 5 or more extra "u"s than "d"s in the left half, and 5 or more extra "d"s than "u"s in the right half?", you'll see that even though there are lots of configurations which satisfy this description, there are far, far more configurations which don't: so the odds of this criterion being satisfied are pretty low.
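A quick way to check that claim (a sketch assuming a 32-arrow system, 16 per side, with the thresholds above) is to count configurations with binomial coefficients:

```python
from math import comb

n_half = 16                # 16 arrows on each side, 32 total
total = 2 ** (2 * n_half)  # 2^32 configurations overall

# Left half: at least 5 more u's than d's  ->  at least 11 u's out of 16
left_ok = sum(comb(n_half, k) for k in range(11, n_half + 1))
# Right half: at least 5 more d's than u's ->  at most 5 u's out of 16
right_ok = sum(comb(n_half, k) for k in range(0, 6))

matching = left_ok * right_ok  # the two halves are independent
print(matching, total, matching / total)  # ~47 million of ~4.3 billion, about 1%
```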
This is the statistical mechanical description of "entropy": if we list all of the possible configurations a system can take, and we try to measure some averaged quantity of the system, the value we measure will almost certainly be the one produced by the largest number of configurations.
For this reason, increasing entropy is frequently described as "the move from order to disorder".
Look at it this way. Take a bunch of molecules of food colouring and drop them in a glass of water. Initially, they're all in a tight blob in one corner of the volume of water... but given enough time, there will be a uniform probability that any one molecule of dye is at any point in the glass. So then we can treat the position of each of the molecules of dye statistically! When we count ALL of the different configurations of all the dye molecules in the glass of water, we see that the number of configurations where all of the dye molecules are grouped in a small drop in the corner of the glass is very small compared to the number of configurations where the dye molecules are spread out through the glass... thus, the statistics of the system make the dye spread out through the glass! Order -> disorder!
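To put rough numbers on that (a toy sketch: pretend the glass is a lattice of sites and the "blob" is a small patch of them; the site and molecule counts below are made up):

```python
from math import comb, log10

k = 20                 # number of dye molecules (toy value)
cells_blob = 100       # lattice sites inside the tight blob (assumed)
cells_glass = 100_000  # lattice sites in the whole glass (assumed)

ways_blob = comb(cells_blob, k)    # configurations with all the dye stuck in the blob
ways_glass = comb(cells_glass, k)  # configurations with the dye anywhere in the glass

# The spread-out arrangements outnumber the clustered ones astronomically:
print(f"spread-out outnumbers clustered by ~10^{log10(ways_glass / ways_blob):.0f}")
```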
Similar explanations work for why, if you start with a bar of iron with one side really hot and the other really cold, the bar will end up with a uniform temperature after enough time has passed.
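A toy version of the iron bar (purely illustrative: energy quanta hopping at random between neighbouring sites of a short bar):

```python
import random

random.seed(1)

# A short bar of 20 sites; the left half starts "hot" (lots of energy quanta),
# the right half starts "cold" (none).
bar = [50] * 10 + [0] * 10

# Let one quantum hop from a randomly chosen site to a random neighbour, many times.
for _ in range(200_000):
    i = random.randrange(len(bar))
    if bar[i] == 0:
        continue  # nothing to move from an empty site
    j = i + random.choice([-1, 1])
    if 0 <= j < len(bar):
        bar[i] -= 1
        bar[j] += 1

print(bar)  # roughly 25 quanta everywhere: the overwhelmingly most likely arrangement
```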
Okay. So that's what entropy is.
2006-11-01 12:34:51 · answer #4 · answered by BenTippett