While I have some qualitative notion of what entropy is from reading around, I have no idea why dS = dQ/T exactly. Specifically, why do you divide by T? That is, why is entropy exactly inversely proportional to temperature?
Is this just an arbitrary definition, or does it actually predict some phenomena? If so, what quantitative scientific observation does it predict?
2007-07-13 21:23:05 · 7 answers · asked by lainwired 1 in Science & Mathematics ➔ Physics
I'm not sure I follow Escuerdo's logic, which seems the most plausible. While it might be true that dS/dQ must decrease with T, that doesn't seem to imply why it would be exactly inversely related, i.e. why not 1/T^2, or e^(1/T)?
Currently I reason about S = Q/T like this: given Q amount of energy, it becomes more difficult to increase S (entropy) as temperature increases.
That is to say, it's harder to increase the entropy of a system at 200 K than it is to increase the entropy of that same system at 100 K. Harder in this case means it requires more energy.
However, I have no idea why this is true. Escuerdo tries to explain T in terms of S, but I think I already have a decent idea of what temperature is; what I'm fuzzy on is what S is in relation to T and Q.
2007-07-14 10:04:12 · update #1
The equations match what happens in the real world; there'd be no point in having equations if they didn't stand for something.
Q is the heat transferred to the system during a reversible process that leads from the first state to the second, and T is the absolute temperature, which does not change during such a process.
The entropy of the system is then increased by ΔS = Q/T.
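A quick numeric sketch of that formula (the numbers here are made up for illustration, not taken from the answer):

```python
# Entropy change for a reversible isothermal process: dS = dQ / T.
Q = 1000.0   # heat transferred into the system, in joules (illustrative)
T = 300.0    # absolute temperature in kelvin, constant during the process

delta_S = Q / T
print(f"dS = {delta_S:.3f} J/K")  # prints "dS = 3.333 J/K"
```

The same heat delivered at a higher temperature would produce a smaller entropy change, which is exactly the asker's "harder at higher T" intuition.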
2007-07-13 21:37:56 · answer #1 · answered by Anonymous · 0⤊ 1⤋
Short answer: dS = dQ/T is a definition, but of temperature, not of entropy. It's also not an arbitrary one, because it matches our intuitive notion of temperature.
Long answer: originally, dS = dQ/T actually was simply used to define entropy in terms of temperature and heat.
However, you're right to note that this is not the fundamental definition used anymore even though the equation still holds true. The definition people use for entropy now is S=k*ln(Ω) where k is Boltzmann's constant (to make the units work with the old definition) and Ω is the number of possible microstates a system in a given macrostate could occupy (classically this would be infinite, but quantum mechanics reduces the number of independent states to a finite, albeit large, value).
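The statistical definition S = k·ln(Ω) can be played with numerically. A small sketch (the microstate counts are arbitrary; only the logarithmic relationship matters):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy(omega: float) -> float:
    """Statistical entropy S = k * ln(omega) for omega accessible microstates."""
    return k_B * math.log(omega)

# Doubling the number of microstates adds k_B * ln(2) to S;
# it does not double S. That's the logarithm at work.
S1 = entropy(1e6)
S2 = entropy(2e6)
print(S2 - S1)  # approximately k_B * ln(2)
```

This additive behavior is also why entropies of independent systems add while their microstate counts multiply.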
The relation dS=dQ/T is still a definition, though. It's just used to define temperature now instead of entropy. But it's still not an arbitrary definition. We all have an intuitive idea of what temperature is. It's a measure of the tendency of heat to flow from one object into its neighbors. If two objects are in contact, the one with the higher temperature will lose thermal energy to the one with the lower temperature until they're equal.
So we still have to show that this definition of temperature matches up with this concept.
Imagine we have two objects that are exchanging heat with one another. If this is a closed system then ΔQ in one of the objects will be the same as -ΔQ in the other. The one that's losing energy will also be undergoing a decrease in entropy (but a local one only, the system's total entropy will still increase), which means that the signs for ΔS are opposite also, so the negatives cancel, and we can consider ΔQ the same for each system.
If the second law of thermodynamics is true, then the energy must flow from the object with the lower value of ΔS/ΔQ in order for the total entropy to increase (because the total entropy is the sum of the entropy of each object).
But this means that since energy always flows from an object with a lower dS/dQ to one with a higher value, dS/dQ must be inversely related to temperature.
dS/dQ=C/T where C is some constant. As it happens, we've just set C=1 and used this as the definition of temperature.
So just re-arrange the differentials in dS/dQ=1/T and you get the relation dS=dQ/T, and that's pretty much all there is to it.
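The two-object argument above can be checked with numbers. A minimal sketch (the temperatures are arbitrary):

```python
# Heat dQ leaves the hot object (entropy change -dQ/T_hot) and enters the
# cold one (entropy change +dQ/T_cold). Since dS/dQ = 1/T, the colder
# object has the larger dS/dQ, so total entropy increases.
T_hot, T_cold = 400.0, 300.0   # kelvin (illustrative)
dQ = 1.0                       # joules transferred hot -> cold

dS_total = dQ / T_cold - dQ / T_hot
print(dS_total > 0)  # prints "True": flow toward higher dS/dQ raises total S
```

Running the same check with the flow reversed (cold to hot) gives a negative total, which is what the second law forbids.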
It connects with the modern definition of entropy S=k*ln(Ω) because this definition implies that the 2nd law of thermodynamics should statistically be expected to hold. The longer a closed system is allowed to evolve, the more possible states it has available to occupy.
ADDENDUM: I just read the additional information to this question, and will give my best shot at elucidating it further.
Logically speaking, it could be 1/T^2 or 1/ln(T) just as well as 1/T, and my argument didn't distinguish among them. But imagine it were something like 1/T^2: then the definition of entropy would have the wrong units.
dS/dQ has units of entropy/energy, and entropy has units of energy/temperature.
But 1/T^2 has units of 1/temperature^2, obviously, so this wouldn't fit.
There is actually a bit more to it. Remember that dS/dQ=1/T was originally the definition of entropy, and was later used to formalize the definition of temperature when entropy was re-defined. This means that the definition S=k*ln(Ω) had to be made to fit it. It's obvious that Boltzmann's constant fixes the units, but why there's no other constant and why the logarithm is necessary, I'm not sure off the top of my head (a complete derivation would probably show it, but I don't have one available at the moment). I will need to think on this a bit before I can elaborate on that part.
2007-07-14 00:40:17 · answer #2 · answered by Escuerdo 3 · 1⤊ 0⤋
Ok, I just took a test on this last week. Based on my understanding of entropy, here's what I can say:
The entropy of a system is a quantitative measure of its disorder. Therefore, if an amount of heat Q flows into a system at absolute temperature T, the entropy change of the system is: ΔS = Q/T (with units of J/K)
This means that heat ENTERING the system INCREASES the system's entropy whereas heat LEAVING the system DECREASES the system's entropy.
Thus, the flow of heat from a hotter system to a colder system causes an increase in the total entropy of the two systems. Every irreversible process increases the total entropy of the universe. A process that would decrease the total entropy is impossible. A reversible process causes no change in the total entropy of the universe.
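The three cases in that paragraph (irreversible, impossible, and reversible) can be summarized in one small helper. A sketch with made-up temperatures, where `total_entropy_change` is a hypothetical name:

```python
def total_entropy_change(Q: float, T_from: float, T_to: float) -> float:
    """Entropy change of the universe when heat Q flows from T_from to T_to."""
    return Q / T_to - Q / T_from

print(total_entropy_change(100.0, 500.0, 300.0))  # hot -> cold: positive (irreversible)
print(total_entropy_change(100.0, 300.0, 500.0))  # cold -> hot: negative (impossible)
print(total_entropy_change(100.0, 300.0, 300.0))  # equal temperatures: zero (reversible)
```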
The second law of thermodynamics states:
The entropy of the universe never decreases.
I hope this clears up some misconceptions and makes it easier to understand. It was a bit confusing for me at first, but it just takes time to get used to it :) Good luck.
2007-07-13 22:23:33 · answer #3 · answered by Mark 1 · 0⤊ 0⤋
I think there is a need to simplify the situation, but we should not over simplify it. Let me think about it.
Yes, as mentioned by another answerer, it is best to think of the temperature T as being, by definition, the energy Q divided by the entropy S. Let's see why this definition makes sense. For a system made of independent particles, such as an ideal gas, the entropy of the system is proportional to the number of particles in the system. So, this definition says that the temperature is proportional to the amount of energy per particle. If you disperse a given amount of energy over more particles, it gets colder. This is consistent with our intuitive notion of temperature. So, by definition, T = Q/S, which means S = Q/T. In particular, if T is fixed, ΔS = ΔQ / T. This answers the question.
In case you want to know more, here is why the entropy is proportional to the number of particles in the system. Except for a constant that depends on the choice of unit, the entropy is defined as S = lg(Ω) where Ω is the total number of possible states of the system. If there are N particles in the system, Ω = ω^N, where ω is the number of possible states of each particle. We obtain S = lg( ω^N ) = N lg(ω). You can think of lg(ω) as being the entropy of a single particle in the system.
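The extensivity claim above, S = lg(ω^N) = N·lg(ω), is easy to verify numerically. A minimal sketch (N and ω are arbitrary, and base-2 logs are used as in the answer):

```python
import math

# N independent particles, each with omega accessible states,
# so the system has Omega = omega**N states in total.
N, omega = 50, 4

S_total = math.log2(omega ** N)     # entropy of the whole system, lg(Omega)
S_per_particle = math.log2(omega)   # entropy of one particle, lg(omega)

print(math.isclose(S_total, N * S_per_particle))  # prints "True"
```

States multiply while entropies add, which is the whole point of taking a logarithm.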
2007-07-20 22:52:11 · answer #4 · answered by My account has been compromised 2 · 0⤊ 0⤋
Wow...lots of complicated answers to a relatively simple question.
Here's the answer that's eluded you:
Temperature (T) is the "concentration of energy"
Energy (Q) is self-defining
Entropy (S) is Size of energy spread
With this in mind, if you have a fixed amount of energy (Q) and you allow it to spread over twice the original extent (S × 2), then Temperature (T), the concentration of energy, is half what it was. It's algebra.
If you double S, then you either double Q (T remaining constant) or you have to cut T in half (Q remains constant).
This is important because this "formula" is actually a mathematical description of the relationship between Entropy, Energy and Temperature. It might have been easier to see if they had written it this way:
Q/S = T
You can see there is a direct relationship between Q and S. For a constant temperature (T), whatever happens to Q must happen to S. Double your energy and you have to double your "randomness" to maintain the same temperature.
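The bookkeeping in this answer amounts to holding one variable in T = Q/S fixed and seeing what happens to the other two. A sketch with made-up numbers:

```python
# With T = Q / S: doubling S at constant Q halves T,
# and doubling Q at constant T requires doubling S.
Q, S = 600.0, 2.0   # joules and J/K (illustrative values)
T = Q / S           # 300 K

assert Q / (2 * S) == T / 2       # double the spread -> half the temperature
assert (2 * Q) / T == 2 * S       # double the energy at fixed T -> double S
print("consistent")              # prints "consistent"
```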
Hope this clears it up for you.
2007-07-20 16:25:47 · answer #5 · answered by Kevin S 7 · 2⤊ 0⤋
According to the 2nd law of Thermodynamics, all processes tend toward the creation of entropy. In other words, no process can occur and return both its system AND the surroundings to their original state. There will always be something lost to the surroundings, some "disorder" that is created.
2016-12-14 08:29:47 · answer #6 · answered by maiale 4 · 0⤊ 0⤋
I thought entropy was stated as S = k log w. Is my understanding wrong?
2007-07-13 21:58:28 · answer #7 · answered by Andrew 3 · 0⤊ 0⤋