Entropy, as a thermal quantity, is a measure of the energy that is not converted into work during a process, and hence of the efficiency of that process (dS = dq/T, for heat dq transferred reversibly at temperature T).
Entropy is also a measure of the tendency of a process to occur spontaneously/irreversibly, as defined by Boltzmann (S = k log W). Here S is determined by W, the number of possible microstates of the system: the more microstates, the higher the entropy, and the more likely a spontaneous process is to occur.
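As a quick sketch (my own illustration, not part of the original answer), here is the Boltzmann formula in Python, with W taken as a bare count of microstates and the numbers purely illustrative:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(W):
        # S = k * ln(W) for a system with W equally likely microstates
        return k_B * math.log(W)

    print(boltzmann_entropy(10))    # few microstates -> tiny entropy
    print(boltzmann_entropy(1e23))  # many microstates -> larger entropy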
The BIGGEST misconception about entropy, the oft-repeated lie if you will, is that it has anything to do with the "order" or "disorder" of a physical system. There are several physical processes that exhibit an increase in order driven by an INCREASE in entropy. Self-assembly of molecules on surfaces, for instance, is often entropically driven.
Many chemical reactions are entropically driven and result in an increase in local order (e.g. a bigger, more complex molecule).
The old entropy = disorder relation probably stems from the fact that the entropy of an isolated system always increases during any physical process. Since the only truly isolated system is the universe itself, this leads to the idea that the universe might "wear down".
Entropy causes as much order and structure in the universe as it does disorder. I do not believe that data (information) entropy is the same as thermal entropy, even though people deliberately link entropy to data disorder. And even if it is, I can find just as many examples where entropy gives rise to order, not disorder.
answer #1 · answered by DrSean · 2006-08-08 01:31:54
Enthropy
answer #2 · answered by vignola · 2016-09-28 07:16:25
Entropy is the degree of disorder of a chemical or physical system. If the energy of the system is in the form of macroscopic movements that can be described by a few parameters, the entropy is low. If it is in the form of microscopic movements, where you need one or more parameters for each individual particle to describe the system accurately, the entropy is high.
Entropy is measured in joules per Kelvin, i.e. energy units per temperature unit.
Suppose you push a snooker ball so that it acquires an energy of one joule. At first, the entropy is low because the movement can be described by two parameters, namely the ball's velocity in the x- and y-direction (imagine a coordinate system on the snooker table).
Friction slows the ball down, so the orderly energy is converted into microscopic (heat) energy: to describe that accurately, you would need to describe the thermal oscillations of every molecule making up the ball.
The entropy has risen by 1/293 joules per kelvin, assuming the temperature of the ball is 293 kelvins (20 degrees Celsius).
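To check that arithmetic (a sketch of my own, not part of the answer), ΔS = Q/T for heat Q released at a roughly constant temperature T:

    Q = 1.0    # joules of ordered kinetic energy dissipated by friction
    T = 293.0  # kelvins (20 degrees Celsius)

    delta_S = Q / T
    print(f"entropy increase: {delta_S:.6f} J/K")  # ~0.003413 J/K, i.e. 1/293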
answer #3 · answered by helene_thygesen · 2006-08-07 22:44:08
No, not 'enthropy' - that's not a word.
But entropy is the running down and wearing out of everything in the universe. You and me too. Watch a cup of coffee cool down - you are seeing entropy. Water flows downhill - entropy. See a fire burn down to ash - that's good ol' entropy. Capisce?
answer #4 · answered by MaqAtak · 2006-08-07 22:42:17
Definition of: entropy
Disorder or randomness. In data compression, it is a measure of the amount of non-redundant, non-compressible data in an object (the amount that is not similar to anything else in it). In encryption, it is the amount of disorder or randomness that is added. In software, it is the disorder and jumble of a program's logic that accumulates after it has been modified over and over.
http://www.pcmag.com/encyclopedia_term/0,2542,t=entropy&i=42666,00.asp
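As a rough illustration of the data-compression sense (my sketch, not from the PCMag entry), Shannon entropy measures the average number of bits per symbol needed to encode a message; the higher it is, the less redundancy there is to squeeze out:

    import math
    from collections import Counter

    def shannon_entropy(data):
        # H = sum over symbols of p * log2(1/p), in bits per symbol
        counts = Counter(data)
        n = len(data)
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: fully redundant
    print(shannon_entropy("abcdefgh"))  # 3.0 bits: maximally varied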
answer #5 · answered by nyack · 2006-08-08 01:25:54
Entropy is a scientific word for CHAOS. Everything in this world - you and me, this Earth, the Sun, the universe - is getting older and will eventually die. Entropy is increasing, what with all the trouble happening these days!!! Israel's attack... damn!!! Earth is getting hotter!!!! Check your physics book for the 2nd Law of Thermodynamics.
answer #6 · answered by **naDEshiKO** · 2006-08-08 16:09:29
http://education.yahoo.com/reference/encyclopedia/entry/entropy
Check out this site; it explains entropy simply.
Entropy is basically the amount of disorder a system has.
answer #7 · answered by xtra-great-gal · 2006-08-07 23:23:36