A measure of the disorder in a system. For a down-to-earth example, a messy room has higher entropy than a neat room.
2006-10-01 08:52:05
·
answer #1
·
answered by cchew4 2
·
1⤊
0⤋
Entropy
From Wikipedia, the free encyclopedia
In thermodynamics, entropy, symbolized by S, is a state function of a thermodynamic system defined by the differential quantity dS = dQ / T, where dQ is the amount of heat absorbed in a reversible process in which the system goes from one state to another, and T is the absolute temperature.[1] Entropy is one of the factors that determines the free energy in the system and appears in the second law of thermodynamics. Entropy measures the spontaneous dispersal of energy: how much energy is spread out in a process, or how widely spread out it becomes – at a specific temperature.[2]
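As a rough illustration of dS = dQ / T, here is a minimal Python sketch of the classic ice-melting case (the same example the article uses below). The latent heat and melting temperature are standard textbook values assumed for the sketch, not figures taken from the text:

```python
# Entropy change of melting ice, a sketch of dS = dQ / T.
# Melting is reversible and happens at constant temperature,
# so the integral collapses to Delta S = Q / T.
# Assumed textbook values, not figures from the article:

mass = 1.0           # kg of ice
latent_heat = 334e3  # J/kg, approximate latent heat of fusion of water
T = 273.15           # K, melting point of ice at 1 atm

Q = mass * latent_heat  # heat absorbed reversibly (J)
delta_S = Q / T         # entropy change of the ice (J/K)

print(f"Q = {Q:.0f} J, Delta S = {delta_S:.1f} J/K")  # about 1222.8 J/K
```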
In terms of statistical mechanics, the entropy describes the number of the possible microscopic configurations of the system. The statistical definition of entropy is generally thought to be the more fundamental definition, from which all other important properties of entropy follow. Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, and thermoeconomics.
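The link to information theory mentioned above can be made concrete: Shannon's entropy H = -sum(p log2 p) has the same mathematical form as the Gibbs entropy S = -k_B sum(p ln p) of statistical mechanics. A small sketch, with made-up probability distributions purely for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (maximum uncertainty)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (less uncertainty)
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits (no uncertainty)
```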
"Ice melting" - a classic example of entropy increasingContents [hide]
2006-10-01 15:59:55
·
answer #2
·
answered by SecretUser 4
·
0⤊
0⤋
It is Boltzmann's constant times the logarithm of the number of states that a system can be in.
I accept that this definition is probably not quite what you are after. Think of it as the randomness of a situation: a regular grid of identical objects has low entropy, but a mix of different objects has far more possible combinations and hence a higher entropy.
Given that there are far more high-entropy possibilities than low-entropy ones, any random juggling of a system is almost guaranteed to increase the entropy.
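To make that counting argument concrete, here is a sketch using Boltzmann's formula S = k_B ln W. The 100-coin system is an invented example: each heads/tails pattern is one microstate, and W counts the patterns with a given number of heads:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k_B * ln(W), where W is the number of microstates."""
    return k_B * math.log(W)

N = 100  # hypothetical system: 100 coins; a microstate is one heads/tails pattern
for heads in (0, 25, 50):
    W = math.comb(N, heads)  # arrangements with exactly this many heads
    print(f"{heads:3d} heads: W = {W:.3e}, S = {boltzmann_entropy(W):.3e} J/K")
```

The perfectly ordered all-tails arrangement has W = 1 and so S = 0, while the 50/50 mix has about 10^29 arrangements; a random shuffle is therefore overwhelmingly likely to land in or near the high-entropy states.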
2006-10-02 18:35:26
·
answer #3
·
answered by m.paley 3
·
0⤊
0⤋
Entropy is the amount of disorder in the Universe, and it tends towards a maximum.
2006-10-01 15:57:28
·
answer #4
·
answered by Mike N 2
·
0⤊
0⤋
en·tro·py /ˈɛntrəpi/ [en-truh-pee]
–noun 1. Thermodynamics. a. (on a macroscopic scale) a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
b. (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).
4. a doctrine of inevitable social decline and degeneration.
2006-10-01 15:53:02
·
answer #5
·
answered by Jens 5
·
0⤊
1⤋
Main Entry: entropy
Pronunciation: ˈen-trə-pē
Function: noun
Inflected Form: plural -pies
Etymology: International Scientific Vocabulary en- + Greek tropē change, literally, turn, from trepein to turn
Date: 1875
1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder, that is a property of the system's state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system; broadly : the degree of disorder or uncertainty in a system
2 a : the degradation of the matter and energy in the universe to an ultimate state of inert uniformity b : a process of degradation or running down or a trend to disorder
3 : CHAOS, DISORGANIZATION, RANDOMNESS
–entropic \en-ˈtrō-pik, -ˈträ-pik\ adjective
–entropically \-pi-k(ə-)lē\ adverb
2006-10-01 15:58:09
·
answer #6
·
answered by Wrath Warbone 4
·
0⤊
0⤋
Entropy is the measure of disorder in a system. Confused? So am I!!
2006-10-01 15:56:05
·
answer #7
·
answered by CHARLIEDONTSURF 2
·
0⤊
0⤋
Usually experienced at the family dinner table with husband and kids when trying to ascertain what everyone has been doing for the last eight hours. Or is that atrophy?
2006-10-02 08:41:44
·
answer #8
·
answered by magicharp/wolf 1
·
0⤊
0⤋
Disorder.
2006-10-01 16:29:58
·
answer #9
·
answered by PragmaticAlien 5
·
0⤊
0⤋
It is a measure of disorder in a system - a measure of randomness.
2006-10-01 16:42:24
·
answer #10
·
answered by razorfish_98 3
·
0⤊
0⤋