I don't want to go into thermodynamic equations and theory; other answerers have already covered those at length and in detail.
In my understanding, entropy is a measure of "what is not available": specifically, the amount of heat energy that is not available for conversion into useful work.
A simple example: imagine light (heat energy) entering a room. Yes, that room gets illuminated (the ideal work). But if there is some obstruction in the light's path (the actual heat loss), you see it as a shadow (entropy), and that much light (the measure of entropy) is not available for illumination.
Entropy applies to all of life, not to thermal engineering alone. For example, researchers have even calculated an entropy for the mutual love between people.
2007-04-01 05:49:26 · answer #1 · answered by babu n 2 · 0⤊ 0⤋
The concept of entropy (Greek: εν (en = inside) + the verb τρέπω (trepo = to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved. Entropy change has often been defined as a change to a more disordered state at the molecular level. In recent years, entropy has been interpreted in terms of the "dispersal" of energy. Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
2007-03-29 04:58:38 · answer #2 · answered by lucreciacresent 2 · 0⤊ 0⤋
A measure of the disorganization or degradation in the universe that reduces available energy, or of the tendency of available energy to dwindle. Chaos, the opposite of order.
2007-03-29 06:35:57 · answer #3 · answered by Rahul 3 · 0⤊ 0⤋
The concept of entropy (Greek: εν (en = inside) + the verb τρέπω (trepo = to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved. Entropy change has often been defined as a change to a more disordered state at the molecular level. In recent years, entropy has been interpreted in terms of the "dispersal" of energy. Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
Quantitatively, entropy, symbolized by S, is defined by the differential quantity dS = δQ / T, where δQ is the amount of heat absorbed in an isothermal and reversible process in which the system goes from one state to another, and T is the absolute temperature at which the process is occurring.[3] Entropy is one of the factors that determines the free energy of the system.
This thermodynamic definition of entropy is only valid for a system in equilibrium (because temperature is defined only for a system in equilibrium), while the statistical definition of entropy (see below) applies to any system. Thus the statistical definition is usually considered the fundamental definition of entropy.
When a system's energy is defined as the sum of its "useful" energy (e.g. that used to push a piston) and its "useless" energy (i.e. energy which cannot be used for external work), then entropy may be most concretely visualized as the "scrap" or "useless" energy, whose share of the system's total energy is directly proportional to the absolute temperature of the system. (Note the product TS in the Gibbs free energy and Helmholtz free energy relations.)
In terms of statistical mechanics, the entropy describes the number of the possible microscopic configurations of the system. The statistical definition of entropy is the more fundamental definition, from which all other definitions and all properties of entropy follow. Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, thermoeconomics, and evolution.[4][5][6]
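As a rough illustration of the statistical definition above, Boltzmann's formula S = k_B ln Ω relates entropy to the number of possible microscopic configurations Ω. The sketch below is a minimal toy model, not from the answer itself: the choice of N two-state particles with equally likely microstates is an assumption made purely for illustration.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega: float) -> float:
    """S = k_B * ln(Omega): entropy from a count of microstates."""
    return k_B * math.log(omega)

# Toy system: N two-state particles, all microstates equally likely,
# so Omega = 2**N. Doubling N doubles S, showing entropy is extensive.
N = 100
S = boltzmann_entropy(2.0 ** N)
S_double = boltzmann_entropy(2.0 ** (2 * N))

print(S)             # ~9.57e-22 J/K
print(S_double / S)  # 2.0: entropy scales with system size
```

Because S depends on ln Ω rather than Ω itself, multiplying the microstate count of independent subsystems corresponds to adding their entropies, which is why the statistical definition reproduces the extensive state function of thermodynamics.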
Ice melting example
The illustration for this article is a classic example in which entropy increases in a small 'universe', a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' (glass, ice, cold water). In this universe, some heat energy δQ from the warmer room surroundings (at 298 K, or 25 °C) will spread out to the cooler system of ice and water at its constant temperature T of 273 K (0 °C), the melting temperature of ice. The entropy of the system will change by the amount dS = δQ/T, in this example δQ/273 K. (The heat δQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH for ice fusion.) The entropy of the surroundings will change by an amount dS = −δQ/298 K. So in this example, the entropy of the system increases, whereas the entropy of the surroundings decreases.
It is important to realize that the decrease in the entropy of the surrounding room is smaller than the increase in the entropy of the ice and water: the room temperature of 298 K is larger than 273 K, and therefore the entropy change δQ/298 K for the surroundings is smaller in magnitude than the entropy change δQ/273 K for the ice+water system. To find the entropy change of our 'universe', we add up the entropy changes of its constituents: the surrounding room and the ice+water. The total entropy change is positive; this is always true of spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.
As the temperature of the cool water rises to that of the room and the room further cools imperceptibly, the sum of the δQ/T over the continuous range, at many increments, in the initially cool to finally warm water can be found by calculus. The entire miniature "universe", i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that "universe" than when the glass of ice water was introduced and became a "system" within it.
2007-03-29 06:30:34 · answer #4 · answered by sagarukin 4 · 0⤊ 0⤋
A measure of the disorganization or degradation in the universe that reduces available energy, or of the tendency of available energy to dwindle. Chaos, the opposite of order.
2007-03-29 04:55:12 · answer #5 · answered by Leonidas 1 · 0⤊ 0⤋
Energy tends toward disorder. Laser light is highly organized energy; heat is the most disorganized form of energy, and organized energy ultimately turns into it. It is interesting that in the process of making a laser beam, each step produces waste energy, and it is always in the form of heat. From burning coal to make steam, to making electricity, to powering flashlamps, to producing the laser light, each stage has waste, and it is always heat.
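The cascade described above can be sketched by multiplying stage efficiencies. The specific numbers below are hypothetical round figures chosen only to illustrate how losses compound; the answer gives no actual efficiencies.

```python
# Cascaded energy conversion: waste heat appears at every stage.
# All stage efficiencies here are hypothetical illustrative values.
stages = {
    "coal -> steam": 0.85,
    "steam -> electricity": 0.40,
    "electricity -> flashlamp light": 0.50,
    "flashlamp -> laser light": 0.02,
}

overall = 1.0
for name, efficiency in stages.items():
    overall *= efficiency

waste_fraction = 1.0 - overall  # everything not in the beam ends up as heat

print(f"overall efficiency: {overall:.2%}")       # 0.34%
print(f"fraction lost as heat: {waste_fraction:.2%}")  # 99.66%
```

Even with generous numbers at the early stages, the product shrinks quickly, which matches the answer's point: almost all of the input energy is shed as heat along the way.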
2007-03-29 05:10:21 · answer #6 · answered by ZORCH 6 · 0⤊ 0⤋