
13 answers

You could type your question out but couldn't type the word into a web browser?

2006-06-29 22:26:19 · answer #1 · answered by Anonymous · 0 1

Entropy is often described as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". This has led to a popular misinterpretation of the meaning of entropy, because order and disorder are not well-defined terms in science, and their use with entropy does not match the usual use of the words. However, "disorder" can be formally defined in a way that is consistent with the realities of entropy: a system that is more "disordered" or more "mixed up" (on a molecular scale) is equivalently "a system with a lower amount of energy available to do work" or "a system in a macroscopically more probable state". These definitions, however, are not standard, and the word "disorder" cannot be used to describe entropy precisely.

2006-06-29 22:26:40 · answer #2 · answered by Not Tellin 4 · 0 0

A fundamental problem in information theory is to find the minimum average number of bits needed to represent a particular message selected from a set of possible messages. Shannon solved this problem by using the notion of entropy. The word entropy is borrowed from physics, in which entropy is a measure of the disorder of a group of particles. In information theory disorder implies uncertainty and, therefore, information content, so in information theory, entropy describes the amount of information in a given message. Entropy also describes the average information content of all the potential messages of a source. This value is useful when, as is often the case, some messages from a source are more likely to be transmitted than are others.
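
To make that concrete (a minimal sketch of my own, not part of the original answer), the entropy of a source can be estimated from symbol frequencies in Python:

    import math
    from collections import Counter

    def shannon_entropy(message):
        """Average information content, in bits per symbol,
        estimated from the symbol frequencies of the message."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total)
                    for n in counts.values())

    print(shannon_entropy("abcd"))   # 2.0 bits/symbol: all symbols equally likely
    print(shannon_entropy("aaab"))   # ~0.81 bits/symbol: skewed source, fewer bits needed

This matches the intuition above: the more predictable the source, the fewer bits per symbol are needed on average.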

2006-06-30 01:37:42 · answer #3 · answered by sandy 1 · 0 0

ENTROPY:
During a spontaneous process, the entropy of the system keeps increasing. When the system reaches the equilibrium state, the entropy is at a maximum, and no further increase in the entropy of the system is possible. The mathematical condition for the entropy to be a maximum is that the change in entropy is zero (dS = 0).

Entropy is often described as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is".

2006-06-29 23:11:21 · answer #4 · answered by Sherlock Holmes 6 · 0 0

In physics and thermodynamics, the change in entropy is given by the differential dS = dQ/T, where dQ is the amount of heat absorbed reversibly by a thermodynamic system at absolute temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1860s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term based on the Greek entrepein, meaning "to turn inward". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.
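
A quick worked example of that definition (illustrative textbook numbers, not part of the original answer), in Python:

    # Entropy change for a reversible process at constant temperature: dS = dQ / T.
    # Assumed numbers: melting 1 kg of ice at 273.15 K,
    # with a latent heat of fusion of about 334 kJ/kg.
    Q = 334e3        # heat absorbed reversibly, in joules
    T = 273.15       # absolute temperature, in kelvin
    delta_S = Q / T
    print(f"delta S = {delta_S:.0f} J/K")   # roughly 1223 J/K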

2006-06-30 00:10:45 · answer #5 · answered by Anonymous · 0 0

Wikipedia says: In physics and thermodynamics, the change in entropy is given by the differential dS = dQ/T, where dQ is the amount of heat absorbed reversibly by a thermodynamic system at absolute temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1860s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term based on the Greek entrepein, meaning "to turn inward". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.

2006-06-29 22:25:28 · answer #7 · answered by shellyshelly 2 · 0 0

The ratio of the heat absorbed at a particular temperature to that temperature is called the reduced heat.

A body may change from one state to another in an infinite number of ways. This is understood as follows.

If we represent the initial and final states as points in a diagram, we can connect the two points by an infinite number of curves.

The heat absorbed along each path may vary, but the summation of the reduced heat is the same for every path along which the transition takes place.

This summation of reduced heat is defined as the change in entropy.

It is from this definition that entropy is to be understood.

Like internal energy, entropy is defined only up to an arbitrary additive constant; only changes in entropy have physical meaning.
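
A numerical sketch of that path-independence (my own illustration with assumed numbers, not part of the original answer): take one mole of a monatomic ideal gas between the same two states along two different reversible paths.

    import math

    n, R = 1.0, 8.314           # moles; gas constant in J/(mol K)
    Cv = 1.5 * R                # molar heat capacity of a monatomic ideal gas
    T1, T2 = 300.0, 600.0       # initial and final temperature, K
    V1, V2 = 1.0, 2.0           # initial and final volume (arbitrary units)

    # Path A: isothermal expansion at T1, then constant-volume heating to T2.
    Q_A = n * R * T1 * math.log(V2 / V1) + n * Cv * (T2 - T1)

    # Path B: constant-volume heating to T2, then isothermal expansion at T2.
    Q_B = n * Cv * (T2 - T1) + n * R * T2 * math.log(V2 / V1)

    # The entropy change is the same expression for either path.
    dS = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

    print(f"Q_A = {Q_A:.0f} J, Q_B = {Q_B:.0f} J")  # the heats differ (~5470 vs ~7199 J)
    print(f"dS = {dS:.2f} J/K for either path")     # ~14.41 J/K

The heat absorbed differs between the two paths, but the summation of the reduced heat, and hence the entropy change, does not.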

2006-06-30 00:39:53 · answer #8 · answered by Pearlsawme 7 · 0 0

entropy

Main Entry: en·tro·py
Pronunciation: ˈen-trə-pē
Function: noun
Inflected Form(s): plural -pies
Etymology: International Scientific Vocabulary ²en- + Greek tropē change, literally, turn, from trepein to turn
1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder and that is a property of the system's state and is related to it in such a manner that a reversible change in heat in the system produces a change in the measure which varies directly with the heat change and inversely with the absolute temperature at which the change takes place; broadly : the degree of disorder or uncertainty in a system
2 a : the degradation of the matter and energy in the universe to an ultimate state of inert uniformity b : a process of degradation or running down or a trend to disorder
3 : CHAOS, DISORGANIZATION, RANDOMNESS

2006-06-29 22:28:55 · answer #9 · answered by Anonymous · 0 0

en·tro·py (ĕn′trə-pē)
n. pl. en·tro·pies
1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.


--------------------------------------------------------------------------------
[German Entropie : Greek en-, in; see en-² + Greek tropē, transformation; see trep- in Indo-European Roots.]
--------------------------------------------------------------------------------
en·trop′ic (ĕn-trō′pĭk, -trŏp′ĭk) adj.
en·trop′i·cal·ly adv.

Source: The American Heritage® Dictionary of the English Language, Fourth Edition
Copyright © 2000 by Houghton Mifflin Company.
Published by Houghton Mifflin Company. All rights reserved.


en·tro·py (ĕn′trə-pē)
n.

For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
A measure of the disorder or randomness in a closed system.

--------------------------------------------------------------------------------
en·trop′ic (ĕn-trō′pĭk, -trŏp′ĭk) adj.


Source: The American Heritage® Stedman's Medical Dictionary
Copyright © 2002, 2001, 1995 by Houghton Mifflin Company. Published by Houghton Mifflin Company.


Main Entry: en·tro·py
Pronunciation: ˈen-trə-pē
Function: noun
Inflected Form: plural -pies
: a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder and that is a property of the system's state and is related to it in such a manner that a reversible change in heat in the system produces a change in the measure which varies directly with the heat change and inversely with the absolute temperature at which the change takes place; broadly : the degree of disorder or uncertainty in a system —en·tro·pic /en-ˈtrōp-ik, -ˈträp-/ adjective —en·tro·pi·cal·ly /-i-k(ə-)lē/ adverb


Source: Merriam-Webster's Medical Dictionary, © 2002 Merriam-Webster, Inc.


ENTROPY

n 1: (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information" [syn: information, selective information] 2: (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity" [syn: randomness, {S}]


Source: WordNet ® 2.0, © 2003 Princeton University


ENTROPY



A measure of the disorder of a system. Systems tend
to go from a state of order (low entropy) to a state of
maximum disorder (high entropy).

The entropy of a system is related to the amount of
information it contains. A highly ordered system can be
described using fewer bits of information than a disordered
one. For example, a string containing one million "0"s can be
described using run-length encoding as [("0", 1000000)]
whereas a string of random symbols (e.g. bits, or characters)
will be much harder, if not impossible, to compress in this
way.
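
As a small sketch of that idea (my own illustration, not part of the FOLDOC entry), run-length encoding in Python:

    from itertools import groupby

    def rle(s):
        """Run-length encode a string as (symbol, run_length) pairs."""
        return [(ch, len(list(run))) for ch, run in groupby(s)]

    print(rle("0" * 1000000))   # [('0', 1000000)]: one pair describes the whole string
    print(rle("0110100111"))    # many short runs: RLE barely compresses random data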

Shannon's formula gives the entropy H(M) of a message M in
bits:

H(M) = -log2 p(M)

Where p(M) is the probability of message M.

(1998-11-23)



Source: The Free On-line Dictionary of Computing, © 1993-2005 Denis Howe
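
As a quick numeric check of Shannon's formula above (my own example, not part of the FOLDOC entry): a message drawn with probability p(M) = 1/256 carries H(M) = -log2(1/256) = 8 bits of information.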



2006-06-29 22:28:15 · answer #10 · answered by WyattEarp 7 · 0 0

en·tro·py (ĕn′trə-pē)
n. pl. en·tro·pies
1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.

--------------------------------------------------------------------------------

[German Entropie : Greek en-, in; see en-² + Greek tropē, transformation; see trep- in Indo-European roots.]

--------------------------------------------------------------------------------

en·trop′ic (ĕn-trō′pĭk, -trŏp′ĭk) adj.
en·trop′i·cal·ly adv.

2006-06-30 01:37:15 · answer #11 · answered by George N 1 · 0 0
