
What is entropy? I've read my physics book and looked at videos and I'm still lost. I'm supposed to write 10 pages on this.

2007-08-02 12:48:36 · 10 answers · asked by Anonymous in Science & Mathematics Physics

10 answers

Entropy is the tendency toward disorder. One of the best ways to explain entropy is with analogies of disorder from the world around you. For example, if you were to stop cleaning your house for a month, what would it look like? If you stopped maintaining your car, what would happen to the engine? If you put no effort into relationships, what would become of them? These are just a few everyday examples of what entropy is in lay terms.

Things tend toward chaos unless something interferes and organizes them. However, when something does make order out of chaos, it does so at the expense of something else; i.e., it creates more disorder elsewhere. Imagine walking down the street and passing a pile of bricks. This is a state of disorder. The next day, you walk by again and the bricks are now a wall. This is a state of order. You know the bricks couldn't possibly have built themselves into a wall; someone made the wall. In this case, the disorder created comes from the energy expended by the person who built it. He ate food that his body used to provide energy to his muscles to move the bricks. Not all of the energy released from the food was used by his body, however. Some was lost as heat, and some as less complex waste molecules. This is the nature of entropy: the order of disorder.

2007-08-02 13:46:49 · answer #1 · answered by Anonymous · 1 0

I'm going to assume that this question is about entropy coming from a statistical mechanics/thermodynamics angle. In that case the entropy (S) is given by:

S = k · ln( Ω )

where Ω is the number of microstates available to the system at a particular energy level, and k is Boltzmann's constant.

This is not terribly helpful, right? Well, that's the literal definition. A MICROSTATE is a possible configuration of a closed system at a particular energy level. For example, consider a helium atom with its electrons in their ground states. In this case there's only one microstate: both electrons in the lowest orbital. However, if you add enough energy to excite one of the electrons, either electron could be the excited one, meaning there are TWO possible combinations: the first one excited with the second in the ground state, or the converse. The system now has TWO possible microstates at that energy level.

As you add more energy, there are more and more combinations of the two electrons' excitations that add up to a given total energy, so the count of microstates grows.

At a very basic level, entropy can be described as a rough measure of the DISORDER of a system, though strictly it's a measure of the POSSIBLE disorder of a system at a particular energy. It is not the TIME ARROW either, though statistically, entropy always tends toward a maximum (always increases) as you move forward in time. (The teacup falls off the table and breaks into a dozen pieces, but the pieces never leap up off the floor and reassemble.)
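The microstate counting in this answer can be sketched numerically. Below is a minimal Python sketch of Boltzmann's formula S = k·ln(Ω) applied to the two-electron helium example; the function names are illustrative, not from the original answer:

```python
import math

# Boltzmann's constant in joules per kelvin
K_B = 1.380649e-23

def entropy(omega: int) -> float:
    """Boltzmann entropy S = k_B * ln(omega) for omega microstates."""
    return K_B * math.log(omega)

# Ground-state helium: both electrons in the lowest orbital -> 1 microstate
print(entropy(1))          # 0.0: a single microstate means zero entropy

# One excitation shared between 2 electrons: either electron may be
# the excited one -> C(2, 1) = 2 microstates
omega = math.comb(2, 1)
print(omega)               # 2
print(entropy(omega) > entropy(1))   # True: more microstates, more entropy
```

The point of the sketch is only that entropy grows with the microstate count: one microstate gives exactly zero entropy, and every additional accessible configuration raises it.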

2007-08-02 13:27:02 · answer #2 · answered by Garrett J 3 · 1 0

I have read many of the responses that your question has drawn. They are all more or less correct.
Entropy was explained this way in one of my physics seminars: entropy is a measure of the energy of a system, specifically the energy that has been expended and is no longer available to do useful work. If you look at the whole universe, its end will come when all of its usable energy is spent and entropy reaches a maximum.
Look at a physics text on thermodynamics; it will say more or less the same thing. By the way, take a look at the definition of enthalpy as well, not to put too fine a point on it.

2007-08-02 13:31:59 · answer #3 · answered by kellenraid 6 · 1 0

The simple answer is that it's a measure of uselessness.

Although it's frequently heard in the context of heat, it can be used to measure the lack of usefulness of almost anything. For example, static is a form of entropy in communication. The signal-to-noise ratio measures how much useful information (signal) there is relative to useless background (noise, i.e., static). Higher signal-to-noise ratios imply a better ability to exchange useful information.
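The signal-to-noise idea can be illustrated with a short Python sketch; the function name and the power values are assumptions for the example, not from the original answer:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio expressed in decibels: 10 * log10(S/N)."""
    return 10 * math.log10(signal_power / noise_power)

# A signal 100x stronger than the background noise -> 20 dB
print(snr_db(100.0, 1.0))
```

A larger dB figure means the useful signal stands further above the useless background, which is exactly the "less entropy, more usefulness" intuition in this answer.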

In thermodynamics, energy divides into a useful part and a useless part. The Gibbs relation G = H − TS says the useful (free) energy G is the enthalpy H minus the temperature T times the entropy S. A good example of entropic heat is the heat created by friction as a body slides over something. That energy is gone; we cannot retrieve it and use it elsewhere...it is entropic.
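The friction example can be made concrete with some arithmetic. A minimal Python sketch, assuming a made-up coefficient of friction, mass, and sliding distance, computes the energy Q = μ·m·g·d that sliding friction dissipates as entropic heat:

```python
# Energy dissipated as (entropic) heat by sliding friction:
# Q = mu * m * g * d   (all values below are illustrative assumptions)
mu = 0.4      # coefficient of kinetic friction (assumed)
m = 10.0      # mass of the sliding body in kg (assumed)
g = 9.81      # gravitational acceleration in m/s^2
d = 5.0       # sliding distance in m (assumed)

q = mu * m * g * d   # joules of ordered kinetic energy turned into heat
print(q)             # roughly 196 J, now spread out and unrecoverable
```

Those ~196 joules still exist, per the first law, but they have been degraded from ordered motion into disordered thermal motion, which is why the answer calls them "gone."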

Precise definitions of entropy (e.g., equations) depend on the context (e.g., heat, communication). But in general, our universe is winding down, which is to say its total entropy keeps increasing. Useful energy is continually being converted to useless energy. Some time in the very far future, our universe is destined to run down completely; at that time, the theory goes, its temperature will approach absolute zero (0 kelvin).

2007-08-02 14:00:50 · answer #4 · answered by oldprof 7 · 1 0

entropy = quantity specifying the amount of disorder or randomness in a system bearing energy or information.

It normally appears in equations that ask how much energy must be expended to turn disorder into order.

2007-08-02 13:03:09 · answer #5 · answered by bronte heights 6 · 0 0

There are many complementary definitions. Probably the most common one is that a system tends to become more disordered over time. A more technical definition counts the number of microstates available to a system.

2007-08-02 12:57:52 · answer #6 · answered by eri 7 · 1 0

Entropy is basically a state of disorder.

2007-08-02 13:00:22 · answer #7 · answered by Anonymous · 0 0

Energy can't be destroyed, but it can become unusable. Starlight is a form of energy, but it is so weak that for all intents and purposes it is useless. Eventually, all of the energy in the universe will be spread out so thinly that it will be lost.

2007-08-05 05:36:41 · answer #8 · answered by johnandeileen2000 7 · 0 0

hi

1) a measure of information
2) a measure of disorder
3) the arrow of time

bye

2007-08-02 12:54:17 · answer #9 · answered by railrule 7 · 0 0

I don't know what it is, but try to look it up online or ask a physicist; that is what I would do.

2007-08-02 12:57:08 · answer #10 · answered by Mogan 1 · 0 5
