
# More on entropy

Distinguishing between microstates and macrostates. How entropy S is defined for a system and not for a specific microstate. Created by Sal Khan.

## Want to join the conversation?

• What exactly does it mean that a system can take on a higher number of states? Does it mean that the molecules present can arrange themselves in a higher number of different combinations? So that if the volume of a system gets bigger, for instance, there are more possible locations for the molecules present to occupy, and therefore a higher number of possible combinations of those molecules?
• Exactly. It doesn't matter how the molecules are arranged. Just the number of ways they can be arranged.
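The counting idea in the answer above can be made concrete with a toy model. This is a minimal sketch, assuming a hypothetical lattice picture in which each of N distinguishable molecules can occupy any one of M cells, so the number of spatial arrangements is M^N; doubling the volume (doubling M) then multiplies the count by 2^N:

```python
def microstates(n_cells: int, n_molecules: int) -> int:
    # Toy lattice model (an illustrative assumption, not the full physics):
    # each distinguishable molecule may occupy any cell independently,
    # so the number of spatial arrangements is n_cells ** n_molecules.
    return n_cells ** n_molecules

N = 5                            # molecules
W_small = microstates(10, N)     # 10 cells
W_big = microstates(20, N)       # volume doubled -> 20 cells

print(W_small)                   # 100000
print(W_big)                     # 3200000
print(W_big // W_small)          # 32 = 2**5: doubling V multiplies W by 2^N
```

Even this crude model shows why a bigger volume means more possible combinations: every extra cell multiplies the number of ways the whole collection can be arranged.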
• We said that the CHANGE in S is equal to Qrev/T, so the only thing we could measure was the change, not the actual value. In fact, at a given state in a condition of equilibrium, no heat is being added to the system, so the change in S is 0. Then we discover a way to define S at a given state using the number of states the system can assume. OK. But how come the CHANGE in entropy (when we expand the piston) comes out to the same value as the entropy of a defined state given by Boltzmann?
• They do not come out the same. Reference: http://www.khanacademy.org/video/reconciling-thermodynamic-and-state-definitions-of-entropy?playlist=Chemistry. Jump to @ in that video. Boltzmann's statistical definition of entropy using the number of states can be used to calculate a change in entropy (by taking Sfinal − Sinitial), which gives the same value as using Q/T. In other words, adding heat to a system increases the number of states possible at the micro level.
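The agreement described above can be checked numerically. This is a sketch for one concrete case, a reversible isothermal doubling of an ideal gas's volume, where the statistical route (each particle's accessible states double, so W multiplies by 2^(n·N_A)) and the thermodynamic route (ΔS = Q_rev/T) give the same number:

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro's number, 1/mol
R = k_B * N_A            # gas constant, J/(mol K)

n = 1.0                  # moles of ideal gas
T = 300.0                # temperature, K (isothermal process)

# Statistical route: doubling V doubles each particle's accessible states,
# so W_final / W_initial = 2**(n * N_A) and
# dS = k_B * ln(2**(n * N_A)) = n * N_A * k_B * ln 2
dS_statistical = n * N_A * k_B * math.log(2)

# Thermodynamic route: for a reversible isothermal expansion,
# Q_rev = n * R * T * ln(V2/V1), so dS = Q_rev / T = n * R * ln 2
Q_rev = n * R * T * math.log(2)
dS_thermodynamic = Q_rev / T

print(dS_statistical)    # ~5.76 J/K
print(dS_thermodynamic)  # same value
```

Both routes reduce to n·R·ln 2 ≈ 5.76 J/K per mole, which is exactly the consistency the linked video demonstrates.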
• I'm confused. If there are an infinite amount of points in a 3-dimensional space, then can't a molecule occupy an infinite amount of different states (orientation, position, etc.)? So even if you increase the volume, it's still an infinite amount of states, right? Or am I just misunderstanding the use of the word 'state'?
• The resolution is that for continuous variables, entropy isn't defined by counting individual points (that count is indeed infinite either way) but by the volume of accessible phase space. Doubling the container doubles that volume, even though both the smaller and larger regions contain infinitely many points.
That is how entropy works for continuous systems.
• Can someone explain to me why elements in their standard states do not have 0 entropy? (i.e. S = 0)
Thank you
• It depends on the standard state. Gases always have high entropy, because there is little order, and the same goes, to a lesser degree, for liquids. By the third law, a perfectly ordered solid has zero entropy only at 0 K, where there is essentially no thermal motion to create disorder. Hope this helps.
• One definition of entropy I have read is "a measure of the energy that is no longer available to perform useful work within the current environment", but it seems much different from the definition given here. Can someone help me make the connection between the two definitions?
• Suppose you have a gas that is expanding and moving a piston from left to right, thus doing "useful", i.e. mechanical, work. Since the gas particles are transferring a net rightward momentum to the piston, there is a net rightward velocity to the particles. There are not as many ways to get this state as there are to get a state where the particles are moving completely randomly. The piston eventually stops moving because the excess rightward momentum of the particles has all been transferred to the piston and now the particles are moving completely randomly, and the entropy of the gas is higher. Then no more useful work is possible. The particles are still moving and still have kinetic energy, but there's no way to capture it to do mechanical work.
• If heat is added to an ideal gas (+q), how much of that heat will be converted to mechanical work (−w)? Given that entropy is "a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work", would it be as simple as dividing the heat put into the system (+q) by the temperature of the system (T), so Q/T = S, and deducting that S value from the heat put into the system to find the total amount of energy left to do mechanical work? And if U = Q − W and 10 J of heat is added to the system, is the change in internal energy (U), the amount of energy not lost to work after the heat was added, equal to entropy's negative effect on converting that heat into mechanical work (−S)?
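One point of confusion in the question above is that ΔS = Q/T has units of J/K, so it is not an energy that can be subtracted from q. A minimal sketch, assuming the specific case of a reversible isothermal expansion of an ideal gas (where internal energy depends only on T, so ΔU = 0 and all of the added heat becomes work), shows how Q, W, ΔU, and ΔS relate:

```python
import math

R = 8.314              # gas constant, J/(mol K)
n, T = 1.0, 300.0      # 1 mol of ideal gas held at 300 K
V1, V2 = 1.0, 2.0      # volume doubles (arbitrary illustrative values)

# Reversible isothermal expansion: for an ideal gas, U depends only on T,
# so dU = 0 and every joule of heat absorbed is delivered as work.
W = n * R * T * math.log(V2 / V1)   # work done BY the gas, J
Q = W                               # heat absorbed, J
dU = Q - W                          # 0: no internal energy change
dS = Q / T                          # entropy change, J/K (not an energy!)

print(W)    # ~1729 J
print(dU)   # 0.0
print(dS)   # ~5.76 J/K
```

In this particular process 100% of the heat is converted to work; in general, how much heat can become work depends on the process, and entropy bookkeeping (via the second law) constrains it rather than being a quantity you subtract from q.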
• But then why do we associate entropy with disorder? How does a disordered system take on more states?
• Entropy is the measure of disorder in a system.
In a solid substance, molecules don't move freely, and thus the entropy of that system is small.
In a liquid, the molecules move a bit more freely, and hence the entropy of this system is higher.
In a gaseous system, the entropy is higher still compared with the solid and liquid phases.
• Can someone give me a super quick and easy definition of entropy?
• Entropy is just the measure of chaos within a system. The more disordered a system is, the more entropy it has. For example, a neat deck of playing cards has little entropy. However, if you throw the deck in the air, all the cards will go in random directions. Now, the cards have more entropy. Hope this helps!