E.F. Redish

Now we know that entropy has to do with the number of combinations. However, as we will see in class, the entropy (S) of a system is defined as S = k_B \ln W, where W is the number of possible arrangements of the system. But why? Why not just say that entropy is the number of arrangements? Let’s think through why it has to be defined this way.

We want to define entropy as an extensive property, i.e. if I have two systems A and B, the total entropy should be the entropy of A plus the entropy of B. This is like mass (2 kg + 2 kg = 4 kg), and not an intensive property like temperature (if you combine two systems that are each at 300 K, you have a system at 300 K, not at 600 K!).
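Stated as an equation (writing S_A and S_B for the entropies of the two systems, just to fix notation), the requirement is

S_{A+B} = S_A + S_B.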

What happens to the number of possible arrangements when you combine two systems? If system A can be in 3 different arrangements and system B can be in 5 different arrangements, then each arrangement of A can occur together with each arrangement of B, giving 3 \times 5 = 15 possible arrangements of the combined system. They multiply! This ’80s music video http://www.youtube.com/watch?v=w0i_ZFlGTVY explains why.
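As a quick sanity check, here is a minimal Python sketch (the arrangement labels a1, …, b5 are made up purely for illustration) that enumerates every pairing:

import itertools

# Hypothetical labels for the arrangements of each system
arrangements_A = ["a1", "a2", "a3"]              # 3 arrangements
arrangements_B = ["b1", "b2", "b3", "b4", "b5"]  # 5 arrangements

# Each arrangement of A can occur together with each arrangement of B
combined = list(itertools.product(arrangements_A, arrangements_B))
print(len(combined))  # prints 15: the counts multiply, 3 x 5 = 15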

So we can’t just define entropy as the number of possible arrangements, because we need the entropy to add, not multiply, when we combine two systems.

How do you turn multiplication into addition? Just take the logarithm: multiplying the arrangement counts, 3 \times 5 = 15, corresponds to adding their logarithms, \ln 3 + \ln 5 = \ln 15, since in general \ln(ab) = \ln a + \ln b. So that’s why entropy is defined as a constant times \ln W. The W (the number of arrangements) is a dimensionless number, so \ln W is too.
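Putting the pieces together, with W_A and W_B the arrangement counts of systems A and B, the definition delivers exactly the additivity we demanded:

S_{A+B} = k_B \ln(W_A W_B) = k_B \ln W_A + k_B \ln W_B = S_A + S_B.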

The constant out in front could be any positive constant, but we use Boltzmann’s constant, k_B = 1.38 \times 10^{-23} J/K. When we get to Gibbs free energy, we’ll see that this constant has the right units, since we need entropy to be in units of energy/temperature.
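To get a feel for the size of this constant, here is a tiny Python sketch that evaluates S = k_B \ln W, reusing the illustrative W = 15 from the example above:

import math

k_B = 1.38e-23  # Boltzmann's constant in J/K
W = 15          # toy number of arrangements from the example above

S = k_B * math.log(W)  # natural log, matching S = k_B ln W
print(S)               # roughly 3.7e-23 J/K (tiny, because k_B is tiny)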
