I was told a while back that there is no such thing as entropy of mixing for ideal gases, and it makes sense to me.
In an ideal gas, the components of the gas don’t take up any volume, and they don’t have any specific interactions with each other.
If you start out with two ideal gases, A and B, in a container separated by a partition, and remove the partition, gas A will expand into the whole of the volume previously occupied by A and B. Each molecule of A will have more options available to it than it had before, and entropy will increase. Similarly, gas B will expand into the whole of the volume. See, there is no entropy of mixing. There is only the entropy of expansion. If the volume of A equals the volume of B, this entropy of expansion turns out to be R ln 2 per mole of gas, where R is the ideal gas constant.
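A minimal sketch of the bookkeeping, treating the "mixing" as what it really is, two independent isothermal expansions (the amounts and the one-mole figure are just example values):

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Each gas expands isothermally from V into 2V when the partition is
# removed: dS = n R ln(V_final / V_initial) for each gas separately.
n = 1.0                      # moles of A, and of B (assumed equal)
dS_A = n * R * math.log(2)   # gas A expanding into twice the volume
dS_B = n * R * math.log(2)   # gas B likewise
dS_total = dS_A + dS_B       # the whole "entropy of mixing"

per_mole_of_mixture = dS_total / (2 * n)
print(per_mole_of_mixture / R)  # ln 2 ≈ 0.693, i.e. R ln 2 per mole
```

Note that nothing in the calculation cares what A and B are; that indifference is exactly what sets up the paradox below.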
But let’s say we had equal pressures of gas A on both sides of the partition. We remove the partition, and gases A and A mix. Both of them expand, so we ought to get an entropy increase. But the pressure and volume and temperature of the final A + A system are exactly the same as those of the initial one: there has been no change, and the entropy of expansion turns out to be 0.
This is the Gibbs Paradox.
It bugs me because, let’s say we didn’t stop at one partition, but kept putting in more and more partitions until every particle was in its own little box: surely that would mean we had a system with less entropy?
I keep looking for experimental data on the entropy of mixing of gases, and all I find is people writing theoretical papers explaining the paradox away in different ways.
This one is particularly good, and says, more or less (I think), that the thermodynamic entropy we can use to do work is not really a well-defined function in the same way that energy is. It will depend on the things we have selected to characterise the system, and if we were to find new ways of distinguishing particles that were indistinguishable before, we could exploit these to do work, and see an entropy change on mixing. This seems perfectly valid, but it troubles me because I have been teaching first year in such a way that energy is an abstraction from entropy as a more fundamental concept. Which I will have to rethink without confusing myself totally.
Another paper that I have to read over again to try and start thinking clearly is this one, which is about the confusion between thermodynamic entropy and informational ‘entropy’.
So it seemed to me that when we remove the partition between A and A, we must be increasing the informational ‘entropy’ of the system, but maybe, because there is no way to exploit this to do work, we haven’t done anything to the thermodynamic entropy.
I was talking about putting particles in boxes before, so I thought I should go all quantum and actually put our particles in boxes.
Let’s say we have some energy kT available for partitioning all our particles into translational states. For a particle in a box, these translational states are separated by energies proportional to 1/a², where a is the size of the box. So if the temperature stays the same but we double the size of the box, the number of translational states available doesn’t go up by a factor of 2, but by a factor of 4.
So I was thinking that there seem to be a lot more ways of putting 2n objects in 4m boxes than of putting n objects in 2m boxes, so there ought to be an increase in informational ‘entropy’ when we double the size of the box. Sure enough, when I looked up how to calculate the number of permutations, there are a whole lot more permutations. But informational entropy is related to the log of the number of permutations. The log of the number of permutations at the end seems to be converging to twice the log of the number of permutations I had in a box half the size... but the ratio is converging towards one too slowly for the factorials I can do in Excel to cope with. Is it going to go to one or not? Does ‘entropy’ of A + ‘entropy’ of A = entropy of (A+A)?
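The Excel problem is the factorials overflowing, which you can sidestep by working with log-factorials directly (via `lgamma`). Taking the number of arrangements to be the binomial coefficient C(boxes, objects), and taking n = m for concreteness, the ratio does go to one, just slowly, roughly like (log m)/m:

```python
from math import lgamma

def ln_choose(N, K):
    """Natural log of the binomial coefficient C(N, K), computed with
    lgamma so the huge factorials never have to be formed explicitly."""
    return lgamma(N + 1) - lgamma(K + 1) - lgamma(N - K + 1)

# Ratio of ln(ways to put 2n objects in 4m boxes) to
# 2 * ln(ways to put n objects in 2m boxes), with n = m.
for m in (10, 100, 1000, 10_000, 100_000):
    n = m
    ratio = ln_choose(4 * m, 2 * n) / (2 * ln_choose(2 * m, n))
    print(m, ratio)  # creeps down towards 1 as m grows
```

By Stirling’s approximation, ln C(N, K) ≈ N·H(K/N), which is extensive: doubling both N and K doubles the log. So in the large-system limit, ‘entropy’ of A + ‘entropy’ of A does equal ‘entropy’ of (A+A), with the difference being the sub-extensive ½ ln N correction that makes the convergence look painfully slow at Excel-sized numbers.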
So I have to say this is something I don’t understand.
Next up: Quantum Teleportation.