
Actually no, this is a good question that hits at a tension between the second law of thermodynamics and the big bang model. The aftermath of the big bang and the inflation period are known to be times when the universe was extremely hot and dense. And yet, by the rule that entropy can't decrease over time, it follows that they were the lowest-entropy state of the universe: definitely lower entropy than what we have today. But how can an extremely hot plasma, made up of all the particles that today make up stars and planets and so on, have been a lower-entropy state than the galaxies of today?


Disclaimer: I'm saying a lot of words here, but I don't entirely know what I'm talking about. I'm saying things that I think make sense based on what I do understand, but I don't actually know the details of how people model entropy in relation to models of "the big bang". I'm not simplifying to make things easy to understand; rather, I'm grasping at too-simple, weak analogies in an attempt to understand.

I think this is generally explained as being due to the expansion of space? This is only an extremely loose analogy, because the expansion of space is not really much like a container getting bigger (the universe isn't thought to have started at one particular size and grown), but suppose you have a syringe with the tip closed off, with some hot, highly compressed gas in the front of the barrel, and you pull back on the plunger. The temperature of the gas should decrease, right? This is the same principle by which refrigeration works: the coolant is expanded in the place you want to cool, making it cold enough to absorb heat from the environment, and then moved to the place you want to heat and compressed, making it hot enough to give off that heat.
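To put rough numbers on the syringe picture (my own toy sketch, not anything from the actual cosmology): for a slow, reversible adiabatic expansion of an ideal monatomic gas, the textbook relation T * V^(gamma-1) = const says the temperature falls as the volume grows.

    # Sketch: reversible adiabatic expansion of an ideal monatomic gas.
    # Assumes T * V**(gamma - 1) = constant, with gamma = 5/3 for a
    # monatomic ideal gas. All numbers are made up for illustration.
    gamma = 5 / 3        # heat capacity ratio
    T1, V1 = 600.0, 1.0  # initial temperature (K) and volume (arbitrary)
    V2 = 8.0             # volume after pulling the plunger back

    T2 = T1 * (V1 / V2) ** (gamma - 1)
    print(f"T drops from {T1:.0f} K to {T2:.0f} K")  # 600 K -> 150 K

So in this toy setup, pulling the plunger out to 8x the volume cools the gas to a quarter of its starting temperature.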

But, pulling the plunger back doesn't decrease the entropy of the gas, does it? If you had to put in work from outside, the entropy of the gas could in principle decrease at the cost of increasing entropy elsewhere, but if the plunger is loose enough, the pressure of the gas alone should be able to push it out. In that case I think the temperature and pressure would still decrease without an external source doing the work, so the entropy definitely shouldn't decrease.
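As a sanity check on that claim, under the same ideal-gas assumptions as the sketch above: the standard formula dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1) gives exactly zero entropy change for the slow, reversible pull, and a strictly positive change for a free expansion where the gas does no work and stays at the same temperature. Either way, entropy doesn't go down.

    import math

    # Sketch: entropy change of n moles of a monatomic ideal gas,
    # using dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1). Illustrative numbers.
    R = 8.314      # gas constant, J/(mol K)
    Cv = 1.5 * R   # molar heat capacity at constant volume, monatomic
    n = 1.0        # moles

    def delta_S(T1, T2, V1, V2):
        return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

    # Reversible adiabatic pull: T follows T*V**(gamma-1) = const.
    print(delta_S(600.0, 150.0, 1.0, 8.0))  # ~0.0 J/K (unchanged)

    # Free expansion into vacuum: no work done, T unchanged.
    print(delta_S(600.0, 600.0, 1.0, 8.0))  # ~17.3 J/K (entropy up)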

So, going from hot and dense to less hot and less dense, keeping the same amount of stuff but spreading it out, doesn't have to mean a decrease in entropy, and can instead correspond to more entropy?

After all, if there are more places available to be, that seems like more uncertainty, right? At least, in a finite system.

Like, suppose you look at the relative entropy of the distribution of one particle's position in a 1D box of length L, where the entropy is taken relative to the Lebesgue measure, and compare this to the case of a 1D box of length 2L, still measuring relative to the Lebesgue measure. The maximum-entropy position distribution in the length-2L box, the uniform distribution on [0, 2L], has more relative entropy than the uniform distribution on [0, L], which is the highest-entropy position distribution attainable in the length-L box. (I say "relative entropy" because for a continuous variable we don't have a countable set of possibilities, each with non-zero probability, that we can sum over, so we can't use that definition of entropy; instead we integrate over possibilities using the density of the distribution with respect to some reference measure, and I think the Lebesgue measure, the usual measure on sets of real numbers, is a good common reference point.)
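Concretely (my own worked example): the differential entropy of the uniform density on [0, L] is log L, so doubling the box adds exactly log 2 nats of uncertainty. A quick numerical check of the definition h = -integral of p(x) log p(x) dx:

    import numpy as np

    # Sketch: differential entropy of a uniform distribution on [0, L],
    # taken relative to the Lebesgue measure. For the constant density
    # p(x) = 1/L, the integral of -p*log(p) works out to exactly log(L).
    def uniform_diff_entropy(L, n=1_000_000):
        xs = np.linspace(0.0, L, n)      # grid over the box
        p = np.full_like(xs, 1.0 / L)    # constant density p(x) = 1/L
        dx = L / (n - 1)                 # grid spacing
        return -np.sum(p * np.log(p)) * dx   # Riemann sum of -p log p

    L = 1.0
    print(uniform_diff_entropy(L))       # ~ log(1)  = 0.0
    print(uniform_diff_entropy(2 * L))   # ~ log(2) ~= 0.693
    print(np.log(2 * L) - np.log(L))     # the gap is exactly log 2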

Though, I guess the relevant distribution shouldn't really be just the distribution over position, but the joint distribution over position and momentum.
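Following that thought (again just a toy sketch of my own, not a real physical distribution): if position is uniform on [0, L] and momentum is independently uniform on [-P, P], the joint differential entropy is just the sum of the two, log L + log(2P), so widening either the box or the spread of momenta raises it.

    import math

    # Sketch: joint differential entropy of independent uniforms,
    # position on [0, L] and momentum on [-P, P] (a made-up toy model).
    # For independent variables the differential entropies add:
    #   h(x, p) = h(x) + h(p) = log(L) + log(2*P)
    def phase_space_entropy(L, P):
        return math.log(L) + math.log(2 * P)

    L, P = 1.0, 3.0
    print(phase_space_entropy(L, P))      # log(6) ~= 1.792
    print(phase_space_entropy(2 * L, P))  # doubling the box adds log 2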



