
Lay question about entropy

Posted: Thu Apr 20, 2017 7:13 pm UTC
by webgrunt
Greetings,

I don't understand something. According to what I've read, there's a law that states things progress from an ordered state to a disordered state. But according to some shows I've seen, during or shortly after the Big Bang there was just a sort of chaotic energy soup, before atoms formed and began to clump together into stars, which in turn forged heavier elements and eventually led to the complexity of the human brain, which, in spite of all appearances to the contrary in the YouTube comments section, seems to be pretty damned ordered. Like, incredibly ordered.

So how does entropy hold up when we've gone from chaos to a bunch of brains that can work together to discover the secrets of the universe? What am I missing?

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:39 pm UTC
by speising
That's life for you.

Entropy increases for the whole system, but it can always decrease locally, at the cost of even more entropy somewhere else (aka expenditure of energy).

Life could be defined as such an ordering process. In fact, everything life does is directed towards gathering energy and expending it to create order.
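
Here's a rough numeric sketch of that trade-off in a few lines of Python (my own made-up numbers, purely for illustration): a fridge-style process pulls heat out of a cold region, so entropy drops there, but the heat plus the work spent gets dumped into the warmer room, whose entropy rises by more than the fridge's falls.

# Toy fridge: a local entropy decrease paid for by a bigger increase elsewhere.
# All numbers are invented; both regions are treated as fixed-temperature reservoirs.
Q_cold = 100.0   # joules of heat pulled out of the cold region
W      = 30.0    # joules of work spent to do it
T_cold = 275.0   # kelvin, inside the fridge
T_hot  = 300.0   # kelvin, the room

dS_cold = -Q_cold / T_cold        # about -0.364 J/K: the local decrease
dS_hot  = (Q_cold + W) / T_hot    # about +0.433 J/K: the increase elsewhere
print(dS_cold + dS_hot)           # about +0.070 J/K: the total still goes up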

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:51 pm UTC
by Tub
Entropy is not order.

I know that entropy is often explained as being a measure of order, and that works out for a few simple examples.

But is it really more ordered to have all the particles of the universe randomly strewn about on planets and stars? I know that when my socks are randomly dispersed across the room, I consider that less ordered than a homogeneous stack of socks. So shall we consider the homogeneous particle soup ordered or unordered?

tl;dr: You can't use an intuition of order to determine entropy. You need to count microstates, put them into the formula and then you have it. And then you'll figure out that the total entropy has continually increased since the big bang. Local decreases are not a problem, only the total entropy of a system counts.
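
To make "count microstates" concrete, here's a toy Python example (mine, not anything official): N particles, each sitting in the left or right half of a box. A macrostate is "n particles on the left", and its entropy is the log of the number of microstates compatible with it, in units of Boltzmann's constant. The 50/50 "soup" wins by a huge margin.

from math import comb, log

N = 100
for n in (0, 10, 25, 50):
    omega = comb(N, n)    # microstates compatible with "n particles on the left"
    print(n, log(omega))  # S = ln(omega); it peaks at the even split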

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:55 pm UTC
by doogly
We don't need the word life at all though, we just need the universe to be expanding and cooling. The entropy of the whole system is still increasing during this process. It is totally a very exciting chapter in Mukhanov's book. I recommend it if you are not in fact a lay person, but a grad student testing us.
https://www.amazon.com/Physical-Foundat ... DG7W1AF3B9

If you are actually a lay person, Carroll's book is p aight
https://www.amazon.com/Eternity-Here-Qu ... an+carroll

The only thing unsatisfying with this story is that maybe you also don't want the initial state to be very low entropy. You might like to think that a good initial state would be one that is "generic", and since low entropy states are "special", we are doing a shell game with our explanations. But we really do like entropy increasing, so thinking it was initially v v low is not a weird thought. Just, yeah, the cosmological implications do have some prickles.

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 8:06 pm UTC
by Tub
doogly wrote:If you are actually a lay person, Carroll's book is p aight

Can confirm that his book "The Big Picture" is a good read for lay persons. His older book "From Eternity to Here" probably has more details on entropy, but I haven't read it. Can also recommend searching for talks by Sean Carroll on YouTube. His talks (as opposed to his lectures) are pretty layman-friendly. Can also recommend his blog.

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 11:51 pm UTC
by Eebster the Great
doogly wrote:The only thing unsatisfying with this story is that maybe you also don't want the initial state to be very low entropy. You might like to think that a good initial state would be one that is "generic", and since low entropy states are "special", we are doing a shell game with our explanations. But we really do like entropy increasing, so thinking it was initially v v low is not a weird thought. Just, yeah, the cosmological implications do have some prickles.

More specifically, the same statistical argument that shows that entropy is likely to be higher in the future also shows that, if we knew nothing about the past, we would expect it to be higher in the past as well. There ought to be some good reason why it was lower in the past (indeed, why it was extremely low following the Big Bang).

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 1:07 am UTC
by doogly
I think you'd only expect that if the past were also infinite.

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 5:02 am UTC
by Eebster the Great
doogly wrote:I think you'd only expect that if the past were also infinite.

How do you figure?

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 12:13 pm UTC
by doogly
Normally, forward in time is increase in entropy. You'd only say forward and backwards in time are going to look the same if you are in some eternal steady state, and you think it's just as likely that you're on the upswing or the downswing of a fluctuation.

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 1:48 pm UTC
by somitomi
Despite having passed Thermodynamics I at university, my understanding of entropy comes mainly from this video. The gist of it is that things can be disordered and complex at the same time (see also: my room).

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 2:28 pm UTC
by doogly
"Complexity" is just very difficult to define in any physically meaningful way.

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 4:07 am UTC
by Eebster the Great
doogly wrote:Normally, forward in time is increase in entropy.

Well empirically, yes, but there is nothing in statistical mechanics that demonstrates that this ought to be the case in general, classical physics being time-symmetrical and all.

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 10:08 pm UTC
by doogly
Passing to the statistical case breaks the reversibility of the microphysics though

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 10:40 pm UTC
by Eebster the Great
doogly wrote:Passing to the statistical case breaks the reversibility of the microphysics though

What do you mean? Microscopic physics is still reversible, unless you think weak CP violation explains the arrow of time.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 12:52 am UTC
by doogly
Right, but entropy isn't a thing in the microscopic picture, it's only defined in a statistical sense, at which point you break the reversibility.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 12:59 am UTC
by madaco
I think they meant something like (and I'm going to say this very poorly)

that even if the "time progressing" function is a bijection, if some macrostates have more possible microstates than others, then the output of the function will more often be a microstate of a macrostate with many possible microstates than of one with fewer.

I think?

whoops, they explained themself while I was writing this
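
For what it's worth, here's a tiny Python sketch of that argument (a toy of my own, not anyone's actual model): take a reversible "time step", here literally a random bijection on all the microstates of a 12-particle left/right box, start in the single most ordered microstate, and look at where one step sends it. It almost always lands in a macrostate with vastly more microstates, simply because those macrostates contain nearly all of the states.

import random

N = 12
microstates = list(range(2 ** N))  # each bit says whether particle i is on the left
step = microstates[:]
random.shuffle(step)               # a bijection: every microstate maps to exactly one other

start = 0                          # all 12 particles on the right: a one-microstate macrostate
after = step[start]
print(bin(after).count("1"), "of", N, "particles on the left after one step")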

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 2:46 am UTC
by morriswalters
Well, that got technical really quickly. Ginsberg's Theorem is an amusing take on it. In a practical sense it's why, no matter how long you wait, your coffee will never get any warmer unless you add heat. The most common use of the principle, at least the one most people will recognize, is in their AC. The concept is even enshrined in the bureaucracy; it's used by the patent office to weed out perpetual motion machines. Our brain had a chance to evolve because we had a heat source.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 6:25 am UTC
by Eebster the Great
doogly wrote:Right, but entropy isn't a thing in the microscopic picture, it's only defined in a statistical sense, at which point you break the reversibility.

But the statistics of reversible physics can only be irreversible in peculiar circumstances. In the overwhelming majority of circumstances, they will be symmetrical. So that does not answer the question of why our particular circumstance is so extremely peculiar. It is predictable only in reference to another even more peculiar circumstance at another point in time (which we call the past). The fact that entropy is lower in the past and higher in the future cannot be predicted by statistics alone.

To put it another way, if we take a tenseless view of the universe, and we pick some point in time at random, we can equally well predict that the entropy at any other nearby point in time will be equal or greater. There is no law saying that it must be higher moving in one direction and lower moving in the other.
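
If anyone wants to see that time symmetry in a toy model, here's a Python sketch (my own, using the Ehrenfest two-urn model, which is statistically time-reversible once it's in equilibrium): run it for a long time, find the moments where the entropy dips well below typical, and compare the average entropy a fixed number of steps before and after those moments. The two averages come out essentially equal; the dip looks the same in both time directions, so the statistics alone don't single out a "forward".

import random
from math import comb, log

N = 40
steps = 200_000
n = N // 2                         # start near equilibrium: half the particles in the left urn
traj = []
for _ in range(steps):
    # Ehrenfest urn: pick a random particle and move it to the other urn
    if random.random() < n / N:
        n -= 1
    else:
        n += 1
    traj.append(log(comb(N, n)))   # entropy of the current macrostate, in units of k_B

typical = log(comb(N, N // 2))
dips = [t for t in range(50, steps - 50) if traj[t] < typical - 3]   # unusually low-entropy moments
before = sum(traj[t - 20] for t in dips) / len(dips)
after  = sum(traj[t + 20] for t in dips) / len(dips)
print(before, after)               # both are higher than the dip, and nearly equal to each other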

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 11:16 am UTC
by Tub
Eebster the Great wrote:To put it another way, if we take a tenseless view of the universe, and we pick some point in time at random, we can equally well predict that the entropy at any other nearby point in time will be equal or greater. There is no law saying that it must be higher moving in one direction and lower moving in the other.

Careful with "picking at random". If we live in a universe the overwhelming majority of which is in thermal equilibrium, then we predict the opposite: entropy in either direction from a random point should be equal or less.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 5:05 pm UTC
by Eebster the Great
Tub wrote:
Eebster the Great wrote:To put it another way, if we take a tenseless view of the universe, and we pick some point in time at random, we can equally well predict that the entropy at any other nearby point in time will be equal or greater. There is no law saying that it must be higher moving in one direction and lower moving in the other.

Careful with "picking at random". If we live in a universe the overwhelming majority of which is in thermal equilibrium, then we predict the opposite: entropy in either direction from a random point should be equal or less.

In such a universe, what I said is still technically true. Entropy cannot decrease any more often than it increases in such a case, and regardless, most of the time it will remain constant.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 10:59 am UTC
by Zamfir
Eebster the Great wrote:But the statistics of reversible physics can only be irreversible in peculiar circumstances. In the overwhelming majority of circumstances, they will be symmetrical. So that does not answer the question of why our particular circumstance is so extremely peculiar.

Is 'majority of circumstances' really a meaningful concept, here? For all we know, every theoretically possible universe has to start out like this. 'Start' is perhaps not the right word, just that every universe has to have a big-bang state somewhere, and the region of time near that event will have an observable arrow of time.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 12:49 pm UTC
by doogly
Zamfir wrote:Is 'majority of circumstances' really a meaningful concept, here?

Is it time for the measure problem? Is it time? :mrgreen:

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 4:16 pm UTC
by jewish_scientist
Just wondering, does empty space* count as high entropy or low entropy?

*To anyone about to bring up quantum physics: shut up, you know what I mean and this is hard enough as it is without you bringing up a technicality from a different branch of physics.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 4:28 pm UTC
by doogly
The entropy of classical empty space is 0.
A classical vacuum is super boring, and you cannot avoid quantum mechanics for very long here.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 4:36 pm UTC
by Eebster the Great
Zamfir wrote:
Eebster the Great wrote:But the statistics of reversible physics can only be irreversible in peculiar circumstances. In the overwhelming majority of circumstances, they will be symmetrical. So that does not answer the question of why our particular circumstance is so extremely peculiar.

Is 'majority of circumstances' really a meaningful concept, here? For all we know, every theoretically possible universe has to start out like this. 'Start' is perhaps not the right word, just that every universe has to have a big-bang state somewhere, and the region of time near that event will have an observable arrow of time.

That might be true, but it's not something you can derive from statistical mechanics, which is my point. Statistical mechanics does not explain the arrow of time, as is often claimed. The low entropy of the Big Bang does.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 6:21 pm UTC
by WibblyWobbly
doogly wrote:The entropy of classical empty space is 0.
A classical vacuum is super boring, and you cannot avoid quantum mechanics for very long here.

So, quantum mechanics in a vacuum is like Liam Neeson looking for his daughter? :D

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 6:48 pm UTC
by doogly
It is probably not the worst analogy someone has tried to use for quantum mechanics.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 6:54 pm UTC
by WibblyWobbly
doogly wrote:It is probably not the worst analogy someone has tried to use for quantum mechanics.

Oh, I've certainly heard worse. Mostly from mainstream reporting on quantum mechanics. But they generally try to succeed and fail, whereas I was trying to fail and succeeded. We're all winners.

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 6:59 pm UTC
by Soupspoon
WibblyWobbly wrote:
doogly wrote:The entropy of classical empty space is 0.
A classical vacuum is super boring, and you cannot avoid quantum mechanics for very long here.

So, quantum mechanics in a vacuum is like Liam Neeson looking for his daughter? :D

"I don't know who you are. I don't know what you want. At least not at the same time. If you are looking for discrete values, I can tell you I don't have precision. But what I do have are a very particular set of interference patterns, interference patterns I have acquired over a very large light-cone. Interference patterns that make me a probability amplitude for superpositions like you. If you let my wave/particle duality go now, that'll be the end of it. I will not collapse your waveform for you, I will not entangle you. But if you don't, I will observe you, I will resolve your momentum, and I will cohere you."

Re: Lay question about entropy

Posted: Mon Apr 24, 2017 7:02 pm UTC
by WibblyWobbly
Soupspoon wrote:
WibblyWobbly wrote:
doogly wrote:The entropy of classical empty space is 0.
A classical vacuum is super boring, and you cannot avoid quantum mechanics for very long here.

So, quantum mechanics in a vacuum is like Liam Neeson looking for his daughter? :D

"I don't know who you are. I don't know what you want. At least not at the same time. If you are looking for discrete values, I can tell you I don't have precision. But what I do have are a very particular set of interference patterns, interference patterns I have acquired over a very large light-cone. Interference patterns that make me a probability amplitude for superpositions like you. If you let my wave/particle duality go now, that'll be the end of it. I will not collapse your waveform for you, I will not entangle you. But if you don't, I will observe you, I will resolve your momentum, and I will cohere you."

Good luck.