
Lay question about entropy

Posted: Thu Apr 20, 2017 7:13 pm UTC
by webgrunt
Greetings,

I don't understand something. According to what I've read, there's a law that states things progress from an ordered state to a disordered state. But according to some shows I've seen, early after or during the Big Bang, there was just a sort of chaotic energy soup before atoms formed and began to clump together forming stars which in turn formed more complex elements and eventually led to the complexity of the human brain, which in spite of all appearances to the contrary in the YouTube comments section, seems to be pretty damned ordered. Like, incredibly ordered.

So how does entropy hold up when we've gone from chaos to a bunch of brains that can work together to discover the secrets of the universe? What am I missing?

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:39 pm UTC
by speising
That's life for you.

Entropy increases for the whole system, but it can always decrease locally, at the cost of even more entropy somewhere else (aka expenditure of energy).

Life could be defined as such an ordering process. In fact, everything life does is directed towards gathering energy and expending it to create order.

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:51 pm UTC
by Tub
Entropy is not order.

I know that entropy is often explained as being a measure of order, and that works out for a few simple examples.

But is it really more ordered to have all the particles of the universe randomly strewn about on planets and stars? I know that when my socks are randomly dispersed across the room, I consider that less ordered than a homogeneous stack of socks. So shall we consider the homogeneous particle soup ordered or unordered?

tl;dr: You can't use an intuition of order to determine entropy. You need to count microstates, plug them into Boltzmann's formula (S = k ln W, where W is the number of microstates), and then you have it. And then you'll figure out that the total entropy has continually increased since the big bang. Local decreases are not a problem; only the total entropy of a system counts.
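To make "count microstates" concrete, here's a toy illustration (my own, in Python, not anything from a textbook): take the macrostate "k heads out of n coins". It has W = C(n, k) microstates, so its Boltzmann entropy is S = ln W (taking k_B = 1).

```python
from math import comb, log

def entropy(n, k):
    """Boltzmann entropy S = ln W (with k_B = 1) for the macrostate
    'k heads out of n coins', which has W = C(n, k) microstates."""
    return log(comb(n, k))

n = 100
# The half-heads macrostate has vastly more microstates than all-tails:
print(entropy(n, 50))  # ~66.8
print(entropy(n, 0))   # 0.0 -- exactly one microstate, zero entropy
```

Note that neither macrostate looks intuitively "ordered" or "disordered" in any useful way; the entropy comes purely from counting.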

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 7:55 pm UTC
by doogly
We don't need the word life at all though, we just need the universe to be expanding and cooling. The entropy of the whole system is still increasing during this process. It is totally a very exciting chapter in Mukhanov's book. I recommend it if you are not in fact a lay person, but a grad student testing us.
https://www.amazon.com/Physical-Foundat ... DG7W1AF3B9

If you are actually a lay person, Carroll's book is p aight
https://www.amazon.com/Eternity-Here-Qu ... an+carroll

The only thing unsatisfying with this story is that maybe you also don't want the initial state to be very low entropy. You might like to think that a good initial state would be one that is "generic", and since low entropy states are "special", we are doing a shell game with our explanations. But we really do like entropy increasing, so thinking it was initially v v low is not a weird thought. Just, yeah, the cosmological implications do have some prickles.

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 8:06 pm UTC
by Tub
doogly wrote:If you are actually a lay person, Carroll's book is p aight

Can confirm that his book "The Big Picture" is a good read for lay persons. His older book "From Eternity to Here" probably has more details on entropy, but I haven't read it. Can also recommend searching for talks by Sean Carroll on YouTube. His talks (i.e. not his lectures) are pretty layman-friendly. Can also recommend his blog.

Re: Lay question about entropy

Posted: Thu Apr 20, 2017 11:51 pm UTC
by Eebster the Great
doogly wrote:The only thing unsatisfying with this story is that maybe you also don't want the initial state to be very low entropy. You might like to think that a good initial state would be one that is "generic", and since low entropy states are "special", we are doing a shell game with our explanations. But we really do like entropy increasing, so thinking it was initially v v low is not a weird thought. Just, yeah, the cosmological implications do have some prickles.

More specifically, the same statistical argument that shows that entropy is likely to be higher in the future also shows that, if we knew nothing about the past, we would expect it to be higher in the past as well. There ought to be some good reason why it was lower in the past (indeed, why it was extremely low following the Big Bang).

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 1:07 am UTC
by doogly
I think you'd only expect that if the past were also infinite.

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 5:02 am UTC
by Eebster the Great
doogly wrote:I think you'd only expect that if the past were also infinite.

How do you figure?

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 12:13 pm UTC
by doogly
Normally, forward in time is increase in entropy. You'd only say forward and backwards in time are going to look the same if you are in some eternal steady state, and you think it's just as likely that you're on the upswing or the downswing of a fluctuation.

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 1:48 pm UTC
by somitomi
Despite having passed Thermodynamics I at university, my understanding of entropy comes mainly from this video. The gist of it is that things can be disordered and complex at the same time (see also: my room).

Re: Lay question about entropy

Posted: Fri Apr 21, 2017 2:28 pm UTC
by doogly
"Complexity" is just very difficult to define in any physically meaningful way.

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 4:07 am UTC
by Eebster the Great
doogly wrote:Normally, forward in time is increase in entropy.

Well empirically, yes, but there is nothing in statistical mechanics that demonstrates that this ought to be the case in general, classical physics being time-symmetrical and all.

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 10:08 pm UTC
by doogly
Passing to the statistical case breaks the reversibility of the microphysics though

Re: Lay question about entropy

Posted: Sat Apr 22, 2017 10:40 pm UTC
by Eebster the Great
doogly wrote:Passing to the statistical case breaks the reversability of the microphysics though

What do you mean? Microscopic physics is still reversible, unless you think weak CP violation explains the arrow of time.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 12:52 am UTC
by doogly
Right, but entropy isn't a thing in the microscopic picture, it's only defined in a statistical sense, at which point you break the reversibility.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 12:59 am UTC
by madaco
I think they meant something like (and I'm going to say this very poorly)

that even if the "time progressing" function is a bijection, if some macrostates have more possible microstates than others, then the output of the function will more often be a microstate of a macrostate with many possible microstates than of one with few.

I think?
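Here's a quick sketch of what I mean (a toy model of my own invention): make "time evolution" a random bijection on a tiny state space, and count where states land. Because mid-sized macrostates contain most of the microstates, the successor of a microstate almost always lies in one, even though the dynamics is perfectly reversible.

```python
import random
from math import comb

# Toy system: microstates are 12-bit strings; the macrostate is the
# number of 1-bits. "Time evolution" is a fixed random bijection
# (a permutation of the whole state space), hence perfectly reversible.
N = 12
random.seed(1)
step = list(range(2 ** N))
random.shuffle(step)  # step[s] = successor microstate of s

def macro(s):
    return bin(s).count("1")  # macrostate label = number of 1-bits

# Count microstates whose successor lands in a mid-sized macrostate
# (5, 6 or 7 one-bits). Since step is a bijection, this is exactly
# the number of microstates those macrostates contain:
hits = sum(1 for s in range(2 ** N) if 5 <= macro(step[s]) <= 7)
big = comb(12, 5) + comb(12, 6) + comb(12, 7)  # 2508 of 4096 states
print(hits, big)  # 2508 2508
```

So about 61% of all transitions end in just those three macrostates, while the all-zeros macrostate catches only 1 in 4096. No irreversibility needed, just counting.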

whoops, they explained themself while I was writing this

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 2:46 am UTC
by morriswalters
Well, that got technical really quickly. Ginsberg's Theorem is an amusing take on it. In a practical sense it's why, no matter how long you wait, your coffee will never get any warmer unless you add heat. The most common use of the principle, at least the one most people will recognize, is in their AC. The concept is even enshrined in the bureaucracy: it's used by the patent office to weed out perpetual motion machines. And our brain had a chance to evolve because we had a heat source.

Re: Lay question about entropy

Posted: Sun Apr 23, 2017 6:25 am UTC
by Eebster the Great
doogly wrote:Right, but entropy isn't a thing in the microscopic picture, it's only defined in a statistical sense, at which point you break the reversibility.

But the statistics of reversible physics can only be irreversible in peculiar circumstances. In the overwhelming majority of circumstances, they will be symmetrical. So that does not answer the question of why our particular circumstance is so extremely peculiar. It is predictable only in reference to another even more peculiar circumstance at another point in time (which we call the past). The fact that entropy is lower in the past and higher in the future cannot be predicted by statistics alone.

To put it another way, if we take a tenseless view of the universe, and we pick some point in time at random, we can equally well predict that the entropy at any other nearby point in time will be equal or greater. There is no law saying that it must be higher moving in one direction and lower moving in the other.