
/sci/ - Science & Math



File: 39 KB, 1350x550, latest.gif
No.8389835

Anyone care to give me a definition of entropy that is simpler than Wikipedia's?

>> No.8389852

>>8389835
Entropy is the amount of information needed to pick out a particular state a system could be in.

There is only one way to be a perfect crystal at absolute zero - only one pattern in which its atoms could be arranged at any given time.

But there are many, many different possible patterns if you vaporize it and turn it into a hot, jumbled gas with atoms whizzing all over the place.

There is only one way for a room to be clean and well-organized, but many ways for it to be messy.
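
To put a rough number on that (my own addition, assuming all states are equally likely): if a system can be in [math]\Omega[/math] equally likely states, picking one of them out takes about [math]\log_2 \Omega[/math] bits of information. The perfect crystal has [math]\Omega = 1[/math], so zero bits; the hot gas has an astronomically large [math]\Omega[/math], so a huge number of bits. That missing information is, up to a constant factor, the entropy.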

>> No.8389855

>>8389835
https://simple.wikipedia.org/wiki/Entropy

>> No.8389934

>>8389835
Write the shortest possible solution to "2+2". The answer is always 4, a single fixed number, so the answer itself carries essentially no entropy; the only uncertainty is whether you executed the equation or not, two states, representable by 1 and 0, i.e. at most 1 bit.

Now add a random variable. Let the first operand be a random integer from 1 to 10, so the possible answers range from [1+2 = 3] to [10+2 = 12], or 10 different states.

What you just did is increase the entropy of that function's output, which now has to be represented by 0 if not executed, plus additional bits for the 10 possible values:
>1+2 = "01", 2+2 = "10", 3+2 = "11", 4+2 = "100", 5+2 = "101" and so on.
We say that there is an increase in entropy because the randomized integer allows 10 different values, which have to be accounted for by 10 different bit patterns (about log2(10) ≈ 3.3 bits) in a binary system.

If you increased the random range to 1 to 100, then to transmit that information to someone else you would need a translator on their end that decodes every bit pattern from 1 to 100, i.e. "01" to "1100100". That's a lot of possible information packets you can receive, and therefore high entropy compared to the single deterministic 2+2=4 at the beginning.

That's as simple as I can make it. Don't worry if you don't quite understand it right now; you will sooner or later. A lot of people struggle with it. Keep reading about it and you'll eventually get what it's supposed to mean.
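
If it helps, here is a minimal Python sketch of those numbers (my own addition, assuming the random operand is uniformly distributed):

import math

def uniform_entropy_bits(n_outcomes):
    """Shannon entropy, in bits, of n equally likely outcomes: H = log2(n)."""
    return math.log2(n_outcomes)

print(uniform_entropy_bits(1))    # fixed answer 2+2=4       -> 0.0 bits
print(uniform_entropy_bits(10))   # operand random in 1..10  -> ~3.32 bits
print(uniform_entropy_bits(100))  # operand random in 1..100 -> ~6.64 bits

The deterministic case carries zero bits, and each tenfold widening of the range adds about 3.3 bits.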

>> No.8389951

>>8389835
Is it just random or is it meaningful information?

>> No.8390001

>>8389934
Why is an increase in entropy an increase in potential outcomes? Isn't entropy the effort required to work out those different outcomes, and thus the energy lost in attaining similar information?

Also, why does 'better math' or 'better interpretive tools' have no impact on entropy, since it only exists when the other outcomes have to be solved for?

Lastly, don't difficult problems often contain more valuable answers? So from a thermodynamic standpoint it could have been a waste of effort, but at some higher level it could overcome this?

>> No.8390004

ITT, Things that will never happen.

>> No.8391568

https://www.youtube.com/watch?v=sMb00lz-IfE

watch it

>> No.8391595

So formally, entropy is usually described as a measure of the disorder of a system. A better way to word it is as a measure of the number of ways a system can be rearranged.

So let's introduce the concepts of statistical weight and microstates. The best way to think about it is to use coin tosses as an example.
Let's say you throw a coin 4 times. What combinations are possible (not caring about the specific order)?
HHHH, TTTT, HHTT, HHHT and HTTT. (These are called macrostates.)
How many ways are there to arrange each of these distinct combinations? (Each specific ordering is known as a microstate.)
For the 2-heads macrostate, examples would be HHTT, HTHT, TTHH, etc.
This is where statistical weight is brought in.

(cont if you care)
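
If you want to see those counts explicitly, here is a quick Python sketch (my own, not part of the explanation) that enumerates all 2^4 = 16 microstates and groups them into macrostates by the number of heads:

from itertools import product
from collections import Counter

# All 2^4 = 16 microstates (ordered sequences of H/T) for 4 coin tosses
microstates = [''.join(seq) for seq in product('HT', repeat=4)]

# Group them into macrostates by the number of heads
macrostates = Counter(s.count('H') for s in microstates)
for heads, weight in sorted(macrostates.items()):
    print(f"{heads} heads: {weight} microstates")
# 0 heads: 1, 1 head: 4, 2 heads: 6, 3 heads: 4, 4 heads: 1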

>> No.8391599

>>8391595
>(cont if you care)
I'll allow it

>> No.8391608

>>8391595
If the 4 throws were all distinguishable, there would be 4! (i.e. 4x3x2x1) ways to order them.
To find all the ways (microstates) in which exactly 2 of the coins are heads, you divide this number by H!(N-H)!.
Think of H! and (N-H)! as the number of ways to rearrange the heads among themselves and the tails among themselves, which don't give new microstates (since Tails must be N, the number of throws, minus Heads).

Let's write this as an equation:
[math]\Omega = \frac{N!}{n!(N-n)!} [/math]

Here I have replaced H with n, but this is just the number of heads you want.

(cont if you care)
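
As a quick numerical check (my own sketch, not anon's), the formula reproduces the counts from the coin-toss example:

from math import comb, factorial

N = 4  # number of throws
for n in range(N + 1):  # n = number of heads
    omega = factorial(N) // (factorial(n) * factorial(N - n))
    assert omega == comb(N, n)  # same as the binomial coefficient "N choose n"
    print(f"Omega(N={N}, n={n}) = {omega}")
# Omega = 1, 4, 6, 4, 1: one way to get all tails, six ways to get 2 heads, etc.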

>> No.8391630

>>8391608
Using this equation we can find the number of ways a macrostate (e.g. 2 Heads and 2 Tails) can be arranged.

Now onto Boltzmann entropy.

[math] S = k \ln(\Omega) [/math]

How did this come about?

Firstly, we assume that every microstate is equally probable; this means HTHT is as likely as HHTT, TTHH, etc.

We also assume that the macrostate we actually observe is the one with the highest statistical weight (this is only a good approximation for very large N, i.e. lots of throws).

I'm not going to go into exactly why S is proportional to ln(Ω) since the derivation is fucking vile; the short version is that ln makes entropy additive, since combining two systems multiplies their statistical weights but adds their entropies. Hopefully that helps a bit though.
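
Putting the two posts together (my own sketch, not anon's): plug the statistical weight into Boltzmann's formula and the 2-heads-2-tails macrostate comes out with the largest entropy.

from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant in J/K

N = 4  # number of coin throws
for n in range(N + 1):
    omega = comb(N, n)      # statistical weight of the macrostate with n heads
    S = k_B * log(omega)    # Boltzmann entropy S = k ln(Omega)
    print(f"{n} heads: Omega = {omega}, S = {S:.3e} J/K")
# Omega is largest (6) for 2 heads and 2 tails, so that macrostate has the highest entropy.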

>> No.8391657
File: 1.75 MB, 374x254, 1475258489346.gif

>>8391595

fkn saved this great explanation. Thank you, based anon