
/sci/ - Science & Math



File: 32 KB, 600x557, Ghost Busters Pacman Crossover.jpg
No.4097643

Is the information theoretic entropy

H = -sum_i p_i log_2(p_i)

derivable from the physical definition, or vice versa?

tl;dr How are the information theoretic and physical entropies related?
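For concreteness, here is the formula above as a minimal Python sketch (the name shannon_entropy is mine, not from the thread):

import math

def shannon_entropy(probs):
    # Shannon entropy in bits: H = -sum_i p_i * log2(p_i).
    # Terms with p = 0 contribute 0 by convention (lim p->0 of p*log p = 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits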

>> No.4097646

Wat?

>> No.4097674

Smaller entropy means more information about the state of the

>> No.4097676

>>4097674
the state of the system. They aren't wholly disconnected.
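To see this claim numerically: a sharply peaked distribution (near-certainty about the state) has much lower Shannon entropy than a uniform one. A small Python illustration, reusing the shannon_entropy sketch above:

import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4                 # nothing known: all 4 states equally likely
peaked  = [0.97, 0.01, 0.01, 0.01]   # state is almost certainly the first one

print(shannon_entropy(uniform))      # 2.0 bits
print(shannon_entropy(peaked))       # ~0.24 bits: small entropy, much information about the state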

>> No.4097768
File: 70 KB, 852x480, cutey_Emma-superbad_plain.jpg

>>4097643
The information theoretic one is the more fundamental.
The fundamental postulate of statistical mechanics says that all microscopic states are equally probable, i.e. p_i = const for all i. If there are Omega possible states, then each p_i is 1/Omega.

The Shannon entropy

S = -sum_i p_i log(p_i)

is then

S = -sum_i p_i log(p_i)
  = -log(p_i) * sum_i p_i          (p_i is constant, so log(p_i) comes out of the sum)
  = log(1/p_i) * sum_i p_i
  = log(Omega) * (sum_i 1)/Omega
  = log(Omega) * Omega/Omega
  = log(Omega)

which is the physical (Boltzmann) entropy S = k_B ln(Omega), up to the constant k_B and the choice of log base.
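A quick numerical check of this derivation (a Python sketch; the names are mine): for a uniform distribution over Omega states the sum does collapse to log(Omega). Base 2 is used here; the base only changes the units.

import math

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

Omega = 16
uniform = [1 / Omega] * Omega        # fundamental postulate: p_i = 1/Omega for all i

print(shannon_entropy(uniform))      # 4.0
print(math.log2(Omega))              # 4.0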


>> No.4097781
File: 29 KB, 300x400, cutey_Emma-yellow_window.jpg

http://en.wikipedia.org/wiki/Statistical_mechanics#Fundamental_postulate