/lit/ - Literature

>> No.20154356
File: 181 KB, 1400x2118, 71JiWNx1A4L.jpg

>>20154343
II. Complexity and Life - Information

Now, between the local and the global, we have a middle ground, Secondness. This is the home of complex systems, far-from-equilibrium systems, and life. Complexity is also a middle ground between chaos and order: too much of either and complex systems cannot form.

In order to understand life, you need to understand information theory. Information theory is underpinned by Shannon's entropy formula. That formula has essentially the same form as the Boltzmann entropy formula above, except that it uses a base-2 logarithm, which measures information in bits.
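To make the parallel concrete, here is a minimal sketch (my own illustration, not from the post), comparing Shannon's H = -Σ p·log2(p) with the Gibbs form of Boltzmann's entropy S = -k_B·Σ p·ln(p); only the constant and the base of the logarithm differ.

import math

def shannon_entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2 p), measured in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs, k_B=1.380649e-23):
    # Gibbs/Boltzmann entropy S = -k_B * sum(p * ln p), in joules per kelvin.
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A fair coin flip carries exactly one bit; the thermodynamic version
# of the same distribution differs only by the constant and the log base.
print(shannon_entropy_bits([0.5, 0.5]))   # 1.0 bit
print(gibbs_entropy([0.5, 0.5]))          # k_B * ln 2, about 9.57e-24 J/K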

The choice of base 2 is partly practical: electronic computers store all information in binary. However, there is also a philosophical reason. With the law of the excluded middle and the principle of bivalence, it must hold that we can only store true or false values about things in the world. Of course, we know from quantum mechanics that modality is a physical reality; objects can hold probabilistic values anywhere between 0 and 1 before wave function collapse. That's not central here, but the qubits of quantum computing hold promise on this front.

What Shannon entropy summarizes is essentially all the possible messages that could be carried by a finite signal through a channel. The maximum entropy of a sentence would thus be the length of the string multiplied by the log of the number of possible characters (all letters upper and lower case, spaces, punctuation, etc.), which counts every possible string of that length. Borges' story "The Library of Babel" is a great illustration.
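As a rough back-of-the-envelope sketch of that counting (my own numbers, assuming an alphabet of about 95 printable characters): a string of length n has 95^n possible values, and its maximum entropy is n · log2(95) bits.

import math

ALPHABET_SIZE = 95   # assumption: roughly the printable ASCII characters
LENGTH = 40          # assumption: a short sentence of 40 characters

possible_messages = ALPHABET_SIZE ** LENGTH           # every string the "library" could hold
max_entropy_bits = LENGTH * math.log2(ALPHABET_SIZE)  # entropy if all strings were equally likely

print(f"{possible_messages:.3e} possible strings")        # about 1.3e79
print(f"{max_entropy_bits:.1f} bits of entropy at most")  # about 262.8 bits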

However, not all letters are equally common in English, and not all words follow each other with equal frequency. Word frequencies in human languages actually follow a power-law distribution (Zipf's law). So we can further define entropy as the mathematical amount of surprise a message holds for us: given the information we have already received through a channel, what is the likelihood of each possible message of X characters?
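Here is a small sketch of that shift from "all strings are possible" to "some are far more likely" (the sample sentence is mine, purely illustrative): letter frequencies measured from actual text give a lower per-symbol entropy than the uniform maximum of log2(26) ≈ 4.7 bits.

import math
from collections import Counter

def entropy_bits(probs):
    # Shannon entropy in bits of a probability distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

text = ("the library of babel contains every possible book "
        "but almost none of them say anything at all")

counts = Counter(c for c in text if c.isalpha())
total = sum(counts.values())
observed = [n / total for n in counts.values()]

print(entropy_bits([1 / 26] * 26))  # uniform letters: ~4.70 bits per letter
print(entropy_bits(observed))       # this sample: lower, since letters are not equally likely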

For more on information, The Great Courses' The Science of Information is really good and currently free with a membership. The Ascent of Information is another good book.

The main point here is that information is physical (Landauer's Principle).
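A minimal numerical sketch of what that claim cashes out to (standard constants; the temperature is my own assumption): Landauer's principle puts a floor of k_B · T · ln(2) on the heat dissipated when one bit is erased.

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumption: roughly room temperature, in kelvin

landauer_limit = k_B * T * math.log(2)    # minimum energy cost of erasing one bit
print(f"{landauer_limit:.2e} J per bit erased")   # about 2.87e-21 J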

Pic related gets at the tie-in to the emergence of life.
