/sci/ - Science & Math


File: 7 KB, 320x51, Selfinformation.jpg
No.2245121

Okay /sci/ducks, /sci/entists, /sci/borgs, and /sci/lons!
I have a hard question for you today. How does one go about measuring the amount of information a body of matter has? Alternatively, how does one go about finding the amount of information required to completely describe that body of matter? We need an experiment.

To start, we can take information to be identical to the states it describes, so we can express a relationship between its quantum states [math]E_n = nh\omega[/math] and the self-information [math]I(\omega_n)[/math] associated with outcome [math]\omega_n[/math], which occurs with probability [math]P(\omega_n)[/math]. Pic related.
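(Pic: the standard self-information expression, [math]I(\omega_n) = -\log P(\omega_n) = \log\frac{1}{P(\omega_n)}[/math].)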

This definition complies with the above conditions. In the above definition, the base of the logarithm is not specified: if using base 2, the unit of [math]I(\omega_n)[/math] is the bit. When using the logarithm of base e, the unit is the nat. For the log of base 10, the unit is the hartley.

>> No.2245128

We want to pick a representation which gives us the quantity of information in units of quantum bits, qubits.

Also, we're going to need a measure of entropy which approaches the Shannon entropy at the classical end and the von Neumann entropy at the quantum end.
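
For reference, the standard forms of the two (base of the log left open, as in the OP): Shannon entropy [math]H = -\sum_n P(\omega_n)\log P(\omega_n)[/math] for a classical distribution, and von Neumann entropy [math]S(\rho) = -\mathrm{Tr}(\rho\log\rho) = -\sum_i \lambda_i \log\lambda_i[/math] for a density matrix [math]\rho[/math] with eigenvalues [math]\lambda_i[/math]; when [math]\rho[/math] is diagonal in the measurement basis the two coincide.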

>> No.2245136
File: 991 KB, 1440x900, 1287894262256.jpg

Can we take [math]P(\omega_n) = E_n = nh\omega[/math] for the OP equation?

>> No.2245138
File: 65 KB, 227x219, 1291625568555.png

>> No.2245141

What's wrong with von Neumann entropy? If you want the number of qubits in a quantum system, that gives it to you. Von Neumann entropy reduces to the Shannon entropy if you do a unitary transformation to the appropriate basis.

>> No.2245142
File: 331 KB, 1000x750, 1292205391791.jpg

>>2245138
You know what the OP and this picture have in common?

>> No.2245149

>>2245141
I would like to be able to get both the classical information content and quantum information content of the system in a continuous relationship if possible.

>> No.2245176
File: 50 KB, 480x360, 30.png

Does anyone have a self-information expression for qubits?

>> No.2245181

>>2245149
You're going to have to do a much better job of explaining what exactly it is you're trying to accomplish here.

>> No.2245222
File: 24 KB, 580x401, 1288912913465.png

>>2245181
I'm trying to devise an experiment to measure the classical and quantum information content of a system.

I'm thinking I can use a measure of the energy of the system to estimate its information content. I'd like as complete a theory of measurement as I can get, in case I can devise an instrument for directly measuring the quantity of information in a system.

>> No.2245274
File: 120 KB, 407x405, 1284219992586.png

>>2245222
As I understand it, the formative and emergent complexity of matter-energy states is highly relevant to the information content (in the information-entropy sense), not just the amount of energy as such. Or so I assume.

>> No.2245278

>>2245222
The obvious trouble is that the information content of a system isn't an objective property. It depends on our knowledge of the system. This is true even in classical information theory.

>> No.2245317

>>2245305
On a classical or quantum computer? The answers are very different.

>> No.2245320
File: 17 KB, 500x375, 1289457186243.jpg

>>2245278
Knowledge about what? Its state?
What I am interested in is how much information it takes for a system to be simulated.

Let a given physical body be equivalent to its computational encoding. The question is: how much information would we need to simulate a proton exactly?

>> No.2245329

>>2245305
For a proton I don't know the specifics, but consider that for a macroscopic system the complexity will be considerably greater than the sum of its parts.

>> No.2245331
File: 39 KB, 400x296, dipole_1.jpg

>>2245317
I'd like a system which describes the information requirements for classical and quantum cases. If I have to pick one, it would be computed on a quantum computer.

The idea is that if the Turing thesis holds for physics, an ensemble of qubits could compute itself, resulting in physical systems like particles. Self-simulating systems.

>> No.2245339

Information content is described by the holographic principle.
http://en.wikipedia.org/wiki/Holographic_principle
It's inherently a theoretical question, not an experimental one.

>> No.2245357
File: 105 KB, 587x565, space_time_energy.gif

>>2245339
I'm fully familiar with the holographic principle, but I'm not asking about the information content of the universe. I'm asking about the information content of an isolated system, neglecting universal contributions.

How would the surface area relationship help us find the information content of a human or a proton? I take it as a given that a system as complex as a human being has both a classical and quantum information content.

As for whether it is an experimental question or not: I'm proposing to count the information content of a physical thing using empirical methods, on the premise that information is physical. If it's not subject to experimental methods, we'll find that out as a null result, and that will invalidate my hypothesis; otherwise, if information is physical, there exists an experiment to measure the information content.

>> No.2245367

>>2245339
That's more of a conjecture.

>>2245331
>I'd like a system which describes the information requirements for classical and quantum cases. If I have to pick one, it would be computed on a quantum computer.
Well, if you have a system with 2^n possible states, you'd need n qubits or 2^(n+1) floating-point numbers to simulate it. You can of course decrease these figures by compressing the data, but I don't know how you'd run the simulation without decompressing it. I suppose on a quantum computer, if the system had many parts, you could decompress one or a few parts at a time and evolve it/them forward a bit before recompressing. Then your storage required would be the sum of the von Neumann entropies of your components.
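
A rough Python sketch of that storage estimate (just an illustration, with a made-up helper name, assuming double-precision reals):

# Storing the full state vector of an n-qubit system classically:
# 2**n basis states -> 2**n complex amplitudes -> 2**(n+1) real floats.
def classical_storage_bytes(n_qubits, bytes_per_float=8):
    n_floats = 2 ** (n_qubits + 1)
    return n_floats * bytes_per_float

for n in (10, 20, 30, 40):
    print(f"{n} qubits: {2 ** (n + 1)} floats, {classical_storage_bytes(n) / 2 ** 30:.3g} GiB")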

>The idea is that if the Turing Thesis holds for physics, an ensemble of qubits could compute itself resulting in physical systems like particles. Self-simulating systems.
Well, every physical system can be potentially thought of as an elaborate computer that simulates itself, but that doesn't tell us much.

>> No.2245368

By the way, this is an open notebook thread. All work published here is open research under the Creative Commons 3.0 Share-Alike license.

>> No.2245373

>>2245367
Tells me a lot. Tells me whether a system is computationally finite or Turing complete. Tells me whether the pumping lemma can be used on it to recursively discover its domain and range. Tells me what language it accepts and how to reprogram it. Tells me how to free it from "memory". Tells me what it can do to other abstract computers.

>> No.2245377

>>2245373
>Well, every physical system can be potentially thought of as an elaborate computer that simulates itself
is a tautology.

>> No.2245379
File: 623 KB, 1000x1029, 2.jpg

>>2245377
Dude, the whole of deduction is tautological, which extends to include mathematics.

>> No.2247369
File: 577 KB, 1920x1060, 1289261138679.jpg

>>2245367
>if you have a system with 2^n possible states, you'd need n qubits or 2^(n+1) floating-point numbers to simulate it.
Does that account for entangled and superposed states?

For now let's assume I'm only asking for the raw amount of information without compression.

>> No.2247512

wtf are y'all talking about?
inb4 DUMBASS

>> No.2247544
File: 200 KB, 1280x720, dna.002.jpg

>>2247512
Hypothesis: The universe is computable in the sense of the Church-Turing-Deutsch principle.

http://en.wikipedia.org/wiki/Church%E2%80%93Turing%E2%80%93Deutsch_principle

Prediction: Every physical system has a measure of information that is isomorphic to it, i.e. physical systems are equivalent to their encoding as information.

Question: how do you count the number of bits that encode a physical system like a proton?

>> No.2247640

>>2247544
Dumbass observation here. What claim is there to the requirement of quantum mechanics? Classical wave optics can simulate quantum computation, with the catch of needing exponentially more resources in place of the scaling that entanglement brings.

>> No.2247641

>>2247544
So what I got from this is that biological life could possibly be replicated through computer mechanics. We just don't have the atomic level details down yet. Is that an accurate description?

>> No.2247976
File: 43 KB, 521x425, 1289610384064.jpg

>>2247641
If the universe is computable and information is physical, biological life can be, and is, replicated/simulated through computational mechanics.

If this pans out, it would mean that DNA isn't like a computer code; it is a computer encoding.

>>2247640
The Church-Turing thesis and the Church-Turing-Deutsch principle are hypotheses. They have never been proven, though in computer science they are generally assumed to be true.

Given my research, I can see a possibility in which the thesis fails, wherein quantum computation is not equivalent to classical computation. The breakdown point is when we start to examine the behavior of single and double qubits. Zizzi claims that individual qubits can only be completely described by a unique nonstructural paraconsistent logic. Turing based his thesis on the results of Gödel and Cantor, such that his thesis is qualified over the whole of Boolean algebra. Zizzi's logic can be structured into a Boolean algebra, but by itself it is not equivalent.

http://arxiv.org/abs/1003.5976
http://arxiv.org/abs/quant-ph/9708022

Anyway, all that is another argument for another thread.

>> No.2249101
File: 12 KB, 200x256, 1290839979919.jpg

If you aren't prepared to be wrong, you aren't prepared to practice science.

This isn't a homework thread; it's an open research question. No one knows the answer yet. Start making guesses.

>> No.2249647

>>2249101
To begin with, one should start with a simpler system than a proton. Perhaps an electron or (like the holographic proponents) a black hole should be used.

From memory, when one accounts for the information content that a given system holds, one is only counting degrees of freedom (dof). Black holes and fundamental particles are convenient due to their low number of dof.
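
For the black hole case the counting is concrete (standard result, quoted for reference): the Bekenstein-Hawking entropy is [math]S_{BH} = \frac{k_B A c^3}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2}[/math], i.e. roughly one bit per [math]4\ln 2[/math] Planck areas of horizon area A, which is why the holographic crowd likes black holes as the starting point.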

OP, what I don't see is how you can get around the constraint of needing to know your system fully before attempting to recreate it. You can put an information bound on the system, but the bound covers only what you know, and it only approaches the complete set of interactions it takes to represent the system.

Additionally, how does one work in the physical framework that determines the behavior of said system? Where do you put in Maxwell's equations, general relativity, and the other fundamental constraints that we claim govern actions in the universe? Mustn't you first trace these constraints back to one general law?

Finally, this notion of classical versus quantum information content is wrong. Classical comes out of quantum. Classical approximates quantum, with the benefit of reduced complexity.

>> No.2250861

Bring in some fresh eyes

>> No.2252308
File: 662 KB, 1905x792, tron-legacy-flynn.jpg

>>2249647
>OP, what I don't see is how you can get around the constraint of needing to know your system fully before attempting to recreate it. You can put an information bound on the system, but the bound covers only what you know, and it only approaches the complete set of interactions it takes to represent the system.

As for recreating the system, I'm interested in measuring its information content on the premise that it is computable. In computing theory, an abstract computer can be held to be equivalent to its bit-wise encoding. The statement that a proton can be simulated perfectly in a quantum computer is another way of saying it might be a simulation. In this case, I'm not so much looking to model a particle in CAD or something as I am looking to see what the properties of the measured system would be if it were a self-sustaining simulation in "real" space. Of those properties, I'm chiefly interested in three quantities: information content, space complexity, and time complexity.

>Additionally, how does one work in the physical framework that determines the behavior of said system? Where do you put in Maxwell's equations, general relativity, and the other fundamental constraints that we claim govern actions in the universe? Mustn't you first trace these constraints back to one general law?

Finally, as to where the rest of physics fits in: we'll find that out as we go; that's part of the exercise. I suspect that where Maxwell's equations fit in will tie to answering the question of "where does thermodynamics fit in?"

>> No.2252316
File: 133 KB, 1280x1699, 2010_tron_legacy_055.jpg

>>2245136
>[math]P(\omega_n) = E_n = nh\omega[/math]
I work in natural units because if the computable universe hypothesis holds, we can expect that most equations will reduce to a handful of dimensions and many existing units will turn out to be dimensionless. Also, it makes working the math out easier. My first question in the thread is: can we take the energy of the quantum states of the system, given by the above expression, to be a probability or probability magnitude? If we can, the n in the energy expression would be the number of bits in the system. I think.
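
One possible way to connect the two, as a guess rather than anything established: energies only turn into probabilities through some distribution, e.g. a thermal/Gibbs one, [math]P(\omega_n) = e^{-E_n/k_B T}/Z[/math]. Then the self-information is linear in the energy, [math]I(\omega_n) = \frac{E_n}{k_B T \ln 2} + \log_2 Z[/math] bits, rather than [math]E_n = nh\omega[/math] itself being a probability (it can't be one directly, since it isn't bounded by 1 or normalized).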

Computationally, the simplest computing machine is a finite automaton, and I figure if anything's likely to be such a computer, it will be one of the standard model particles. Unfortunately, my grasp of group theory is weak at the moment, so a direct conversion to and analysis of computable particles is difficult for me. The holo-grail for this would be discovering which particle, if any, in the standard model is a naturally occurring qubit or finite quantum automaton. Forgive the pun.

I know finite semi-groups are related to regular languages which in turn are related to finite automata.
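
To make "simplest computing machine" concrete, here's a toy finite automaton written as a transition table (plain Python, names purely illustrative):

# DFA over the alphabet {0, 1} that accepts strings with an even number of 1s.
# States: 'even', 'odd'; start state 'even'; accepting states: {'even'}.
delta = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}

def accepts(word):
    state = 'even'
    for symbol in word:
        state = delta[(state, symbol)]
    return state == 'even'

print(accepts('1001'))  # True: two 1s
print(accepts('1011'))  # False: three 1s

The whole machine is just the action table delta plus a start state and an accept set, which is the kind of object I'd be trying to match a particle's bit-length against.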

>> No.2252333
File: 83 KB, 1399x576, 1293092447576.jpg

>>2252316
If the energy-information content relationship holds, we can relate total/potential energy to total information of the encoding of a body. Once we have the total information content of a particle, we can start looking at action tables which fit that bit-length.

http://en.wikipedia.org/wiki/Turing_machine#Formal_definition
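
For reference, the definition from that link: a Turing machine is a 7-tuple [math]M = (Q, \Gamma, b, \Sigma, \delta, q_0, F)[/math] with a finite state set Q, tape alphabet [math]\Gamma[/math], blank symbol b, input alphabet [math]\Sigma \subseteq \Gamma \setminus \{b\}[/math], transition function [math]\delta : (Q \setminus F) \times \Gamma \to Q \times \Gamma \times \{L, R\}[/math], start state [math]q_0[/math], and accepting states [math]F \subseteq Q[/math]. The action table is just [math]\delta[/math] written out, so its size is what a given bit-length would have to accommodate.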

We find a natural qubit, we find the backdoor to the source code. I suspect at this time that it will be the photon which turns out to be the natural qubit.

>> No.2252347
File: 16 KB, 515x515, photon.gif

>>2249647
Unfortunately, we don't happen to have any black holes handy that we can measure, so my preference would be something attainable and controllable. The photon, electron, proton, and neutrino are my primary concerns.

Another thought. The distinction I make between classical and quantum information content arises from concerns about superdense coding.

>> No.2253383

No thoughts? No ideas? No comments about whether or not the energy of states can be thought of as a probability amplitude?

>> No.2253427

I have a strong mathematical background but I am afraid I am not really comfortable with whatever is going on in this thread.

Can someone tell me what's going on?

>> No.2253514

>>2253427
Mathematical formulation of quantum mechanics. Asking if the energy of a state in a Hilbert space represents a frequency-based complex probability, a probability amplitude.