
/sci/ - Science & Math

>> No.15256672
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15256556
None of these things can be demonstrated. We have no access to the asserted other universes. The MWI postulates that there is something called a wave function floating around somewhere outside of spacetime, and that everything from measurements in physics experiments to merely asserted (supposed) environmental decoherence events creates new worlds that the observer in the world that made the measurement has no access to. None of this is testable. And superdeterminism is more unfalsifiable silliness.
>Conscious agents do not cause a wave function to "collapse". "observer" in quantum mechanics is a metaphor referring to anything that interacts with, thereby detecting, a quantum particle
There is no 'wave function', goofball NPC. The wave function is a mathematical model, used in conjunction with (and governed by) various versions of the Schrödinger equation, and it requires the Born rule to connect it to spacetime results when making predictions. And if conscious agents didn't 'cause collapse' (what you are really trying to say is that spacetime values of observables become defined, or available for an observer to interface with), then you would not even be able to experience the physical world. The 'physical world' would just be the un-rendered internal calculation of possible future outcomes of measurements.
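To make concrete what that machinery actually does, here is a minimal toy sketch (a hypothetical two-level system with invented numbers, nothing from the pic): the Schrödinger equation evolves the state vector, and the Born rule is the extra rule that turns that vector into probabilities for outcomes anyone can actually observe.

import numpy as np
from scipy.linalg import expm

# Toy Hamiltonian (Pauli-X) in units where hbar = 1; values are purely illustrative
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])
psi0 = np.array([1.0, 0.0], dtype=complex)   # start in the |0> basis state

t = 0.7
U = expm(-1j * H * t)       # unitary time evolution given by the Schrodinger equation
psi_t = U @ psi0            # the evolved state vector ("wave function")

# Born rule: the probability of measurement outcome k is |<k|psi(t)>|^2
probs = np.abs(psi_t) ** 2
print(probs, probs.sum())   # [cos^2(t), sin^2(t)], which sums to 1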
> "observer" in quantum mechanics is a metaphor referring to anything that interacts with, thereby detecting, a quantum particle
You learned about QM from watching youtube vids, I see. Non-conscious detectors are virtual. Like ALL matter, they don't even have values in spacetime themselves when not being observed by a consciousness, see pic. The data stored as the results of detection is itself rendered probabilistically, and only on demand, when the observer/physicist checks the memory output, looks at the screen, or uses whatever other method of verifying the defined values of the rendered effects of the experiment.

>> No.15250800
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15250625
I will also give you some seminal papers on the topic. The most important is this one:

On Testing the Simulation Theory
http://users.cms.caltech.edu/~owhadi/index_htm_files/IJQF2017.pdf

Another is by Fredkin, one of the originators of digital physics. He has a different idea than pic related: instead of a top-down, probabilistic model, where you render only on demand and only to the specs of the observers, you render the whole universe from the Planck scale up. This is probably not the right model, but the paper is excellent anyway.

Digital Mechanics
Edward Fredkin
http://52.7.130.124/wp-content/uploads/2015/07/digital_mechanics_book.pdf

>> No.15206662
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15206421
>I don't buy it, and neither does sabine.
Of course she doesn't. She believes, against all observed evidence, in reductive materialism as a presupposition. That is, she believes that our reality/universe is rendered at full resolution, down to the Planck scale, at all times, and that these smaller-scale goings-on somehow cause events to happen locally in spacetime, from within spacetime. This is not how realities work. We know this because we have developed to the point where we create our own realities, and the causation in those realities is NON-LOCAL to their virtual space, i.e. it comes from processing, and the processor can't be inside the output that results from its processing.

Even IF the universe could process itself, which is illogical, it would still be a vastly non-optimal approach in terms of computational complexity to calculate and render all spacetime values, at all times, in the whole universe, at full resolution down to the micro scale, when no observers (or measurement devices developed by observers) are even making measurements/observing. The smart way would be to render only on demand, and ONLY to the resolution that accords with the specs of the consciousnesses immersed in the reality, or of their measuring devices. So render the EFFECTS of the micro world, not the actual goings-on at that level, see pic.
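To illustrate the computational-complexity point, here is a toy sketch only (invented cell counts and a stand-in hash in place of any "physics", nothing from the paper or the pic): computing every cell of a world up front costs the full grid every tick, while a lazy, memoized lookup only pays for what an observer actually queries.

from functools import lru_cache

N = 10_000_000              # hypothetical number of "cells" in the world

def eager_world(tick):
    # full-resolution rendering: every cell is computed whether or not anyone looks
    return [hash((tick, i)) % 256 for i in range(N)]    # O(N) work per tick

@lru_cache(maxsize=None)
def on_demand(tick, cell):
    # render-on-demand: a cell's value is computed (and cached) only when queried
    return hash((tick, cell)) % 256                     # O(1) work per observation

# An "observer" who only samples a handful of cells pays almost nothing:
print([on_demand(0, c) for c in (3, 42, 99)])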

>> No.15180314
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15180294
>which doesn't have to be a "consciousness" btw
Bullshit. You can present no experiment that has controlled for that. You are just repeating what someone else has said without researching it. No such experiments have been completed yet, but they ARE currently underway, see pic, specifically this part:
>then to achieve low computational complexity, such a system would, as in a video game, render content (reality) only at the moment that information becomes available for observation by a player and not at the moment of detection by a machine (that would be part of the simulation and whose detection would also be part of the internal computation performed by the Virtual Reality server before rendering content to the player).
Here is an overview of the experiment:
https://www.youtube.com/watch?v=72qVppAoCc8&t=0s
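Read literally, the quoted passage separates "detection by a machine" from "observation by a player". The following is a purely hypothetical bookkeeping sketch of that distinction, not the actual experimental protocol: the detector only files a record, and a definite value is fixed at the moment a player reads it.

import random

class SimWorld:
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.pending = {}       # detector records whose values are not yet fixed
        self.resolved = {}      # values rendered because a player looked

    def machine_detects(self, event_id):
        # the detector is itself part of the simulation: it just files a record
        self.pending[event_id] = True

    def player_reads(self, event_id):
        # content is rendered only when it becomes available to a player
        if event_id not in self.resolved:
            self.resolved[event_id] = self.rng.choice(["spin up", "spin down"])
            self.pending.pop(event_id, None)
        return self.resolved[event_id]

world = SimWorld()
world.machine_detects("run-1/photon-17")
print(world.player_reads("run-1/photon-17"))    # value fixed only at read time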

>> No.15157452
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15157415
>"Observer" in practice is "measuring interactor", and life or consciousness is not needed. Just like how God's particle is not literally some divine Holy Particle, "observer" is not actually human with eyes
Point to an experiment controlling for that. You can't. There aren't any. They are only just now being conducted, see pic, specifically this part:
>We investigate this question based on the assumption that if the system performing the simulation is finite (i.e. has limited resources), then to achieve low computational complexity, such a system would, as in a video game, render content (reality) only at the moment that information becomes available for observation by a player and not at the moment of detection by a machine (that would be part of the simulation and whose detection would also be part of the internal computation performed by the Virtual Reality server before rendering content to the player).
Stop getting your opinions from youtube videos, dumb fuck. Think for yourself.

>> No.15133885
File: 414 KB, 1522x1542, Abstract.pdf.png

>>15133520
By the way, there are experiments being conducted to test this hypothesis, see picrel.
>>15133876
Sabine doesn't have shit in terms of experiments. She never will. SD (superdeterminism) can't be tested. In before
>toy model
She doesn't have shit. Watch the Kastrup debate here:
>>15133876

>> No.15045432
File: 414 KB, 1522x1542, 8.pdf.png

>>15030020
It is impossible to 'prove' ontological claims from within the reality, but you can look for evidence and draw conclusions inferentially, inductively, and abductively. See pic.
On Testing the Simulation Theory
http://users.cms.caltech.edu/~owhadi/index_htm_files/IJQF2017.pdf

>> No.15035154
File: 414 KB, 1522x1542, 8.pdf.png

>>15033808
This is irrelevant. It presupposes that there is bottom-up materialist event causation in the universe. In a reality where the physical world is an output of computation, the causality comes from 'outside' the virtual space of the reality. This, by the way, is why Bell-type correlations can be faster than light, i.e. instant: all points are equidistant from the processor, so space is a virtual thing and the processing is the causation, not anything inside physicality.

None of the stuff down at the level that paper is talking about ever even has to be rendered. And stuff down at the Planck scale certainly never has to be rendered, unless at some time in the future we can get down there to make a measurement. All that has to be rendered is stuff at the resolution corresponding to the specs of the consciousnesses logged on to the reality, or down to the resolution of the instruments that those consciousnesses invent to probe to a finer resolution. See pic. You render (calculate, define values, and present to an observer) only on an as-needed basis and in a multi-fidelity manner.
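As a rough sketch of what "multi-fidelity, as-needed rendering" could mean (hypothetical names, resolutions, and a stand-in hash for the actual content; a toy illustration, not a claim about how any real system works): values are computed only for the region and resolution that an observer or instrument requests, so coarse requests never trigger fine-scale work.

from functools import lru_cache

@lru_cache(maxsize=None)
def render(region, resolution_m):
    # Values for `region` are computed only when requested, and only at the
    # requested resolution; coarser requests do strictly less work.
    cells = min(int(1.0 / resolution_m), 10_000)   # capped so the sketch stays cheap
    return tuple(hash((region, resolution_m, i)) % 256 for i in range(cells))

eye_view   = render("lab_bench", resolution_m=1e-3)   # roughly the detail a person resolves
scope_view = render("lab_bench", resolution_m=1e-7)   # finer detail for an instrument
# Planck-scale (~1.6e-35 m) values are never requested here, so they are never computed.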
