
/sci/ - Science & Math



File: 371 KB, 500x375, snow.gif
No.5354510

1. Virtual reality creation. A virtual reality usually arises from “nothing”, which matches how
the big bang theory proposes our universe arose (see next section).

2. Maximum processing rate. The maximum speed at which a pixel in a virtual reality game can
cross the screen is limited by the processing capacity of the computer running it. In general, a
virtual world’s maximum event rate is fixed by its allocated processing capacity. In our world,
the obvious fixed maximum is the speed of light. That there is an absolute maximum speed
could reflect a maximum information processing rate (see next section; a toy sketch of this
and the next point follows point 3 below).

3. Digital processing. If a world is virtual, everything in it must be digitized, and so discrete at
the lowest level. Planck’s discovery that light is quantized (as photons) could then generalize
not only to charge, spin and matter, but also to space-time. Discrete space-time avoids the
mathematical infinities of continuous space-time, as loop quantum gravity theory argues [18].
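A toy sketch of points 2 and 3 in Python (the grid, names and numbers are invented for illustration; nothing here comes from the paper): space is a discrete grid, and a fixed per-tick update budget plays the role of a speed of light.

# Toy model: space is digitized (point 3) and a fixed per-tick
# processing budget caps the maximum speed (point 2).
GRID_SIZE = 100          # positions are integers, not reals
MAX_CELLS_PER_TICK = 1   # the budget: the engine never moves
                         # anything further than this per tick

def step(position, velocity):
    # clip the requested velocity to the per-tick budget, the toy
    # equivalent of an absolute speed limit
    v = max(-MAX_CELLS_PER_TICK, min(MAX_CELLS_PER_TICK, velocity))
    return (position + v) % GRID_SIZE

pos = 0
for tick in range(10):
    pos = step(pos, velocity=5)   # asks for 5 cells/tick, gets 1
print(pos)                        # 10, not 50: the budget is the cap

However fast an object "wants" to go, the observed maximum is set by the update loop, not by the object, which is the analogy being drawn.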

>> No.5354514

4. Non-local effects. The processing that creates a virtual world is not limited by the space of
that world, e.g. a CPU drawing a screen is no “further” from any one part of the screen than
any other. All screen points are equidistant with respect to the CPU, so VR processor effects
can ignore screen distance, i.e. be non-local. If our universe is a three-dimensional “screen”,
its processing is “equidistant” to all points in the universe, so the non-local collapse of the
quantum wave function could be such an effect (see the sketch after point 5).

5. Processing load effects. On a distributed network, nodes with a high local workload slow
down, e.g. if a local server faces many demands, a video download may play slower than usual.
Likewise, a high matter concentration may constitute a high processing demand, so a massive
body could slow down the information processing of space-time, causing space to “curve”
and time to slow. Similarly, if faster movement requires more processing, speeds near light
speed could affect space-time, causing time to “dilate” and space to contract. Relativity effects
could then arise from local processing overloads.
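Points 4 and 5 can be sketched on one toy grid (again pure illustration in Python, not a real physics engine): memory cost ignores in-world distance, and heavily loaded cells receive fewer updates per global tick.

# Point 4: the processor is equidistant from every cell, so a
# correlated update of two far-apart cells costs nothing extra.
world = [0] * 1000

def collapse(a, b, value):
    # writing world[a] and world[b] does not depend on |a - b|
    world[a] = value
    world[b] = -value

collapse(0, 999, 1)   # "non-local", instantaneous from inside

# Point 5: cells with more "matter" get fewer updates per global
# tick, so their local clocks run slow.
matter = {0: 1, 500: 100}    # cell 500 carries 100x the load
clock = {0: 0, 500: 0}
for tick in range(1000):
    for cell, load in matter.items():
        if tick % load == 0:   # heavy load -> rare updates
            clock[cell] += 1
print(clock)   # {0: 1000, 500: 10}: "time dilation" from load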

>> No.5354516

gosh I never get tired of this thread being posted twice a day

lol magnets xD

>> No.5354517

6. Information conservation. If a system inputs no new information after it starts, it must also
not lose the information it has, or it will “run down”. Our universe has not run down after an
inconceivable number of microscopic interactions over 14+ billion years, so if it is made of
information it must conserve it. If matter, energy, charge, momentum and spin are all
information, all the conservation laws could reduce to one. Einstein’s transformation of
matter into energy (E = mc²) would then simply be information going from one form to
another. The only conservation law VR theory requires is information conservation (sketched
in code after point 7 below).

7. Algorithmic simplicity. If the world arises from finite information processing, it is necessary
to keep frequent calculations simple. Indeed the core mathematical laws that describe our
world are surprisingly simple: “The enormous usefulness of mathematics in the natural
sciences is something bordering on the mysterious and there is no rational explanation for
it.” [28] In VR theory physical laws are simple because they must actually be calculated.
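Point 6 reads naturally as a claim about reversibility. A minimal sketch, assuming “information” just means bits of state: if every step the system applies is invertible, nothing is ever lost, and a transformation like E = mc² is only a relabeling.

# A closed system that only applies invertible (bijective)
# operations can always be run backwards, so it never "runs down".
state = 0b10110010

def transform(s):
    # XOR with a constant is a bijection and is its own inverse
    return s ^ 0b01101100

after = transform(state)          # one "form" becomes another...
assert transform(after) == state  # ...but the information survives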

>> No.5354520

8. Choice creation. Information arises from a choice between options [29]. A mechanical or
predictable choice is not really a choice in this sense. Einstein never accepted that quantum
events were truly random, i.e. that no prior world events could predict them. That a radioactive
atom decays by pure chance, whenever “it decides”, was to him unacceptable, as it was a
physical event not predicted by another physical event. He argued that one day quantum
random effects would be predicted by as yet unknown “hidden properties”. Yet if the source
of quantum randomness is the VR processor, which is outside the physical world, this
predicts that no hidden variables will ever be found.

9. Complementary uncertainty. In Newtonian mechanics one can know both the position and
momentum of objects, but for quantum objects Heisenberg’s uncertainty principle means one
cannot know both at once. Knowing one property with 100% certainty makes the other
entirely uncertain. This is not measurement “noise”, but a property of reality, e.g. measuring
a particle’s position displaces its momentum information, and vice versa. In a similar way,
virtual reality “screens” are typically only calculated when they are viewed, i.e. when an
interaction occurs [12]. If complementary object properties use the same memory location,
the object can appear as having either position or momentum, but not both at once.
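Point 9 maps neatly onto lazy evaluation plus a shared storage slot, which is what the item describes. A toy Python sketch (the single-slot design illustrates the idea; it is not a claim about real quantum mechanics):

import random

class Particle:
    # position and momentum share ONE slot, so materializing one
    # discards the other
    def __init__(self):
        self._slot = None   # nothing is computed until "viewed"
        self._kind = None   # which property the slot currently holds

    def _measure(self, kind):
        if self._kind != kind:            # lazy: compute on access,
            self._slot = random.random()  # overwriting whatever the
            self._kind = kind             # slot held before
        return self._slot

    def position(self):
        return self._measure("position")

    def momentum(self):
        return self._measure("momentum")

p = Particle()
x1 = p.position()
x2 = p.position()   # same slot, same answer: position is "known"
p.momentum()        # measuring momentum reuses the slot...
x3 = p.position()   # ...so position must be recomputed afresh
print(x1 == x2, x1 == x3)   # True False (almost surely)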

>> No.5354525

10. Digital equivalence. Every digital symbol calculated by the same program is identical to
every other, e.g. every “a” on this page is identical to every other one because all arise from the
same computer code. In computing terms, objects can be “instances” of a general class.
Likewise, every photon in the universe is exactly identical to every other photon, as is every
electron, quark, etc. While the objects we see have individual properties, quantum objects like
photons seem all pressed from identical moulds. VR theory suggests that this is so because
each is created by the same digital calculation.
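Point 10 is the ordinary class/instance distinction from object-oriented programming. A minimal sketch with hypothetical names:

class Photon:
    # one shared definition stamps out every instance: intrinsic
    # properties live on the class, so all photons agree by
    # construction
    SPIN = 1
    CHARGE = 0
    MASS = 0

    def __init__(self, frequency):
        self.frequency = frequency   # per-instance state may differ

a = Photon(5.0e14)
b = Photon(7.5e14)
print(a.SPIN == b.SPIN, a.CHARGE == b.CHARGE)   # True True: pressed
                                                # from the same mould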

>> No.5354549

hm interesting.

>> No.5354555

> If a world is virtual, everything in it must be digitized
Nope.

>> No.5354582

Are emotions digitized?

>> No.5354605

>>5354555

please explain

>> No.5356012

>>5354582
You are not seeing the big picture.

>> No.5356040

>>5354605
Well, what do you mean by “if a world is virtual”?

>> No.5356043

>>5354605
Not the guy. But you can use general formulas for, say, circular light-wave spreading and save a shitton of processing power by simulating a generalized wave instead of a gigabillion photons.

Also, OP is wrong about maximum processing rate: memory is more important. A limited processing rate only means you can't run it in real time or on an accelerated timeframe.
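To make the generalized-wave point concrete, a toy comparison in Python (the inverse-square formula and the numbers are illustrative, not a real renderer):

import math

def intensity_closed_form(power, r):
    # one formula for the whole wavefront: constant work
    return power / (4 * math.pi * r ** 2)

def intensity_per_photon(n, power, r):
    # the same answer by tracking every photon: n times the work
    per_photon = power / n
    total = sum(per_photon for _ in range(n))
    return total / (4 * math.pi * r ** 2)

print(intensity_closed_form(100.0, 2.0))
print(intensity_per_photon(10**6, 100.0, 2.0))  # same value (up to
                                                # rounding), a million
                                                # times the effort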