
/sci/ - Science & Math

File: 167 KB, 620x615, qprocessor.jpg
>> No.9438204

Hello /sci/. I created a thread in an attempt to get people to join my quantum computing group last month. In that thread, I had a couple of interesting and productive conversations with some people here. I want to talk about quantum computing with you again.

>> No.9438592

Quantum computing will never happen. It is predicated on a non-reality: that a bit can be a 0 and 1 at the same time.

That is outrageously false.

>> No.9438617

I did some work on the topic and I don't really understand why it's getting so much attention recently. Even with superconducting qubits, aren't there still a plethora of problems with decoherence times? Is it possible to scale up quickly from the dozen or so qubits available nowadays?

>> No.9438620

You'll never know.

>> No.9438624

Okay. Go ahead and provide any empirical data that there exists a bit that is both 0 and 1.

>> No.9438631

The nature of quantum mechanics is such that the properties of a system are in superposition before the probability wavefunction collapses onto a single result.

>> No.9438640

I said empirical evidence of a functioning qubit. I know what the theory says; there's never been any evidence for it, though.

>> No.9438649

do double slit, acts as wave
start observing it
still doing double slit, acts as particle

>> No.9438653

>what is the double/single slit experiment

>> No.9438657

Quantum state tomography and implementations of quantum algorithms on existing qubits say otherwise.

>> No.9438659

Nothing about the double slit supports the existence of something being able to do 2 things at once.

The theory of the double slit is that there is a collapse of one form into another, not that 2 forms exist at once.

>> No.9438662

That's your response?

>> No.9438663

>he thinks double slit is not just a thought experiment

>> No.9438664

Qubits don't exist. Link me an article or a video of a bit that is a 1 and a 0 at the same time.

>> No.9438669

My response was a brief description of the double slit experiment making the point that there is nothing "2 things at once" about it

>> No.9438670
File: 5 KB, 223x223, mohippo smallsquare.jpg

qubits are just microscopic wind vanes. they can be in any fucking state, but to measure their state you literally have to blow on them, and then they take either a 1 or 0 state.

>> No.9438677

That's my point. There is nothing "quantum" about quantum computing. It's just fast classical computing.

>> No.9438682

Sounds like someone has never read or tried proving Shor's algorithm.

>> No.9438686

I meant double-slit with "observer" experiment

>> No.9438695

I don't need to prove it because it doesn't utilize a bit as both a 1 and 0. It uses the statistical approximations of atomic particles to rapidly produce a 1 OR a 0.

Like I said, all they did was move classical computing to a small, efficient level and rename it "quantum"

>> No.9438742

It uses two-qubit operations to establish a superposition and performs meaningful operations on their phases to produce the final result. Yes, we can only measure a classical result at the end, but the algorithm itself is quantum.
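To make that concrete, here's a toy numpy statevector sketch (my own illustration, not anyone's real device code) of a two-qubit circuit: a Hadamard followed by a CNOT, which produces the entangled Bell state (|00> + |11>)/sqrt(2).

```python
import numpy as np

# Hypothetical sketch: build the Bell state (|00> + |11>)/sqrt(2)
# with a Hadamard on the first qubit followed by a CNOT.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle the pair

probs = np.abs(state) ** 2
print(probs)  # ~[0.5, 0, 0, 0.5]: only 00 and 11 are ever measured
```

Measuring this state always yields 00 or 11, each with probability 1/2; that correlation is set up by the entangling gate, and the phase structure is what the manipulation acts on before the final readout.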

>> No.9438802

What does this actually do? What does the fancy quantum stuff do that traditional computers do not, and why is it advantageous?

All I have ever read says quantum is really great because it is quantum, and you know what that means: quantum.

What are the real world applications?

>> No.9438877


>I don't really understand why it's getting so much attention recently

People are beginning to pay attention because the coherence times for superconducting qubits have reached a point where it is possible to perform quantum error correction.

> aren't there still a plethora of problems with decoherence times?

Yes, but fortunately the coherence times for superconducting qubits have been improving more or less exponentially for the past ~18 years.

> Is it possible to scale up quickly from the dozen or so qubits available nowadays?

It's possible to scale up, but it's not easy. The big problem of quantum computing is figuring out how to increase the quantity of the qubits while making sure that they are of high quality. There are currently no signs showing that this cannot happen.

>> No.9438890

People are beginning to pay attention to it outside of obscure science papers because of cryptocurrency and other world events such as spying and hacking.

to op: there are 2 paths u can take to figure this out.

1) abstract algebra
2) quantum physics

>> No.9438896


> What does this actually do?

It can theoretically perform some computations more efficiently than classical computers. I say theoretically because the quantum computers that exist now are not powerful enough to beat classical computers. Hopefully not for long.

>What does the fancy quantum stuff do that traditional computers do not and why is it advantageous

The problems that quantum computers solve are in the complexity class BQP: problems that can be solved by a quantum computer in polynomial time with bounded error probability. Most people believe (but it has not been proven) that P != BQP.

>What are the real world applications?

Simulating quantum mechanics and approximating solutions to some optimization problems more efficiently than classical computers.

>> No.9438898

That's wrong. It says it uses a particle to store information and relies on the statistical output of that particle. Read the study on the Shor's algorithm wiki page, 4th citation.

>> No.9438915

>They are good for quantuming and doing work with quantums and might be faster in theory.
So literally nothing anyone in the real world should care about, thanks.

>> No.9438917

>Go ahead and provide any empirical data that there exists a bit that is both 0 and 1.

There is no such thing as a bit that is both 0 and 1. I think what you are referring to is a qubit, which is not an object that is both 0 and 1, but a two-level quantum system. It is a linear combination of |0> and |1>, which is not the same thing as a bit that is both 0 and 1.
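To illustrate the point (a hypothetical sketch of my own, with amplitudes I picked arbitrarily): measuring a|0> + b|1> always returns a single classical bit; the amplitudes only fix the statistics over many shots.

```python
import numpy as np

# Illustrative only: a qubit with amplitudes a and b is never "both 0
# and 1" when measured -- each shot yields a plain 0 or 1, with
# probabilities |a|^2 and |b|^2 respectively.
a, b = np.sqrt(0.75), np.sqrt(0.25)     # arbitrary choice, |a|^2 + |b|^2 = 1
rng = np.random.default_rng(0)

p0 = abs(a) ** 2
samples = rng.choice([0, 1], size=100_000, p=[p0, 1 - p0])
print(samples.mean())  # close to 0.25: every single outcome is 0 or 1
```

No individual measurement ever shows anything other than 0 or 1; the superposition shows up only in the statistics, which is exactly the distinction the post is drawing.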

>> No.9438924
File: 241 KB, 362x480, maga_pepe_large.png

The following are complete science fiction
>quantum computing
>speed of light
>time dilation
>attosecond cameras
>atomic clocks
>atomic bombs
>space travel
>genderfluid identities
>negative mass
>gene sequencing
>artificial gene modification
>higgs boson
>artificial intelligence
If you believe anything listed above to be fact rather than fiction, you are a liberal democrat and still mad Hillary lost, biding your time until the next republican takeover. You are also very, very, very, very dumb.

>> No.9438926


The real world should care because Google is going to perform a quantum supremacy experiment on their 49 qubit device in the next year or so. It is going to prove that a quantum computer can perform a task that cannot be executed on the world's most powerful supercomputers.

49-50 qubits is about as advanced as devices get right now. Once we get better at building these machines, there is no reason we cannot build computers that contain thousands to millions of qubits.

>> No.9438934

>They have a monopoly on something nobody wants or cares about.
Great, enjoy your 49 qubit device, I'm going to be over here not caring in the slightest.

>> No.9438937

>t. no idea how computers work

>> No.9438940


Can you elaborate?

>> No.9438980


this is what matters, actually

people in the 1950s were doing the same experiments on CLASSICAL computers.. like they knew the math worked, just not how to do it where u press buttons..

>> No.9438992


>people in the 1950s were doing the same experiments on CLASSICAL computers

What do you mean? The same experiment of searching an unordered database, or doing it using Grover's algorithm? Cause Grover's algorithm was discovered in the 90s.

>> No.9439011

It seems to me like you're content without knowing how we go from encoding information into a particle (or whatever qubit representation we choose) to obtaining meaningful measurements, which would explain why you don't understand what exactly makes a quantum computer different from a classical computer. A classical computer cannot perform the phase gates and entangling operations necessary to run these algorithms. Therefore it's inaccurate to call them classical algorithms.

>> No.9439022

Unsurprisingly, it's possible to simulate quantum mechanics more efficiently using quantum computers. So anyone working in chemistry, condensed matter physics, or microbiology (like making drugs) is going to find a lot of use for it when we hit larger qubit numbers.

At even higher qubit counts you can quickly crack a lot of security protocols, so you could easily rob a bank or steal bitcoins. There's also a speedup in searching databases.

>> No.9439035

>At even higher qubit counts you can quickly crack a lot of security protocols, so you can easily rob the bank and steal bitcoins. There's also speedup in searching databases.

Is this true or is this more quantum fanfiction?

>> No.9439043

These are actually the first and most famous algorithms people discovered, and they produced a lot of interest in the topic. Shor's algorithm allows you to easily find the factors of large semiprime numbers, which appears to be difficult classically; that is why banks and lots of security protocols have tied their security to this problem in the form of RSA and such. Grover's algorithm is the search algorithm.

You can google them fairly easily.

That said there's no need to worry about them yet because they require fairly high qubit counts.
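To get a feel for Grover's algorithm, here is a classical toy simulation of one Grover iteration over N = 4 items (my own sketch, not a real quantum run; the marked index is an arbitrary choice for illustration):

```python
import numpy as np

# Toy classical simulation of Grover search over N = 4 items.
# For N = 4, a single oracle + diffusion step suffices.
N = 4
marked = 2                            # the index we are "searching" for

state = np.full(N, 1 / np.sqrt(N))    # start in the uniform superposition

# Oracle: flip the sign of the marked item's amplitude.
state[marked] *= -1

# Diffusion: reflect every amplitude about the mean ("inversion about
# the average"), which amplifies the marked item's amplitude.
state = 2 * state.mean() - state

probs = state ** 2
print(probs)  # the marked index now carries essentially all the probability
```

In general you need on the order of sqrt(N) oracle-plus-diffusion rounds, which is where the quadratic speedup over classical unordered search comes from; N = 4 is the special case where one round lands exactly on the answer.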

>> No.9439048

Here is a Discord link to our group if you want to discuss quantum computing in depth.


>> No.9439071

Lmao lemme see them credentials nig u go to harvard???

>> No.9439086

>This is more quantum fanfiction?
Thanks for clearing it up.

>> No.9439102

D-wave essentially did that part for me.

>> No.9439117

D-Wave is not a quantum computer. It is a quantum annealer. When we talk about quantum computers, we are talking about devices that can perform universal quantum computation.

>> No.9439126

If you think physics is fan fiction, then yes, using a quantum computer to speed up an unordered database search and to compute the prime factors of large numbers more efficiently than classical computers is fan fiction too.

>> No.9439154
File: 388 KB, 1208x794, Barber.jpg

>If you think physics is fan fiction
No, I think that there are a lot of things that are theoretically possible but hard to achieve and people talk about them like they are a certainty.

Moon habitats.
Flying cars.
Personal robots.

>> No.9439178

> hard to achieve

That is definitely true of quantum computing. But in my opinion, I don't think it is harder than, or as hard as, moon habitats. Theories about quantum computers were created in the 1980s, followed by a decade or two when experimentalists didn't know how to build them. The first 2 qubit quantum gate was demonstrated in 1995. The first demonstration of a 2 qubit quantum algorithm was in 1998. The first quantum algorithm on a 2 qubit superconducting quantum processor was implemented in 2009. Now IBM has a 50 qubit superconducting prototype. It's not exponential progress, but it's noticeable.

>> No.9439209
File: 26 KB, 632x756, 1515979470095.png


>> No.9439912

A qubit is a coin and a bit is a face on the coin. To do anything in computers requires a 0 or 1 binary input. If we measure the 0, it is not 1. If we measure the 1, it is not 0. Now if we had two concurrent measurements for each face, we could measure heads and tails at the same time. This doesn't help anything. This is not quantum. This is not even helpful. Getting simultaneous results of 00100100 and 11011011 is not helpful, cause one of the bit strings is completely invalid. The most intelligible possible thing would be designing the second measurement to interpret 1's as 0's and 0's as 1's so that both simultaneous measurements read 00100100, and this too is still not helpful as we're simply getting 2 identical results where one of them is completely unnecessary for a single input.
A qubit is not binary, and is emulated by 2-bit computing.
0, 1, 2, 3, for 4 total states, which could have been mechanically reproduced at any point in time if it were a valid method, all without needing to measure atoms at absolute zero. The most essential thing that could possibly come out of anything quantum is quantum entanglement, but quantum entanglement makes the jews mad cause it means they can't keep lying about the speed of light.

>> No.9439924

That's literally what it means, that the thing exists as both a WAVE and a PARTICLE at the same time and only once measured does it become one or the other.

>> No.9440003


You are making the same mistake most people make when they try to explain quantum mechanics: trying to relate it to classical concepts.

>A qubit is not binary

>and is emulated by 2-bit computing
Not true. A qubit is analog, not digital. Like I said in the post you commented on, it is a linear combination of |0> and |1>: |state> = A|0> + B|1>, where A and B are complex probability amplitudes such that |A|^2 + |B|^2 = 1.
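Numerically (a throwaway sketch; the angles are arbitrary values I picked for illustration), any choice of the continuous parameters gives a valid normalized qubit state:

```python
import numpy as np

# Sketch of the formula |state> = A|0> + B|1> with |A|^2 + |B|^2 = 1.
# The amplitudes vary continuously, which is what makes a qubit
# "analog" rather than a third digital value.
theta, phi = 1.1, 0.7                      # arbitrary angles (illustrative)
A = np.cos(theta / 2)
B = np.exp(1j * phi) * np.sin(theta / 2)   # complex relative phase on |1>

norm = abs(A) ** 2 + abs(B) ** 2
print(norm)  # ~1.0 for any theta, phi: always a valid qubit state
```

That continuum of (A, B) pairs is the point: it is not a 2-bit digital register with four discrete states, but a continuously parameterized two-level system.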

>quantum entanglement makes the **** mad cause it means they can't keep lying about the speed of light.

No. Quantum entanglement does not violate the laws of relativity, which say that information can't travel faster than the speed of light.

>> No.9440026


Observe two logical facts.
The bit is either one or zero.
The bit consists of electric current or absence of it.

If time contains the electric charge and the absence of it as a whole, then it is safe to assume that rapidly turning it on and off at a certain oscillation, encompassing or harmonizing with time, will render it in a position of both, since it is a fraction of the whole.

>> No.9440034


>> No.9440092

I don't have much to input but I see potential in it. To make use of quantum bullshit, you gotta apply it to things that seem to have a sort of quantum nature. I don't think quantum computers are the only option though. There are other choices such as computers that depend on organic type shit to function.

To me it seems that instead of doing calculations, we should be modeling conditions of the macro scale at the micro scale.

One of the things people have trouble grasping with quantum shit is the idea that they "know" things ahead of time, because they have to, because things would make even less sense if they didn't. But I think there are plenty of things in life that seem to have this same quality. For example lightning. It always takes the fastest path to the ground. But how the hell does it know? Fuckin magic yknow?

But yeah if we make use of quantum computers in an "analog" way, and instead of doing computations, we basically do literal microscopic modeling of things, wouldn't that be so much cooler n stuff? Instead of computing all the light in a game for example, we actually construct it in some form on an elementary level and then just take the results

>> No.9440137

>That is outrageously false.

Moron! Einstein's theory of relativity. When people first heard of time dilation they said it had to be complete bullshit, that time is the same for everyone; now we know that is not true.
Quantum bits ARE in both states, 1 and 0, at the same time. It just IS.

>> No.9440138

mah it isn't intuitive to me so it must not be real

>> No.9440151

I also created Slack channel for the people who prefer that over Discord. Here is the invite link: https://join.slack.com/t/qcbuilders/shared_invite/enQtMzAwMDQzMDA2MTY2LTg5ZDdlZjE1ZTZhODE2Zjg0OTY4MzQ1YjkzNGVlYWU3MTc2MTA5ZGIzNGI3MWM4YWFmNzE5MDU4NTFlMjM4NmY

>> No.9440172

Time dilation doesn't occur. You missed the greentext memo here, didn't you?

The only possible way of recording it at the accuracy defined by past tests is with synchronized atomic clocks, but atomic clocks, synchronized or not, do not exist, never mind true synchronization. That the second was redefined in the 60s as ~9 GHz worth of some immeasurably small cyclic occurrence of some shit happening with caesium-133 is all the proof you need, when 9 GHz computing was not feasible until after 2010, and even then requires some static, constant refrigeration technique on par with quantum computing in terms of the constant cold temperature and the size of the effort, which cannot be contained inside the small metallic box that atomic clocks present themselves as. Contrary to the implication that oscilloscopes could do the measurement at any point before 1990, even oscilloscopes with 100 GHz range and 20 billion samples per second would not be designed to analyze and count every minute modulation within the bandwidth, never mind the implication that 20 GS/s is cyclically faster than the digital possibility of a 20 GHz processor being inside the device, which by itself completely deconstructs the validity of such oscilloscopes actually performing as advertised in any way beyond a convincing lie.

It is not enough to claim that because a frequency exists, you can capture every minuscule moment of it, when mechanical is not fast enough and analog is too fast to distinguish.

The redefined scientific second is complete bullshit, and so are atomic clocks, which, used in tests to verify time dilation, are either simply desynced by flaw of design or drift for some other reason completely unrelated to time measurement, like, i dunno, turbulence for one. Any little jolt that would otherwise cause a normal computing device to hiccup and flaw would still apply to the portable atomic clocks used in such tests.

>> No.9440180

OK? I will step away quietly from the crazy man.

You do know that time dilation MUST be true, because without it GPS signals could not be computed accurately. The GPS in your phone is proof of time dilation; without accounting for it, you would have less accuracy.

>> No.9440187

"Real-World Relativity: The GPS Navigation System"

>> No.9440198

GPS isn't done by satellites, and certainly not for every phone that has GPS. GPS is done by triangulation of cellphone towers. If GPS were done by satellites, Zuma wouldn't be necessary to track aircraft flying over oceans beyond the range of radar towers, like cellphone towers. So right there everything is wrong, even before going into how you assume time dilation affects objects in orbit.

>> No.9440201

I was hoping there could be some interesting discussion in this thread (quantum computing being something I don't know much about and want to learn in more detail), but this thread seems almost entirely shitposting spawned by a sole obvious troll.

>> No.9440229
File: 114 KB, 256x256, otfr7LYS.png

>i was hoping to learn more about bullshit but it seems there is just truthposting shattering my gullible belief system

>> No.9440416

Why don't you try to make the thread interesting by asking good questions and making a useful contribution to the conversation instead of complaining?

>> No.9440856

>One of the things people have trouble grasping with quantum shit is the idea that they "know" things ahead of time

Could you please elaborate on this? Are you referring to the probabilistic nature of quantum computers?

>> No.9440860

welcome to /sci/

>> No.9442925

I'm only speaking from a popsci stoner perspective desu. The nature of the collapsing waveform, to me, is the result of an eventual interaction in which one thing propagates as a wave before it comes into contact with something, and is then able to follow a path that corresponds to how we would predict it to behave if it were a particle. Idk if lightning is actually a good example. But comparing how to predict it, a normal computer would have to calculate the constantly changing energy in the cloud and determine the location of initiation of lightning that would result in the shortest path to the ground. But particles themselves don't have to do any calculations, because they can basically think like a wave, and wherever that wave makes contact with something, it then retroactively directs itself as a particle in only that direction. So any sort of computation that doesn't rely on calculation would naturally return results much faster.

Sorry if none of that makes sense, I don't actually know shit, but it seems like we use computers to simulate all these things in nature when we should be using all those things themselves. If we could somehow create something on a microscopic level that models conditions in the real world, we wouldn't have to calculate for results, we'd simply gather them from the process. Maybe it's about logic gates. We run numbers through logic gates one at a time, but what if we had many logic gates, all with many logic gates after them, and the number we are running decides on its own which logic gates to take that would make the most sense. Whichever logic gates would bring it fastest to the result would be the ones it would choose.

>> No.9443152

Have you studied Lagrangian mechanics and the principle of least action? I believe that's fairly similar to what you're describing in your first paragraph, together with Feynman's path integral formulation of QM.

However, as far as simulation goes, it's not a novel thought; technically we already have perfect simulations of quantum mechanical systems in the form of the systems we are looking at, but they are not very programmable, controllable, or measurable, so they cannot be used to solve a broader class of problems.

>> No.9443239

People hate quantum mechanics because it completely disproves determinism, and they're still stuck in their Enlightenment mindset that the whole universe is a grand machine and, with enough variables recorded, you can predict everything.
It isn't, and there is no way, no matter how precisely you measure, to get a determined outcome. This scares them because, in their minds, it means that science can never explain everything; they will never have a perfect prediction machine. They need to get over it.

>> No.9443296

you need to understand the difference between a deterministic system as a mathematical object and causality as a philosophical concept.
we have many macroscopic stochastic systems, such as society.
