
/sci/ - Science & Math



File: 350 KB, 670x1023, FE568FC1-2E1E-4B00-BA5F-0F834EFDC9BE.jpg
No.10692082

Machine Learning didn't have a chance to explode until now, with the Information Age bringing us massive quantities of data. Scientists can parse through this data, but it takes massive amounts of time and energy.
Moore's law is still right, but we won't need to worry about shrinking transistor lengths any more. Our computing power will grow exponentially through quantum computing.

>> No.10692102

>>10692082
Moore's law is only a specific instance of exponential progress in information processing. Exponential progress can continue in CPUs up to Yottaflop levels once we discover room temperature superconductors.

>> No.10692103

>>10692082
Quantum computing is only useful for an extremely small subset of algorithms, and it offers no advantage for algorithms that can already be computed efficiently on classical computers. Will it change things? Yes. But it won't rescue Moore's law for classical computing tasks.

That being said, I think Moore's law is overrated, since we can keep squeezing more performance out of a fixed transistor count using techniques like SIMD instructions.
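To make that concrete, here's a toy Python model of the SIMD idea. This is not real vector intrinsics, and the 8-wide lane count is just an illustrative assumption (roughly AVX2 on 32-bit integers): one vector instruction does the work of several scalar instructions, so more gets done per instruction issued.

```python
# Toy model: count "instructions" issued for the same element-wise add.
def scalar_add(a, b):
    out, ops = [], 0
    for x, y in zip(a, b):
        out.append(x + y)
        ops += 1                  # one "instruction" per element
    return out, ops

def simd_add(a, b, lanes=8):
    out, ops = [], 0
    for i in range(0, len(a), lanes):
        out.extend(x + y for x, y in zip(a[i:i + lanes], b[i:i + lanes]))
        ops += 1                  # one "instruction" per 8-wide chunk
    return out, ops

a = b = list(range(64))
r_scalar, scalar_ops = scalar_add(a, b)
r_simd, simd_ops = simd_add(a, b)
assert r_scalar == r_simd        # identical results
print(scalar_ops, simd_ops)      # 64 8
```

Same answer, eightfold fewer instructions issued, which is the sense in which you get more out of the hardware you already have.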

>> No.10692123

>>10692082
> Moore’s law is still right

There's a reason it was right for so long. What most people don't realize is that Moore's law was a prescriptive law, like a speed limit, not a natural law like gravity. For forty years CPU manufacturers used it as a general guideline for how soon they should get a new product to market.

>> No.10692172

>>10692082
before quantum:
-3D chips aka multi-layer
-transistors that have more than 2 states (on/off)

>> No.10692246

>>10692082
Moore's law was a Sunday-funday article in a random magazine.
I can't believe that people took it as God's will.

>> No.10692252

>>10692246
>what is exponential growth
In any explosion, the doubling time is worth knowing.
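And the doubling time follows directly from the growth rate; a minimal sketch in Python:

```python
from math import log

# Doubling time from a growth rate: a quantity growing by factor g per
# period doubles every log(2)/log(g) periods. Moore's-law-style growth
# of ~41% per year recovers the familiar two-year doubling.
def doubling_time(growth_factor):
    return log(2) / log(growth_factor)

print(round(doubling_time(1.41), 2))   # 2.02 years
```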

>> No.10692298

>>10692082
>Our computing power exponentially grows through quantum computing
No it doesn't; quantum computing is just a different type of computing.
Regular computers are still stronger and will remain so for an extremely long time, if not forever (with "forever" meaning humanity dying out, nobody using these computers anymore, the universe ending, etc.).

>> No.10692914

>>10692082
The only people who think quantum computers can beat transistor computers are CS people who have never been in a physics lab, and everyone who works with quantum hardware says this.

Multi-qubit and even single-qubit systems are extremely hard to set up and keep free of noise. This is not what things look like before a big breakthrough; we have already slammed up against the asymptotic limit of this kind of tech.

In 2009 they got Shor's algorithm working for the number 15, and there hasn't been anything since.

The main application of quantum computing is to simulate other quantum reactions.

Look up the Deutsch problem. They invented this problem as an example of something a QC could do better than a traditional computer, and in 30 years no one has found an application for the algorithm.
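For anyone curious what the Deutsch problem actually is, here is a toy state-vector simulation in Python (my own sketch, no QC library assumed): the algorithm decides whether f: {0,1} -> {0,1} is constant or balanced with a single oracle call, where a classical computer needs two.

```python
from math import sqrt

# Two qubits; basis index = 2*x + y for |x>|y>, so qubit 0 is the high bit.
H = [[1 / sqrt(2),  1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]

def apply_1q(state, gate, qubit):
    """Apply a 2x2 gate to one qubit of a 2-qubit state vector."""
    shift = 1 - qubit                      # bit position of this qubit
    new = [0.0] * 4
    for i in range(4):
        bit = (i >> shift) & 1
        for b in (0, 1):
            j = (i & ~(1 << shift)) | (b << shift)
            new[i] += gate[bit][b] * state[j]
    return new

def oracle(state, f):
    """|x>|y> -> |x>|y XOR f(x)> as a permutation of amplitudes."""
    new = [0.0] * 4
    for x in (0, 1):
        for y in (0, 1):
            new[2 * x + (y ^ f(x))] = state[2 * x + y]
    return new

def deutsch(f):
    state = [0.0, 1.0, 0.0, 0.0]           # start in |0>|1>
    state = apply_1q(state, H, 0)
    state = apply_1q(state, H, 1)
    state = oracle(state, f)               # one oracle call only
    state = apply_1q(state, H, 0)
    p1 = state[2] ** 2 + state[3] ** 2     # P(first qubit measures 1)
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))   # constant
print(deutsch(lambda x: x))   # balanced
```

Which also shows the poster's point: it works, it's elegant, and nobody has ever needed it.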

>> No.10692937

>>10692082
Wow, another quantum computing thread made by someone who has no idea what quantum computing is! Actually, you probably have no idea how classical computing actually works in the first place anyway.

t. PhD student in quantum information group

>> No.10692942
File: 172 KB, 863x498, 1544587974555.jpg

>>10692082
>quantum computing
I'll have the London Broil, medium rare please.
>I had this conversation talking with a pal at … a nice restaurant near one of America’s great centers of learning. Our waiter was amazed and shared with us the fact that he had done a Ph.D. thesis on the subject of quantum computing. My pal was convinced by this that my skepticism is justified; in fact he accused me of arranging this. I didn’t, but am motivated to write to prevent future Ivy League Ph.D. level talent having to make a living by bringing a couple of finance nerds their steaks.
https://scottlocklin.wordpress.com/2019/01/15/quantum-computing-as-a-field-is-obvious-bullshit/

>> No.10692954

>>10692942
>the number of states we’re talking about here for a 4000 qubit computer is ~ 2^4000 states! That’s 10^1200 or so continuous variables we have to manipulate to at least one part in ten thousand.
Wow, another pseud who has no idea how basic quantum control works (let alone quantum computation). Into the trash it goes.
>actually cites Dyakonov’s confused senile ramblings at the end
o i am laffin
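For what it's worth, the state-count arithmetic in the quoted passage does check out:

```python
from math import log10

# Digit count of 2**4000: it has about 4000*log10(2) ~ 1204 decimal
# digits, consistent with the quote's "10^1200 or so" amplitudes.
print(round(4000 * log10(2)))   # 1204
```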

>> No.10692963
File: 29 KB, 400x400, 1553282172960.jpg

>>10692942
>Our waiter was amazed and shared with us the fact that he had done a Ph.D. thesis on the subject of quantum computing
>>10692937
>t. PhD student in quantum information group
OH NO NO NO

>> No.10692969
File: 169 KB, 1080x810, perel.jpg

>>10692942
>A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor a n-bit number is 72 * n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon.
I'm not a physicist, but isn't this clearly wrong?
https://phys.org/news/2014-11-largest-factored-quantum-device.html

>> No.10692991

>>10692969
>I'm not a physicist, but isn't this clearly wrong?
No. That article is about a useless method, applied purely as a way to grab headlines, which works for only a very small collection of special integer cases. It's not Shor's algorithm.

>> No.10692995

>>10692103
We've got another 20 years before bottoming out on silicon.

>> No.10693954

>>10692995
at the minimum, at least before other breakthroughs.

or alien tech gets shared if we ever meet them

>> No.10693998

>>10692102
>the room temperature superconductor guy again
Yeah, and warp drive will let us continue our exploration into space into the yottameters.

>> No.10694290

>>10693998
It sounds easier to make a room-temperature superconductor than to exceed the speed of light.

>> No.10696048

>>10692937
Quantum computers don't realistically exist, and even if they did, how do you switch to an entirely different mindset from Boolean logic when CPUs are built on flip-flops, clock cycles, and binary concepts? Most programmers barely grasp states being true and false.

>> No.10696054

>>10692102
>Moore's law is only a specific instance of exponential progress in silicon manufacturing processes
fixed for ya

>> No.10696057

>>10692082
Quantum computing doesn't bring massive quantities of data; that claim has no logic behind it. Storing data in DNA instead of on disks, though: that's when data mining will truly be at its worst. It's getting more cost-effective day by day.

>> No.10696106

The solution to the limitations of Moore's law = virtualization.

>> No.10696125
File: 646 KB, 904x401, 1537911493072.png

>>10696106
>we can't make chips run any faster so we'll just use them to emulate a faster computer

>> No.10696213

>>10696125
just do many calculations at once and join them together; combine parallel and distributed processing into one solution
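A minimal sketch of that "split, compute at once, join" idea in Python. Threads are used here only to show the structure; because of Python's GIL, real CPU-bound gains would need processes or separate machines.

```python
from concurrent.futures import ThreadPoolExecutor

# Split the work into chunks, process them concurrently, join results.
def chunk_sum(chunk):
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))   # join partial results

print(parallel_sum_squares(list(range(1000))))    # 332833500
```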

>> No.10696218

When can I have a quantum desktop?

>> No.10696237

quantum is a meme in the short term

in the meantime processing will keep improving due to transistors still shrinking, CPUs being replaced by hybrid/heterogeneous SoC chips with accelerators (GPUs, FPGAs, neuromorphic), and applications being mapped much more efficiently to the computer architectures

>> No.10696239

>>10696057
>Storing data in dna though instead of disks that's when data mining will truly be at it's worst.
>>10696106
>The solution to the limitations of Moore's law = virtualization.
>>10696213
>just do many calculations at once and join them together; combine parallel and distributed processing into one solution
>>10696218
>When can I have a quantum desktop?
This thread is both hilariously retarded and yet also not any more retarded than where the professional quantum computing field is at today.

>> No.10696259
File: 86 KB, 638x1000, uTXY2Yi.jpg

>>10696213
this is my most appropriate wojak

>> No.10696268

>>10696057
>Storing data in dna though instead of disks
that's a really stupid idea though.

>> No.10696305

>>10696239
Computing is one of those things where every idiot thinks they know something about it just because they use a computer.

>> No.10696357

>>10696305
Yeah, actual computing is bad enough for this, but when you append "quantum" to it you get some seriously next-level retardation. I can't imagine ever voluntarily deciding to go into the "quantum computing" field, and I'm not at all surprised to hear about those quantum computing PhDs ending up as waiters or grocery store cashiers. It's literally the most memed-up concept in existence to date. People who get impressed and imagine magic futuristic flapdoodle whenever they hear the word "quantum" are getting what they deserve, I guess.

>> No.10696406

>>10696125
No matter how fast or efficient you go, you will still hit limits on what you can fit in your pocket or on your desktop, and so you will have to turn to virtualisation.

>> No.10696540

>>10692082
>Machine Learning didn’t have a chance to explode until now
So-called 'machine learning' is HIGHLY overrated and TOTALLY untrustworthy.

>> No.10696554

>>10696406
do you even know what that means?

>> No.10696557

>>10696540
No. Machine learning, unlike quantum computing, isn't a meme at all; it's both overrated and underrated depending on the context. You can't say enough good things about the ability to decouple what tasks a program accomplishes from the need for explicit instructions on how exactly to perform those tasks.
But at the same time, to your point, people who don't work with it (mostly salesmen and business-exec buyers) imagine it's an unlimited magic solution for everything, when you're not going to get any benefit out of trying to model a functional relationship that never existed in the first place.
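A toy illustration of that last point (pure Python, made-up data): a fitted linear model scores near-perfectly when a real relationship exists, and near zero when there is nothing to model in the first place.

```python
import random

# R^2 of a least-squares line fit: fraction of variance explained.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

rng = random.Random(0)                                  # fixed seed
xs = [i / 10 for i in range(100)]
signal = [2 * x + 1 + rng.gauss(0, 0.1) for x in xs]    # real relationship
noise = [rng.gauss(0, 1) for _ in xs]                   # no relationship

print(r_squared(xs, signal))   # close to 1
print(r_squared(xs, noise))    # close to 0
```

No amount of clever fitting turns the second case into anything useful, which is exactly the failure mode the salesmen don't mention.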

>> No.10696569

>>10696557
I never said it was a 'meme'; I said it was *untrustworthy*.
>But at the same time to your point people who don't work with it (salesmen and business exec buyers mostly) will just imagine it's an unlimited magic solution for everything when you're not going to get any benefit out of an attempt to model a functional relationship that never existed in the first place.
THIS is why I said it's *overrated*.
I should also say it's *dangerous*, simply because most people think it's Fairy Dust and Unicorn Farts and can do anything. Fact of the matter is, it can do damned little, and when you get some strange output or action from the thing, you have *no idea* why it did what it did and *no way* to find out. That's one of the reasons why I'll have nothing to do with SDCs; I know better.

>> No.10697197

I'm going to visit D-Wave tomorrow.
Give me one (1) good question to ask them.

>> No.10697293

>>10696569
just wait until they solve the oracle problem
it will all fall into line after that

>> No.10697430

>>10697197
Why do they continue to delude themselves and the public into thinking that a quantum annealer is a quantum computer?

>> No.10697738
File: 30 KB, 400x374, 1509721680814.jpg

>>10692942
>https://scottlocklin.wordpress.com/2019/01/15/quantum-computing-as-a-field-is-obvious-bullshit/

Wow, that's a real eye opener.

How did this attain any legitimacy?

>> No.10697776

>>10697430
OK, but what if the D-Waves are not in the box? Then where is the box, and how is that verified?

>> No.10697781

Reminder that nobody has ever factored a number on a quantum computer without knowing the factors in advance.