
/sci/ - Science & Math



File: 379 KB, 1600x1200, _animepaper_wallpapers_katamary-damacy_sakuyalove_20353-jpg[1].jpg
No.6900461 [Reply] [Original]

"Memcomputing NP-complete problems in polynomial time using polynomial resources"

http://arxiv.org/pdf/1411.4798.pdf

Discuss

>> No.6900465

Based on my limited understanding of things I'm excited by the title

>> No.6900472

>mathematician banter that will never amount to anything more than serving as orgasmic fuel for autists in academia

>> No.6900495

>>6900472

lol if you think a paper that has a functioning prototype to go with it constitutes masturbatory academics, you haven't been in school long enough

>> No.6900525
File: 43 KB, 500x410, KO4aW.jpg

>http://arxiv.org/pdf/1411.4798.pdf

>Most importantly, it has been proved mathematically that UMMs have the same computational power of a non-deterministic Turing machine [2], but unlike the latter, UMMs are fully deterministic machines and, as such, they can actually be fabricated.

>Although these theoretical features are very appealing, no experimental realization of such a machine, able to solve very difficult problems in polynomial time, has ever been reported.

>In this paper we demonstrate an experimental realization of a memcomputing machine, theoretically proposed in Ref. [2], and show that it can solve the NP-complete [9] version of the subset-sum problem (SSP) in polynomial time with polynomial resources.

>Here, it is worth stressing that our results do not answer the NP=P question, since the latter has its solution only within the Turing-machine paradigm: although a UMM is Turing-complete [2], it is not a Turing machine.

>Finally, it is worth mentioning that the machine we have fabricated is not a general purpose one. However, other realizations of UMMs are general purpose and can also be easily built with available technology [18–22]. Their practical realization would thus be a powerful alternative to current Turing-like machines.


So they basically built the first specialized UMM (==NDTM in power) that can solve SSP in one step (O(1)), but they didn't "solve" the P=NP question because it's not a general UMM, which in turn is not a TM?

What? What is this UMM wizardry, and why wouldn't it be proof of P=NP if the paper is factual? Can't a reduction be done? And who cares if it's not a TM: they are basically saying that they can solve NP-complete problems in constant time, which would have ENORMOUS consequences.

>pic related
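For anyone wondering what the SSP in the paper actually is: given a set of integers and a target, decide whether some subset sums to the target. A minimal brute-force sketch (the function name is mine, not from the paper) — exponential, because it tries all 2^n subsets:

```python
from itertools import combinations

def subset_sum(nums, target):
    """Decide subset-sum by brute force: try every non-empty subset.
    Exponential time: there are 2^n subsets of n numbers."""
    for r in range(1, len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False
```

This is what the machine is claimed to beat: no known classical algorithm avoids the exponential search in the worst case.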

>> No.6900555
File: 25 KB, 560x407, 1415387117926.png

I have no idea what this is about.

>> No.6900667

>>6900525
The exponential blowup in resources is being hidden by shoehorning the parallel processing needed into numbers that take exponentially many bits just to write down.

So if the memcomputing model is able to manipulate exponentially long integers in a single time step, then it's going to require exponential resources to simulate the memcomputing model.
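This objection has a classical analogue: the textbook dynamic programme for subset-sum also looks polynomial, but only in the numeric *value* of the target, not in the number of bits needed to write it down — a b-bit target can be as large as 2^b. A sketch (this is the standard pseudo-polynomial algorithm, not the paper's machine):

```python
def subset_sum_dp(nums, target):
    """O(n * target) dynamic programme: polynomial in the value of
    the target, but exponential in its bit-length."""
    reachable = {0}  # sums achievable with a prefix of nums
    for x in nums:
        reachable |= {s + x for s in reachable if s + x <= target}
    return target in reachable

print(subset_sum_dp([3, 34, 4, 12, 5, 2], 9))  # True
```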

>> No.6901408

>>6900461
>one scientific paper
first mile of a long road

>> No.6901443 [DELETED] 

>>6900461

Ayy lmao

>> No.6901493

Not even /g/ knows what the fuck this is.

>> No.6901530

> the capability of storing more information than the number of memory elements
> a number of memprocessors that scales linearly with the size of the problem
I had this idea when I first heard about the P/NP problem. I'm pretty sure everyone does at some point. Why not just have a program of infinite size, and encode all the possible configurations into it?

Unfortunately, this reduces to the same amount of computation as is required using a naive algorithm on one processor. (If you could dynamically generate that code... but that's also the same as some algorithm on one processor. There's a poly time mapping.) You're just limiting your problem to some size and solving quickly for that size, and it looks like you're solving it efficiently because you're throwing bigger and bigger lookup tables at it without considering the cost of generating those lookup tables.

Fortunately, the authors are nice enough to point out that they haven't solved P/NP. But I'm still unconvinced that what they've done here is anything new.

>> No.6901540

>>6900667
So it's trading exponential time complexity for exponential space complexity?

>> No.6901596
File: 2.23 MB, 650x294, Bruteforce.gif

ITT:
>the fuck is NP-completeness?
Something that uses less jargon and will make more sense to mathematicians:
An NP-complete problem is a problem that processes N units of data and has an upper bound of at least f(N) units of time, where f is not a polynomial. For example, if N is the amount of data put into a program, an NP-complete problem may take 2^N time to solve, instead of a polynomial like N^2 + 1.

This is significant because computers, up until now, have only been capable of solving non-polynomial programs in non-polynomial time, however this can solve non-polynomial programs in *polynomial* time.

It's really fucking magical. Where did the other operations go?

(Correct me if I'm wrong on any of this stuff, CS nerds)

>> No.6901627

>>6901596
Explain more. Why is this significant and what does it mean for actual hardware and software?

>> No.6901639

>>6901596
You're wrong on this stuff.

>> No.6901645

>>6901639
What parts?

>> No.6901661

>>6901627
Right now there are some problems that take a very long time to solve, even with the fastest computers.

This might allow us to solve those very hard problems in a regular amount of time.

>> No.6901676

>>6901596
What if the problem has a computational complexity function that is somehow approximable "enough" by a polynomial, e.g. f(n) being an infinite composition of analytic functions, and can thus be solved using polynomial resources in polynomial time?

>> No.6901680

>>6900461
>>6901493
See >>>/g/45319303

>> No.6901681

>>6900461
tl;dr

>> No.6901704

>>6901639
I'm going to assume you're trolling unless you have anything more to say.

>>6901676
You would have to ask someone with more experience here for a more technical answer, but f(n) is never that complicated; it can normally be modeled from an analytic standpoint.


Also, the limit of (g∘g∘...∘g)(n) approaches infinity such that n != 0, and g(n) != 0 ... there might be some theorem for this, but it works from an intuitive standpoint, anyway.

>> No.6901728

>>6901676
>>6901704
What? Why would you want to approximate the computational complexity of a function?

>> No.6901735

>>6901596
>>6900525
>>6901530
>>6901540
>>6901627
See
>>6900667
And
>>6901680

tl;dr: the paper is trying to do infinite-precision real numbers again. If you have those, you can do NP in polynomial time, but it's not physically possible. The precision of the paper's setup is enough to solve trivial instances but no more.
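The precision point is easy to see even with ordinary IEEE-754 doubles, which carry a 53-bit significand: values that differ below that resolution become indistinguishable. A toy illustration (not a model of the paper's hardware):

```python
# An IEEE-754 double has 53 significand bits, so at magnitude 2^53
# the gap between adjacent representable values is already 1.
print(2.0 ** 53 + 1 == 2.0 ** 53)  # True: the +1 is rounded away
print(2.0 ** 52 + 1 == 2.0 ** 52)  # False: still resolvable
```

Any analog encoding that packs 2^n alternatives into one signal needs roughly n bits of dynamic range, which is exactly the resource left out of the paper's complexity count.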

>> No.6901751

>>6900555
tl;dr

I can prove my invisible unicorn exists by measuring the lack of invisible unicorns in the room.

>> No.6901759

>>6901728
I was being fucking retarded, battery died when I was typing response.

I interpreted >>6901676 as asking, "what if the computational complexity was determined by 'an infinite composition of' [polynomials]", so I provided an incomplete, incorrect "proof", that didn't even address his question.

>> No.6901768

>>6901759
S-sorry anon :3

>> No.6901866

>>6900525
>they are basically saying that they can solve NP-complete problems in constant time
No, they are saying they can solve a specific NP problem in P time. This has happened before: problems we previously thought were outside P turned out to have polynomial-time solutions (primality testing, for example).
That is not anywhere close to proving all NP problems are in P; in fact, it's still infinitely far away.
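On the primality example: primality testing was shown to be in P by the AKS algorithm (2002), and in practice a deterministic Miller–Rabin with a fixed witness set decides 64-bit integers in time polynomial in the bit-length. A sketch (the sufficiency of this witness set for n below roughly 3.3 * 10^24 is a known result):

```python
def is_prime(n):
    """Deterministic Miller-Rabin; with these witnesses it is correct
    for all n < ~3.3e24, which covers 64-bit integers."""
    witnesses = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    if n < 2:
        return False
    for p in witnesses:
        if n % p == 0:
            return n == p
    # write n - 1 as 2^r * d with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in witnesses:
        x = pow(a, d, n)  # modular exponentiation: poly in bit-length
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a witnesses compositeness
    return True

print(is_prime(2**61 - 1))  # True: a Mersenne prime
print(is_prime(2**61 + 1))  # False: divisible by 3
```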

>> No.6902730

>arxiv

can't be true

>> No.6902771
File: 17 KB, 600x512, 1410994481867.jpg

>memecomputing
>>/s4s/