
/sci/ - Science & Math



File: 61 KB, 329x337, vision.jpg
No.4419728

>Human brain is probably a quantum computer

Discuss

>> No.4419739
File: 17 KB, 250x250, 1327352826450.jpg

>implying implications

>> No.4419738

I'm pretty okay with this hypothesis.

Photosynthesis seems to run on quantum mechanics as well, which is interesting to say the least.

>> No.4419742

>>4419738
So, is quantum computing the key for imitating nature?

>> No.4419750

this raises the question: will a sufficiently human-like quantum computer have consciousness? i think yes.

>> No.4419751

Where the hell are you going to store the quantum information without it immediately undergoing decoherence?

>> No.4419753

>>4419751
why can't neurons do this?

>> No.4419754

>>4419738
All chemistry is quantum mechanics. Quantum computing it is not.

>> No.4419759

>>4419728
No it isn't, you fucking moron. The fact that you even suggest that shows how little you know about neuroscience, and the fact that you said "probably" without providing any evidence whatsoever shows that you're a complete idiot.

>> No.4419761

I thought neurological signalling was at the molecular level. You know, what with molecules being the basis for all signalling.

>> No.4419764

Where's your evidence?

Actual brain researchers would disagree.

>> No.4419765

>>4419754
Chemistry segues into quantum mechanics for very simple, very small structures. Even then-

Fuck, you're a troll. Nevermind.

>> No.4419775

>>4419765
You certainly need quantum mechanics to understand the highly delocalized molecular orbitals of the chlorophyll molecule.

>> No.4419779

>>4419742
For simulating thought? Nope; it would be central to it, but we cannot come close to examining and representing the myriad connections, structures, processing, or methodology of the brain.
We certainly have almost no concept of mind.

>> No.4419798

>>4419753
neurons are transmitters, not memory chips.
Remember, we're not trying to make a mind resemble a computer, but the other way around.

>> No.4419806

>>4419761
>I thought neurological signalling was at the molecular level.
>You know, what with molecules being the basis for all signalling.
Cellular level; molecules are part of the method, not the foundation.

Or, put the other way around: to call molecules the foundation, each molecule would have to be individually significant.

>> No.4419812

>>4419798
well then this brings me back to a question i've been asking for years.

what the fuck is memory and how is it stored/how does it work? is it about the connections between neurons?

>> No.4419883

>>4419812
The last description I read talked about how the pathways activated by any new experience pass through, or occur near, similar existing pathways; this creates associations that, over time, link a memory to more and more other things.
That was fine, but it explained how associations are built, and how we relate one memory to another, rather than how the data is 'stored' per se.
The implication was that more experience sharpens all kinds of related memories; it fits nicely with the speedy development of intelligence and knowledge in children.
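For what it's worth, here's a toy Python sketch of that 'associations rather than storage' idea (the structure and numbers are mine, not from the description I read):
```python
from collections import defaultdict
from itertools import combinations

# Toy model: an "experience" is just a set of features.
# Whenever two features occur together (their pathways overlap),
# the association between them gets a little stronger.
association = defaultdict(float)

def experience(features, strength=1.0):
    for a, b in combinations(sorted(features), 2):
        association[(a, b)] += strength

def related(feature, top=3):
    # "Recall" here is just following the strongest associations,
    # not looking up a stored record.
    scores = {}
    for (a, b), weight in association.items():
        if feature in (a, b):
            other = b if a == feature else a
            scores[other] = scores.get(other, 0.0) + weight
    return sorted(scores, key=scores.get, reverse=True)[:top]

experience({"beach", "sand", "seagull", "sunburn"})
experience({"beach", "sand", "ice cream"})
experience({"sand", "desert", "heat"})
print(related("sand"))  # 'beach' wins: it co-occurred with 'sand' twice
```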

>> No.4419886

>>4419812
>connections between neurons
I've been studying this for a little while and have come across this:
http://www.numenta.com/

Is this a step in the right direction?

>Explained:
A hierarchy of transmission between layers that correlate to a transitional context of information.

>> No.4419913

>>4419886
>A hierarchy of transmission between layers that correlate to a transitional context of information.
That doesn't really mean anything.
it's saying there is a relationship of tissues,
the relationship is transitional,
but not much more.

That definitely doesn't describe a method.

>> No.4419919
File: 60 KB, 750x600, 1266749743357.jpg

>>4419728
>Human brain is probably a quantum computer

Nope

/thread

>> No.4419927

>>4419913
I could go into detail, but it's just easier to watch his talks:
http://www.youtube.com/watch?v=oozFn2d45tg

>> No.4419929

>>4419886
>I've been studying this for a little while and have come across this:
>http://www.numenta.com/

>Is this a step in the right direction?

Oh, I see; it's a data-mining algorithm, not a model of mind.
It's a good step for making use of data, but it won't tell you anything about human mind or memory.
It might reveal something about critical thinking processes, or valuation, or correlation (which I find fascinating) by modelling.

>> No.4419937

>>4419929
>data-mining algorithm, not a model of mind
Thanks for replying. Why do you think that this isn't a feature of the mind?

>Side note:
Even if this is not the answer (and it most surely isn't at this point in time), wouldn't the brain logically function on a hierarchy of information?
Example: Not needing to relearn the alphabet to learn a new word.
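Something like this is what I picture, as a toy Python sketch (my own made-up example, not anything from Numenta):
```python
# Toy illustration of a hierarchy of information: the lower layer
# (letters) is learned once and simply reused by the higher layer
# (words); learning a new word never touches the alphabet layer.
alphabet = {c: f"sound-{c}" for c in "abcdefghijklmnopqrstuvwxyz"}  # layer 1

vocabulary = {}  # layer 2, built on top of layer 1

def learn_word(word, meaning):
    # Only the new, higher-level association is added; the letters
    # themselves are assumed to be known already.
    assert all(c in alphabet for c in word), "would have to learn new letters first"
    vocabulary[word] = {"letters": list(word), "meaning": meaning}

learn_word("cat", "small furry animal")
learn_word("catalog", "an organised list")  # reuses c, a, t for free
print(vocabulary["catalog"]["letters"][:3])  # ['c', 'a', 't']
```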

>> No.4419951
File: 4 KB, 125x125, i_are_smart_mug-p168402319706766338z8xyg_125.jpg

Well, consider that new neurological connections are constantly being made. The sheer number of connections in the frontal lobes of your brain forms an interconnected system far more advanced than today's greatest computer. It is also believed that the mind has an endless memory capacity; even though we may not be able to draw on that stored information at every whim, it will always be there. Whereas a computer's memory may be overwhelmingly large, it still has its limits.

>> No.4419962

>>4419927
From the blurb about it:
>[Hawkins] gives a brief tutorial on the neocortex and then explains how the brain stores memory

Except he does neither of those things:
he does absolutely NOTHING to describe the neocortex,
and the metaphor he uses to describe processing is nothing but the classic Turing machine paper-tape model.

He does seem to be on his way to a decent discussion about context being relevant to processing, but by the end of this bit it doesn't look good.

>> No.4419965
File: 26 KB, 333x333, 1326334898802.jpg

it's not a quantum computer. it's not like a neuron can be in superposition or some wacky shit.
it's a kind-of-sort-of digital computer that was programmed through an incredibly convoluted natural process.

>> No.4419970

>>4419951
>endless memory capacity
This is physically impossible. Simplest example: you would need to be able to memorize everything that can possibly be remembered. There is an infinite amount of "information". There are only so many neurons that can fit within the bony confines of your skull.
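Back-of-envelope version, with very rough, commonly cited ballpark figures (every number here is an assumption, but the conclusion doesn't change):
```python
# Very rough upper bound on brain storage, just to show it is finite.
# Every number here is an order-of-magnitude ballpark, not a measurement.
neurons = 8.6e10              # ~86 billion neurons (commonly cited estimate)
synapses_per_neuron = 1e4     # ~10,000 synapses per neuron (rough average)
bits_per_synapse = 5          # a few bits of distinguishable strength per synapse

total_bits = neurons * synapses_per_neuron * bits_per_synapse
print(f"~{total_bits / 8:.1e} bytes")  # on the order of 10^14 bytes: huge, but finite
```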

>> No.4419977

>Implying the mind isn't separate and distinct from the body

>> No.4419981

>>4419962
>Only watched the "introduction for retards"
>Didn't skip to the 4th video to see if it was "instantly understandable"
Just skip ahead. If it looks easy-peasy, then let me know.

>> No.4419992

>>4419981
>meant 3rd

>> No.4420011
File: 4 KB, 222x211, 1303611360098.png

>>4419728
>Physical reality is probably quantum mechanics

Discuss

>> No.4420048

>>4419951
>The sheer number of connections in the frontal lobes of your brain forms an interconnected system far more advanced than today's greatest computer.
More complex, not more advanced; or, to put it differently: the brain is better at dealing with movement, shapes, and patterns. To be more advanced, it would have to be consistent and accurate, wouldn't it?

>It is also believed that the mind has an endless memory capacity,
I think the current assumption is that it has about 300 years of capacity.

>Whereas a computer's memory may be overwhelmingly large, it still has its limits.
Capacity limits, but almost none for accuracy, reliability, or capability to be applied to new algorithms.

>> No.4420063

>>4419937
>>data-mining algorithm, not a model of mind
>Thanks for replying. Why do you think that this isn't a feature of the mind?
It could be; it's just that the fact that it is useful doesn't mean it describes what is actually happening.
The Turing model doesn't reflect relationships well, for instance; it's very digital.

>>Side note:
>Even if this is not the answer (and it most surely isn't at this point in time), wouldn't the brain logically function on a hierarchy of information?
That would be a familiar method, but it doesn't seem to be what we actually do.
What we do is definitely related to relationship-building and context. We know things by other things they are connected to or similar to; not by categories they fit under.
That kind of hierarchy is a very useful critical-thinking method, but it has to be learned and applied; it doesn't happen naturally for us.

>Example: Not needing to relearn the alphabet to learn a new word.
If you mean how to say it: right, we know about sounds by other words we know, and the significance of context and other letters.
If you mean its meaning: we don't learn meaning by letters, but by chunks of words or related terms. The chunks or similarity refers to other words which have meaning to us (because of their relationships, iterative).
Even if we learn a wholly new word, we are probably having to put it into context of a sentence we understand before we have much idea of meaning to apply.
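A crude toy sketch of that last point in Python (entirely my own construction, just to illustrate 'meaning from the company a word keeps'):
```python
from collections import Counter, defaultdict

# Toy version of "a new word gets its meaning from the sentences around it":
# no definition is ever stored, only which known words it keeps company with.
context_of = defaultdict(Counter)

def read_sentence(sentence):
    words = sentence.lower().split()
    for i, word in enumerate(words):
        for other in words[:i] + words[i + 1:]:
            context_of[word][other] += 1

read_sentence("the kettle whistles when the water boils")
read_sentence("she filled the kettle with water for tea")

# 'kettle' is now loosely characterised by its neighbours
print(context_of["kettle"].most_common(3))  # [('the', 3), ('water', 2), ...]
```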

>> No.4420141

>>4420063
>The Turing model doesn't reflect relationships well
The Turing model describes relationships just as well as binary bits describe computer programs. I'm thinking far more abstractly than that.

>That would be a familiar method, but it doesn't seem to be what we actually do.
Why? It seems to me that it is a primary methodology for our learning process. *Build the lowest level first as it happens faster* *Build onto the next level as those patterns from the first layers become obvious* *Repeat*

>If you mean how to say it
Say it? Verbally? No, I don't mean that nor do I fully understand your argument on this point. I wouldn't mind if you rephrased it in some way. Sorry.

>If you mean its meaning
I didn't mean this... but I can still apply it to the inverse of my reasoning.
Example: What is water? (simple idea) H2O? A liquid? Something to drink? Each of these ideas branches out into its own meaning and is therefore a child in the hierarchy. This is not to be confused with the fact that each of these ideas also exists as a parent to other children that can easily loop back around to "What is water?" (see the toy sketch at the end of this post).
>This is the reason dictionaries try to avoid using similar verbiage: because we form "meaning" in a hierarchically recursive way.

>What I mean:
You know the alphabet. You can write a few words. Someone tries to show you a new word. You don't need to relearn the alphabet... really... you just learn that this "word" (with its own meaning) connects to this whole other context. A context that is far simpler and can easily be divided into pieces, which in themselves also have a context and meaning.

>Any thoughts?
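To make the water example concrete, here's a toy Python graph (the contents are my own invention, just to show the loops):
```python
# Toy concept graph for the water example: every idea has children, and
# following the children can loop straight back to the parent.
concepts = {
    "water": ["H2O", "a liquid", "something to drink"],
    "H2O": ["hydrogen", "oxygen", "water"],               # loops back
    "a liquid": ["flows", "takes a container's shape", "water"],
    "something to drink": ["quenches thirst", "water"],   # loops back again
}

def expand(idea, depth=2, seen=None):
    # The depth limit and the 'seen' set are what keep the loops
    # from recursing forever.
    seen = seen if seen is not None else set()
    if depth == 0 or idea in seen:
        return [idea]
    seen.add(idea)
    children = concepts.get(idea, [])
    return [idea] + [item for child in children for item in expand(child, depth - 1, seen)]

print(expand("water"))  # 'water' keeps reappearing as a child of its own children
```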

>> No.4420282

>>4420141
>The Turing model doesn't reflect relationships well
>The Turing model describes relationships just as well as binary bits describe computer programs.
Yes; that's what I'm saying: computer programs do not relate metadata, relationships, correlations, or similarities the way a human does.

>That would be a familiar method, but it doesn't seem to be what we actually do.
>Why? It seems to me that it is a primary methodology for our learning process. *Build the lowest level first as it happens faster* *Build onto the next level as those patterns from the first layers become obvious* *Repeat*
That presumes organization in the first place: we rarely have that kind of understanding of a new thing.
What we do, instead, is learn something and its relationships to something else, then later categorize it into a hierarchy, once we have 3 or more things to build one from. The hierarchy might be the most useful to us, but it doesn't seem to be the first thing we learn.

>> No.4420397

>>If you mean how to say it
>Say it? Verbally? No, I don't mean that
I mean, pronunciation is one of the first things we learn if we are taught a word, and since you used the alphabet context, I thought pronunciation might be what you had in mind.

>Example: What is water? (simple idea) H2O? A liquid? Something to drink?
>Each of these ideas branches out into its own meaning and is therefore a child in the hierarchy.
Ah; I can work with this; you are making connections to other related things here. You are not hierarchically recalling information: water> definition, makeup, properties> relevance, uses, cultural> forms found in, places found, places used.
Hierarchical information is strictly categorized, and the generality or importance of each fact is the most important thing.
Relative correlations just make connections; clusters of related facts, each of those with related facts, no particular organization (at first).
(In fact, without a broad context, the hierarchy has to be able to change drastically, so it might be impossible to 'file' information that way even after considerable knowledge.)

>> No.4420408

>>4419977
>dualism
>>>/x/

>> No.4420412

>>4420282
>Yes; that's what I'm saying.
You're claiming that there are things that are uncomputable. I have to question: why?

>it might be impossible to 'file' information that way even after considerable knowledge
Ooooh, you might be right. That makes sense... hmmm.... man, you're making me wanna research this for another few months....

>Question:
So, are there any existing theories about relative correlation algorithms? (... that you know are worth reading)

>> No.4420427

>>4420412
I will only suggest looking at the people writing about Theory of Knowledge stuff,
instead of the Critical Thinking stuff.

Critical Thinking people can't seem to decide what their topic even is, and a large amount of it seems to be about how to apply it to business decisions in particular.

>> No.4420433

>>4420427
Most (if not all) of the AI course at my uni is based around Critical Thinking. How can I bring this up to my professor?

I'd like to present him with some information to get him on the right path to distinguishing the two... that, or have an AI II course that illustrates Theory of Knowledge.

>> No.4420446

>>4419728
nope

just billions of little microprocessors all intricately wired.

>> No.4420448

>10 years later
>>Human brain is probably a <insert appropriately futuristic technology> computer
>Discuss

>> No.4420457

>>4420448
>What you said:
>10 years later
>>Human brain is probably a <insert appropriately futuristic technology> computer

>What will happen
>X years later
>>Human brain is mimicked by metallic-processing technology

>> No.4420476

>>4420457
>1960
>I believe that in 20 years AI will surpass humans in every aspect

>> No.4420480

>>4420476
>2012
>I believe that eventually AI will surpass humans in every aspect... it's just a matter of time and advancement.

>> No.4420548

>>4420433
Oh, I won't criticize all books on the topic; he must have found some that applied to what he needed properly.

But I think I should clarify what I liked about Theory Of Knowledge writing: it includes a lot of psych and introspective examination.
So if you are good at introspection, or (better) can participate in a good group discussion, you can reach some really interesting perspective on all this stuff.
(My students love those discussions a lot!)

>> No.4420568

>The human brain does not operate on a scale where quantum effects happen.

DISCUSS

>> No.4420573

>>4420548
>a good group discussion
Oh, I had hoped that my class would develop into a good group for discussion, but there is one loud tard (who shouldn't even be in the class) who will oppose every point of view, even if he has to defend that jumping off a bridge is the most logical thing to do.

>That being said, in some situations it IS the most logical/omniscient thing to do... but he has no idea what that means.

>> No.4421019
File: 59 KB, 360x532, ProperSitting_eng.jpg

>>4420573
>>4420568
Yeah, I read that quantum superposition in the brain doesn't last long enough to affect DNA-folding proteins.
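For reference, the ballpark numbers behind that kind of decoherence argument (order-of-magnitude estimates only; treat every figure as an assumption):
```python
# Ballpark comparison behind the decoherence objection (order of magnitude
# only; the exact values depend heavily on the assumptions made).
decoherence_time_s = 1e-13    # generous end of estimates for neural superpositions
neural_signal_time_s = 1e-3   # roughly the timescale of neural firing

ratio = neural_signal_time_s / decoherence_time_s
print(f"a superposition would have to survive ~{ratio:.0e} times longer to matter")
```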