
/sci/ - Science & Math



File: 58 KB, 300x444, cave johnson.jpg
No.3243830

"The point is if we can store music on a compact disc why can't we store a mans intelligence and personalty on one?" Really though, will it ever be possible to put your consciousness into a computer? Guesses at to When? How? etc

>> No.3243841
File: 52 KB, 363x360, 1305518008461.gif

>>3243837

The grass under me had turned to a sheepskin rug. The moon, still where it was, now hung framed in a picture over the fireplace, the red flames reflecting on the white marble floor much as the moonlight had once shimmered in the pond. The girl sat in a slung bucket chair with a high flared back fit for art deco royalty. Between us was a chess board in late play, her pieces—a full set including two queens—white, my lone piece black.
"Or you wouldn’t be here,"” I corrected. "I actually have a brain."
She cocked her head at me, so I explained: "You’re being run on our general purpose simulator. I'm jacked in from the real world."
"You’re still alive?!"
"No, no—I’m a tinc."
"Oh. Yes. That’s very confusing," she said, shaking her head. "So, you’re an artificial intelligence in the real world talking to a real intelligence in an artificial world."
"Except that I was once human as you were."

>> No.3243837
File: 30 KB, 350x300, 1305517725975.gif

"Backing up, you said the brain is like the computer and the mind like the program. But you can’t put one person’s mind inside another person’s brain, can you?"
"Yes, true. A better analogy for the brain than a computer would be, oh, a graphics chip. A graphics chip is some specialized hardware that implements a particular set of algorithms very quickly—algorithms to draw pictures. But the same algorithms can be implemented in software on a general purpose computer, just not as fast. So the brain is like a graphics chip, with a great deal of the ’software’ built-in to the hardware. So, you are right, you can’t just move a mind from one brain to another because each brain, and correspondingly each mind, is unique. You can, however, move a mind from a brain into a generalized brain simulator, just as you could read the circuits and firmware of a graphics chip and run—simulate—them on a general purpose computer."
"Obviously,"” she said, “"or we couldn’t be here. Your move."

>> No.3243847
File: 31 KB, 512x384, emergence.gif

"You freeze the brain, you slice it very thinly, these very thin slices you scan using an electron microscope or some other form of microscope, at the nanometer precision. You get these images, you process them in a computer three-dimensional model -- Where are the different synapses and neurons, what are the strengths, what are the connections? You use that to create a computer simulation of the brain, and then you start the simulation."

-- Anders Sandberg, Computational Neuroscientist, Royal Institute of Technology, Sweden.
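
As a rough sketch only, the pipeline described above looks something like this in code. Every function here is a dummy placeholder running on toy data; none of the real steps (nanometre-scale imaging, 3D reconstruction, full-scale simulation) exist as software you could actually call today.

[code]
def slice_brain(brain):
    # stand-in for freezing the brain and slicing it very thinly
    return [brain[i:i + 2] for i in range(0, len(brain), 2)]

def scan_slice(brain_slice):
    # stand-in for electron-microscope imaging at nanometer precision
    return {"image_of": brain_slice}

def reconstruct_model(images):
    # stand-in for the 3D model: which neurons, which synapses, what strengths
    return {"neurons": len(images), "synapses": len(images) * 3}

def run_simulation(model, steps=5):
    # stand-in for actually running the emulation
    for t in range(steps):
        print(f"t={t}: stepping {model['neurons']} neurons, {model['synapses']} synapses")

frozen_brain = list(range(10))  # dummy "frozen brain"
images = [scan_slice(s) for s in slice_brain(frozen_brain)]
run_simulation(reconstruct_model(images))
[/code]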

>> No.3243846

So far there don't seem to be any good arguments for why it shouldn't be possible; we just need to map out the brain completely and develop the technology to read/copy it.

>> No.3243850

> implying you have any evidence having consciousness has something to do with "algorithms"

>> No.3243852

>>3243850

Shit, who let Roger Penrose near any form of communication again?

>> No.3243864

>>3243846
It is impossible because of the laws of physics

>> No.3243866

>>3243864

Elaborate.

>> No.3243875

Humans are just complex carbon-based machines. We need to find the identity statements holding between a brain state B and a psychological state P. Then we need to create a language for said processes, interpretable by computers.

I mean, if you're a materialist/physicalist, then given an advanced enough understanding of the neurosciences there's nothing theoretically stopping us from uploading our respective minds onto a hard drive.

>> No.3243885

With brains, hardware-software is not a meaningful distinction. Heck, processor-memory-bus isn't meaningful either.

And it's not just neurons and synapses, which may themselves have native ability to compute. It's also the chemical environment, chemoclines, depletion and replenishment. Literally every physical action in the brain, and arguably the body as well, contributes to the computations going on to produce a mind.

So I would imagine one would need to simulate the physics of the brain down to the atomic level to make a proof copy. Then we could start to work on abstractions that don't damage the mind therein.

>> No.3243900

>>3243830
The day when you can x-ray a brain for neuron structures and frequencies is the day this is possible. Unfortunately the level of resolution needed is WAAAY greater than x-rays can deliver... and... they're smaller than atoms... so we'd have to find a particle even smaller and then control it, which will be harder.

Or there might be another system... perhaps quantum entanglement. But that would be very laborious.
Basically every system will involve particles and waves and is thus inherently flawed.
At most we may be able to guess at a brain using dead ones, sliced up, as a clue to go on.

>> No.3243904
File: 40 KB, 420x712, 768-1.jpg

>>3243885

>And it's not just neurons and synapses, which may themselves have native ability to compute. It's also the chemical environment, chemoclines, depletion and replenishment. Literally every physical action in the brain, and arguably the body as well, contributes to the computations going on to produce a mind.

Agreed entirely. The brain is not isolated from the body, and not just in the "Oh, we'll just hack together a nice little virtual input/output machine" sense. If the heart pumps faster, it can feed more oxygen into the brain; if the person eats something high in glucose, it will be consumed accordingly by the brain, etc. Then there's the fact that the neurons, which were once thought to be simple binary switches, are far more complex, with dendritic trees branching all over the place and connecting in all sorts of different places with varying plasticity and potential differences. Hell, even the pH of the environment affects computations. Then there are the neuroglia...

>> No.3243928

>>3243904

I can't instantly find a link, but you could google it yourself;

The experiment in evolving clocks on a circuit board?

Successful clock one looked pretty much like the clocks people design for computers. Every component explicable, with a comprehensible purpose.

Successful clock two worked FOR NO APPARENT REASON. And when they removed any component, even those not connected to the circuit, it stopped working. And if they took it out of the lab, it stopped working. Turns out it was picking up the signals from the clocks in some computer in the lab.

But the point being, the brain is that times an uncountable number. When people describe it as being as simple as getting enough switches with enough connections to match neurons and synapses, I actually wince. They don't get it.
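
For flavour, here is the general shape of that kind of experiment in code -- not the actual circuit-evolution setup, just a toy genetic algorithm. Candidates are scored only on whether the output behaves, so whatever configuration happens to score well survives, comprehensible or not.

[code]
import random

def fitness(genome):
    # toy stand-in for "does this circuit tick usefully?": reward alternating bits
    return sum(1 for i in range(len(genome) - 1) if genome[i] != genome[i + 1])

def mutate(genome, rate=0.05):
    return [(1 - bit) if random.random() < rate else bit for bit in genome]

# evolve a population of random 32-bit "circuits" purely by score
population = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]  # keep whatever works, however it works
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

print("best fitness:", max(fitness(g) for g in population))
[/code]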

>> No.3243932

This thread reminds me of a story I read in Heavy Metal Magazine. There's a group of humans, perfect really save for their shaved heads, who've basically mapped out the human body entirely, enabling them to eliminate disease, save for one small symbiote they detected deep within everyone's mind. They take defeating it as their final triumph, and decide to do it. But a skeptical woman says she isn't certain it's a good idea, because they haven't determined what its purpose is. The others scoff at her hesitance, but she insists on being the "control subject" and not taking part in the destruction of the symbiote, while everyone else on the ship does during cryostasis. When she wakes up, she finds they've all mentally devolved into apes. It was a very interesting and thought-provoking read, especially coming from Heavy Metal magazine.

>> No.3243935
File: 519 KB, 1280x1699, 1299435101309.jpg

>>3243928

Yes, I've heard of it; the clock evolved to use the radio waves from a computer on a nearby table as ticks.

Anyone who seriously thinks "it's just a matter of computing power" is delusional, but still we've made great progress. We simulated a rat's cerebellum, mother of fuck. And what happened to that open-source group that intended to upload a worm's mind?

>> No.3243936

>>3243928
>But the point being, the brain is that times an uncountable number. When people describe it as being as simple as getting enough switches with enough connections to match neurons and synapses, I actually wince. They don't get it.
And to a 15th century guy our modern computers would look infinitely complex and indestructible from magic, so what is your point?

>> No.3243943

>>3243936
> indestructible from magic
areyouawizard.jpeg

lawl.. I assume you meant indistinguishable? Still, very good point.

>> No.3243952
File: 19 KB, 119x158, 1307663263183.png

>>3243936

>indestructible

>> No.3243956

>>3243943
Yes, fucking auto spell correction...

>> No.3243963

>>3243935
Uploading is a real sticky wicket. The only people who could say for sure that it worked would be those who went through it. For the rest of us, we wouldn't even be able to take their word on it, since they would certainly still claim to be intact.

I'm just picturing the worm in cyberspace. I'll buy it when they train the worm to do a maze or something, upload it, then download it and the cloned worm does the same maze. Then they can do an insect. And a lizard, and a mouse, and upwards. But it would still be a far cry from what I would consider foolproof human uploading, that which I would subject myself to.

>>3243936
My point is that the problem is likely a generation or two of computing technology further out than many suspect. Not impossible, and I even said that some abstraction could be added without a loss of mind; we just won't know until we have the complete simulation.

>> No.3243966

>They think that a carbon based life-form can be copied onto a silicon computer chip and still actually feel

>2011

>> No.3243971

Screw uploading my mind; when am I getting those longfall boots?

>> No.3243973

>>3243966

>Vitalism
>2011

>> No.3243981

Eventually we can probably store the information needed to replicate your consciousness. I doubt that we will be able to store your continuous sentience in the interest of your immortality, though. If you died and had your consciousness recreated it obviously wouldn't really be you. It would just be an identical copy of the neural structure that made your consciousness possible.

In my opinion, immortality is only attainable if you indefinitely maintain your organic brain alongside man-made augmentations and technology.

>> No.3243988

>>3243973
Its not vitalism...

I just dont think it will work. The elements are different. Until we see life made out of anything else you cant say that its possible.

>> No.3243996

>>3243988

I don't understand, what do you think uploading is?

>> No.3243998

>>3243988

You said "feel". You are implying that a thing with the same information processes as the human brain can't really be consciouss if it's implemented in a substrate other than this very specific wet carbon organ.

Yes it's either vitalism or biochauvinism.

>> No.3244000

>>3243996
>Uploading

>Will do anything other than store a copy of your brain that doesn't actually feel anything

Nope.

>> No.3244007

>>3243998
>You are implying that a thing with the same information processes as the human brain can't really be conscious if it's implemented in a substrate other than this very specific wet carbon organ.

Thats exactly how I feel.

If it were possible for any element to be able to make up anything... The world would be a lot different.

>> No.3244009

>>3244000

Still with this thing about feeling. If you mean physical feeling, then you can hack together a little machine to handle virtual I/O with the brain. If you're talking about feelings as in conscious thought; then shit, this conversation isn't worth it.

>Cars don't move they just copy motion

>> No.3244011

>>3244009
Ya its gonna look like its feeling but its not gonna be FEELING

>> No.3244020

>>3244011

You wouldn't know until you uploaded.

And you don't know right now about anyone but yourself. Except that your brain is primed to accept humanoid things as having similar internal worlds to yourself.

>> No.3244022
File: 208 KB, 800x1230, 05_25_11.jpg

>>3244007

>If it were possible for any element to be able to make up anything... The world would be a lot different.

You don't get it. We're not talking about replacing every Carbon atom in the human brain with Silicon.

We're talking about scanning the brain, finding the regular patterns and the patterns that build them, taking those patterns, designing information processes to create, handle, and destroy those patterns, implementing it all in a lump of computing matter, and firing it up.

It's not a brain in a jar. It's a map of a brain in a map of a jar. And yes, maps are the same as terrain when the map is a full-blown reconstruction of every atom in the terrain.

>> No.3244031

>>3244022
Ya, and im saying that thats small time.

That wouldnt be useful at all...
It could store your memories for you and shit but Im gonna argue that it couldnt do much else.

>> No.3244037

>>3244031

And if you run one hemisphere on computer, and one hemisphere in brains?

>> No.3244038

>>3244031

>It could store your memories for you and shit
>it couldnt do much else.

Computer programs are static now?

>> No.3244044

>>3244037
You wouldnt be able to exist in both...
It would run and it would just continue on where you left it... It will go on without you...

How would it achieve anything?

>>3244038
computer programs have always been static

>> No.3244052
File: 6 KB, 244x246, idiot.jpg

>>3244011
Well you obviously haven't seen I, Robot.

>> No.3244055
File: 26 KB, 313x343, John Searle.jpg

>>3243928
This made my whole day. This is the crux of the argument against Strong AI in all of its forms.

>>3243998
I prefer the term 'Rational Skepticism.'

>>3244022
No it isn't, and your wild claims to that effect are pure speculative optimism. Get back to me once you handle the problem of quantum uncertainty. Or do you think a statistical approximation to the position and state of every atom in the brain will be good enough for your thought experiment?

>> No.3244056
File: 52 KB, 926x587, 1302489334488.jpg

>>3244044

>computer programs have always been static

>> No.3244070

>>3244044

There are decent arguments to be made on the side of vitalism, I really do think that. But you don't seem to get the central argument of our side.

The mind is not an object. It is a pattern and process of information, algorithms, systems. If you hold that it is possible to simulate a calculator using a physics simulator of a computer, and the simulated calculator would perform the same calculations as the real one, then, to us at least, simulating a human mind is just a harder problem. The mind inside the computer is just as 'mind-y' as the mind inside the brain. Now, one could argue about continuity of consciousness when transferring between the two states, but I don't think you can argue that either one is less conscious than the other. Unless you hold that there is some vital element of the human brain which cannot be explained by physics alone.
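
If it helps, here is the calculator point in miniature -- a toy example, not a physics simulator: the same addition done natively and done by simulating an adder circuit gate by gate gives identical answers. The substrate differs, the computation doesn't.

[code]
def gate_add(a, b, bits=8):
    # simulate a ripple-carry adder built from AND/OR/XOR "gates"
    result, carry = 0, 0
    for i in range(bits):
        x, y = (a >> i) & 1, (b >> i) & 1
        s = x ^ y ^ carry                    # sum bit
        carry = (x & y) | (carry & (x ^ y))  # carry bit
        result |= s << i
    return result

for a in range(16):
    for b in range(16):
        assert gate_add(a, b) == a + b       # same answers on both "substrates"
print("simulated adder agrees with native addition")
[/code]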

Have I been trolled softly?

>> No.3244078

>>3244055
>the argument against Strong AI in all of its forms

And yet you yourself are a strong AI.

>> No.3244086
File: 96 KB, 600x337, 1296196816725.jpg

>>3244055

>quantum uncertainty
>derp

Well anyways, simulation of every atom is horribly inefficient, but it provides a sort of laboratory setting to abstract the processes of the brain.

Which is more comfortable than laminating a brain while keeping the patient alive.

>I prefer the term 'Rational Skepticism.'

No, no, you see, he wasn't being skeptical about anything, he was outright saying computers can't feel, as if there was some sort of dualism between feelings and the rest of human consciousness, which looks like something you'd pull out of a James Cameron movie.

>computer can orchestrate global thermonuclear war
>derp i am incapable of feeling lol xD

>This made my whole day. This is the crux of the argument against Strong AI in all of its forms.

Hold on a second here. He was arguing that it was hard, not that it was impossible. It can't really be an argument against AI if all it does is say that it won't happen in our lifetimes, rather than outright disproving the possibility. Moreover, when did we start talking about artificial intelligence? This thread is about mind uploading. They are inherently separate things.

>> No.3244087

>>3244070
Hurrr

I dont care if you can develop a math problem to solve consciousness.

If you take that math problem and do it on a computer chip its just not going to do the same thing IN MY OPINION.

You are saying consciousness is a RESULT of all these algorithms and shit... IN CARBON BASED LIFE.

If you do the same algorithm with something else its different. Also you can never get consciousness down to an algorithm because math is human made... Im sure there is chaos thrown in somewhere. You cant represent chaos in an algorithm.

>> No.3244093

If we took an atomic map of the brain, learned what each cell's behavior is, programmed each type of cell in the brain, put them in every location we've seen each specific cell, and simulated the exact chemical make-up of the brain, we could possibly replicate consciousness. It makes sense that it would work, but to be honest we don't know nearly enough about the brain to even begin to start this without it being a major gamble at best. If I were to guess I'd say another 25 years until we have a proper understanding of the brain, so much so that we could replicate consciousness.
Computing power isn't the problem; it's our lack of understanding of the biology of the brain that truly limits us.
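
As a crude sketch of that idea only: made-up cell types with made-up update rules, placed and stepped together with a shared chemical environment. Nothing here is real neuroscience; it's just the shape the program would take.

[code]
import random

class Neuron:
    def __init__(self):
        self.potential = 0.0
    def step(self, env, inputs):
        # integrate inputs plus a nudge from the chemical environment, fire over threshold
        self.potential += sum(inputs) + env["glucose"] * 0.01
        if self.potential > 1.0:
            self.potential = 0.0
            return 1.0
        return 0.0

class Glia:
    def step(self, env, inputs):
        # support cell: replenishes the shared chemical environment
        env["glucose"] = min(1.0, env["glucose"] + 0.05)
        return 0.0

env = {"glucose": 0.5}  # crude stand-in for the "chemical make-up"
cells = [Neuron() if random.random() < 0.9 else Glia() for _ in range(100)]
outputs = [0.0] * len(cells)

for t in range(50):
    # each cell sees the previous outputs of two random neighbours
    outputs = [cell.step(env, random.sample(outputs, 2)) for cell in cells]

print("cells firing in final step:", int(sum(outputs)))
[/code]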

>> No.3244095

>>3244078
>A strong AI that took 1 billion years to create.

>> No.3244101

>>3244095

>without a strong AI working on the problem

>> No.3244103

>>3244095
Doesn't that just mean we have a lot to learn before we can create our own strong AI?

>> No.3244098

>>3244070
You can't argue internally, which is why you get the Turing Test ambiguity, but you can differentiate the two systems in two fundamental ways from the inside:

1) If I happen to be the individual in question I certainly have a significant 1st person experience which the 'copy' isn't going to recreate.

2) As the evolved clock example in >>3243928 illustrates, there's a significant difference between a deterministically implemented algorithmic system and an emergent one.

>> No.3244117

>>3244093
I misspoke, it would be a simulation of consciousness. Not a replication.

>> No.3244119
File: 89 KB, 894x894, Steampunk_Spider_Sculpture_10_by_CatherinetteRings.jpg

wtf a compact disk, this isnt 1999 wheres your sd cards fool

>> No.3244123
File: 340 KB, 1356x509, computer_comaprison.jpg

>>3244087

>You are saying consciousness is a RESULT of all these algorithms and shit... IN CARBON BASED LIFE.

On the left, Babbage's Analytical Engine. On the right, a planet coated in diamondoid circuits.

Both are functionally equivalent, Universal Computers as defined by Alan Turing. They can both compute the Computable Numbers, they can both simulate each other, they are both bound by what machines can do.

A file is a file is a file whether it's implemented in a cranking steam-powered computer or the most advanced gadget in all the world. The abstract information is the same.
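
The "a file is a file" point in miniature (toy example): the same bytes stored three different ways decode back to the exact same information.

[code]
import hashlib

data = b"the abstract information is the same"

# three different "storage media" for the same bit pattern
as_hex  = data.hex()
as_ints = list(data)
as_bits = "".join(f"{b:08b}" for b in data)

recovered = [
    bytes.fromhex(as_hex),
    bytes(as_ints),
    bytes(int(as_bits[i:i + 8], 2) for i in range(0, len(as_bits), 8)),
]

digests = {hashlib.sha256(r).hexdigest() for r in recovered}
print("one piece of information:", len(digests) == 1)  # True
[/code]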

>> No.3244126

>>3244098

>1) If I happen to be the individual in question I certainly have a significant 1st person experience which the 'copy' isn't going to recreate.

It may not create a continuity of consciousness. But it could create another consciousness, identical to the original. And, like I say, only that copy would know for sure.

>2) As the evolved clock example in >>3243928 illustrates, there's a significant difference between a deterministically implemented algorithmic system and an emergent one.

That example was posted by me as well. I may be dense, but I'm missing the point here. I was saying that the brain is very, very complex. This makes it a hard problem to reverse-engineer and simulate. It doesn't mean that the simulation would never be complete.

>> No.3244141

http://www.youtube.com/watch?v=acW-axefwaM

>> No.3244148

>>3244087
>most likely believes in free will

>> No.3244150

>>3244101
>Self circling reasoning

>> No.3244154

>>3244119
It's a quote from Portal 2.

The quote comes from a recording of a guy living in the 70's or something

>> No.3244155

>>3244103
No, it means the upper bound to creating an AI as powerful as a human mind is 1 billion years. Considering all the interactions that were required to create this outcome, you have to assert a currently absurd amount of computational energy.

>> No.3244177

People dont know that you cant create an AI. It has to become self-aware on its own. Otherwise its just a flawed but well replicated depiction of what its creator believes is consciousness.
Rocks dont have a consciousness and im pretty sure trees dont either. We need to figure out how to create life first.

>> No.3244179

>>3244155
The fact that it took thoughtless natural processes billions of years to develop life like us says nothing about how long it will take a concerted effort of intelligent beings to make life like us.

>> No.3244182

>>3244177
>People don't know invisible unicorns don't exist.

>> No.3244185

>>3244182
Why would it matter if they did or didnt?

>> No.3244189

>>3244182

inb4 devil's proof EXTRAVAGANZA.

>> No.3244190

>>3244155
You're not factoring in the exponential gains made by technology. It may have taken humans 1 billion years from nothing, but that's organically based, lacking the nanosecond computational power currently existing, and that's doubling every 10 months.
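
Taking that ten-month doubling figure at face value, the compounding works out like this (quick arithmetic, nothing more):

[code]
months_per_doubling = 10          # the figure claimed above
months = 10 * 12                  # one decade
doublings = months / months_per_doubling
print(f"{doublings:.0f} doublings in ten years -> x{2 ** doublings:,.0f}")
[/code]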

>> No.3244193

What if you slowly replaced failing bits of your brain with technological augmentations? If my left brain is biological and my right is technological, am I the same person? If I then replace my left brain as well, have I perceived a continuous stream of consciousness through the procedure? It goes without saying that the biological bits retain consciousness until cell death, but is that consciousness of consequence?

who am I

>> No.3244197

>>3244193
boat paradox

>> No.3244200

>>3244123
The fact that you can view two different systems in such a way to produce an equivalence between them does not mean the systems are identical, or even generally equivalent.

>>3244126
The problem is the word 'identical' and it's, in fact, always the problem people have with this. The systems created by any process we can devise will never be identical, they will be isomorphic with respect to the current state of our knowledge.

The point about the evolved clock is that it shows that when you try to reductively analyze an emergent system the analysis breaks down the functions of the system. It's basically like a version of the 2nd Law of Thermodynamics for information Entropy. If you learn how such a system works well enough to simulate it, you're basically going to destroy it.

>> No.3244210

Except, even if you did upload your brain, it would not still be YOU. Consciousness is not going to "carry over" from your physical body to a computer. It will be exactly replicated, in the same way that a rock that forms on one planet can be atomically identical to a rock on another planet, but they are not the same rock.
Your consciousness will end with your death, the transfer will not be "you", it will have created a separate "you" that thinks the transfer was a success.
The only way to achieve immortality is biologically keeping the brain alive.

>> No.3244213

>>3244190
Doesn't matter. All you need to factor in is the Graham's number of interactions that occurred to fashion the human mind in its current configuration.

All you're doing right now is waving a magic wand, which is asserted under the guise of exponential progress.

Qualitative progress != quantitative computational progress

>> No.3244216

>>3244200
>The problem is the word 'identical' and it's, in fact, always the problem people have with this. The systems created by any process we can devise will never be identical, they will be isomorphic with respect to the current state of our knowledge.

That's fine. As our knowledge increases, we'll approach the point where we can simulate a perfect brain in a computer, good enough to make a mind.

>The point about the evolved clock is that it shows that when you try to reductively analyze an emergent system the analysis breaks down the functions of the system. It's basically like a version of the 2nd Law of Thermodynamics for information Entropy. If you learn how such a system works well enough to simulate it, you're basically going to destroy it.

That's fine too. You'll destroy a few, learn which kinds of abstractions are okay to factor in, and not destroy the rest.

>> No.3244226

>>3244197
How do you know you are the same person you were 30 seconds ago?

Science depends on the "observer" so when examining what it is to be an observer I'm afraid to say you need philosophy. Metaphysics can be rational and science does play a role but yeah, this is a whole new line of thought.

>> No.3244253

>>3244216
This is precisely the kind of thinking that was very popular around the turn of the 20th century regarding Mathematics and Physics. Such unfettered optimism strikes me as being both highly irrational and highly unscientific.

If anything we've learned that our knowledge has certain tradeoffs, and that clarity in one domain comes at a price in another.

>> No.3244272

>>3244213
Someone earlier in the thread suggested we can't transfer consciousness, but rather that an A.I. would have to develop its own. Now suppose we created that A.I., and upon becoming self-aware, it develops a consciousness. However, unlike us, who took years to achieve this with our primitive organic brains, the A.I. will be able to do so near-instantaneously, in nanoseconds, with the provided computational power, thus not only achieving human-like thought processes but surpassing them.

>> No.3244280

>>3244253

Again I'm not certain I get you. When you say that we trade clarity in one field for obfuscation in another, I assume you mean that when we learn more we realise how much more we have to learn? It's not like findings in one area actually mean we forget stuff in another area.


And for now, until it is shown otherwise, simulating the brain seems like a problem of sufficient scanning technologies and sufficient computational power. It is not unscientific to be optimistic. I am not making hard claims about this without backing them up, we are merely speculating.

>> No.3244285

>>3244213
Except we already have the end product of all the interactions that occurred through evolution. There's no need to brute force the development of an AI with natural selection of invented genes when we already have our own intelligence and genetics to study.

>> No.3244307
File: 177 KB, 1024x768, fractal_168.jpg

>>3244272
Again, you're venturing into magic wand territory in another form. These nanoseconds you speak of denote your skepticism at the finely tuned speed of our own neuronal interactions.

You're still in a situation where the knowledge you need is the same knowledge 'evolution' requires to create a computationally efficient consciousness.

You want this AI to somehow have an exponential rate of growth with respect to sentience, but there is no currently feasible manner in which this is accomplished.

On the other hand, the core debate centers around a deterministic algorithm versus an evolutionary algorithm.

If we accept that AI consciousness will require the evolutionary algorithm route, then we have to posit a computational device with, as I stated, the ability to perform a Graham's number of interactions for fitness to proceed towards consciousness.

Given the arbitrary nature of evolutionary algorithms, we have yet to even define the set of fitness measurements that would suffice to create this consciousness.

>> No.3244317

>>3244307
>magic wand territory
Yes because spontaneous combustion existed a long time ago.

>> No.3244318

>>3243830
If we could do that, then that would mean we had a complete understanding of how the human brain works, and if we had that knowledge then we could produce an electronic version of it and have REAL artificial intelligence. We don't have that knowledge yet, though, we've barely scratched the surface of how the brain works.

>> No.3244323

>>3244280
I guess quantum mechanics isn't your thing, then. No matter how wonderful our technology is we will never be able to perfectly encode the state of an atom.

We will always have, at best, a statistical approximation that has coupled uncertainties. Thus the tired old copy and paste thought experiments are, to my mind, ridiculous and overly optimistic.

>> No.3244333
File: 2 KB, 126x100, 1303345631697.jpg

>>3244307

>> No.3244337

>>3244323

So you think that the brain is not wholly deterministic, that it relies on some probabilistic component?

Well, you may be right. Right now, I'd say that modelling it to the atom would be okay. But if we find that some function of the brain would not function without a probabilistic element, then we'll all change our minds.

>> No.3244356

>>3244317
>only took 3.4 billion years of sentience

>> No.3244369

>>3244337
Sure, I can agree with that.

I basically think that the better we understand consciousness, the more we'll come up against the epistemological issues that show up in quantum mechanics, like uncertainty, the no-cloning theorem, etc.

That's not a reason to stop working at it, of course.

>> No.3244370

the brain is not an isolated machine, the entire human body is one giant machine and each part has its place in the consciousness. you can't go about replicating just the brain, it doesn't work like that.
the graphics card analogy is perfect, as the graphics card alone won't get you cowadooty. you need the entire computer

>> No.3244398

Create a computer,
But instead dont create any software for it.
Be sure to turn everything on but only supply power.
Eventually you will see random patterns take form and grow into an AI.

i eventually it will realize its existence and

>> No.3244416
File: 42 KB, 400x362, youmustknow2.jpg

>>3244398