
/sci/ - Science & Math



File: 676 KB, 1280x800, MachineManwithBird.jpg
No.3193426

With my computer science degree, I will find a way to transfer our consciousness into a machine that will last forever. In doing this, we solidify our flimsy existence. In doing this, we become immortal. In doing this, we give ourselves the ABILITY to know EVERYTHING... because we now have the time. We will see the history of humanity play out, and the departure to different worlds. Those who choose to be immortal will be the wisest of human civilization. They will come to us for answers and to hear about the human worlds of eons past.

We will be like gods.

>> No.3193435

Until our sun turns into a red giant, or a rogue black hole comes by, or a supernova knocks out all power/electronics with a powerful EMP, or the Milky Way and Andromeda collide and we are destroyed or flung out into the cold void.

>> No.3193451

>>3193435
WHERE IS YOUR GOD NOW, OP?!

>> No.3193468
File: 450 KB, 900x1393, 1307101866504.jpg

i love you OP

lets be the new gods
i will explore the galaxy, feels good man

>> No.3193484
File: 115 KB, 604x840, 1284995039508.jpg

>>3193435

We can fix that.

>>3193468
>>3193426

Wait the fuck up, I'm coming too.

>> No.3193490
File: 329 KB, 850x1100, pshift.jpg

I just read this too!

facepunch, right?

it's the only hope for the friggin world

>> No.3193493

>With my computer science degree, I will find a job as a code monkey working minimum wage!

>> No.3193494

>>3193484
welcome abroad

need moar pics like that

>> No.3193497

>>3193494
aboard maybe...

>> No.3193499
File: 42 KB, 481x358, 1307283584282.jpg

But.. what about the children?

>> No.3193505
File: 349 KB, 1400x990, 1279576381044.jpg

>>3193494

>> No.3193508
File: 141 KB, 500x438, 1305425520259.jpg

>>3193505

>> No.3193518
File: 27 KB, 500x360, 1303808299436.jpg

>>3193426

>> No.3193521
File: 331 KB, 500x489, 2578078742288165769.jpeg___1_500_1_500_cb94de6a_.png

>> No.3193530
File: 275 KB, 734x1020, Infinity___Citizens_by_elpinoy.jpg

>> No.3193546

these are some nice pics anon

>> No.3193562
File: 94 KB, 821x973, Portrait_by_hellcorpceo.jpg

I've still got more.

>> No.3193567
File: 553 KB, 1760x990, Space_Vagabonds__Battlesuit_by_ukitakumuki.jpg

>>3193505

Rest of this set

>> No.3193571
File: 350 KB, 1400x990, Space_Vagabonds__Nighthound_by_ukitakumuki.jpg

>>3193567

>> No.3193574
File: 426 KB, 1400x990, 1284768307136.jpg

>>3193571

>> No.3193578
File: 840 KB, 1600x862, 1284767794954.jpg

I like this one a lot.

>> No.3193583
File: 178 KB, 1000x812, 1284768971963.jpg

Future Gladiatorial Combat?

>> No.3193587
File: 48 KB, 638x825, 1296076295598.jpg

>> No.3193592

>>3193578

I came

>> No.3193594
File: 84 KB, 750x600, 1298941738304.jpg

This one is just funny.

>> No.3193595

My only concern is cost... 40 years from now I may still be earning only an average income. How will I be able to afford the body-transfer? I don't want to be stuck in this meatbag for a finite 70 years...

>> No.3193606

>>3193595
Save during your whole life (the meatbagy life)

>> No.3193607
File: 177 KB, 969x1514, Transmetropolitan #32 - Page 23.jpg

And then there is this dystopian clusterfuck.

>> No.3193608

>>3193595

Many things will be lost in the transfer. You will no longer be subject to human-like emotions. Your thinking will be clear and in alignment with your higher-order motivation.

The transfer isn't a guarantee of a happy infinite life; rather, it's a guarantee of existence in a solemn state.

Many people will reject this; the first couple hundred transfers will be done for free in the name of experimentation.

>> No.3193611

>>3193468
>>3193484
>>3193508

Thanks guys. I am digging the robot-man art. Please post 'em if you got 'em.

>> No.3193618
File: 12 KB, 223x226, images (3).jpg

I've run dry.

>> No.3193619

>>3193608
Are we talking about a machine body built to sustain a biological brain inside it? Or a complete transfer of neural patterns to a new substrate? The former seems more likely in my lifetime, and if so, we'd still be producing emotion-causing chemicals, wouldn't we?
I think I'd feel better about my brain being hooked up to a machine I can control, as opposed to having 'me' sucked out of this meat-shell and thrown into a new body like a file on a flash drive.

>>3193606

So what percentage will be able to afford it? It's great if we can do it, but if only the top 1% can afford it, then we just made immortal bankers and politicians...

>> No.3193636

>>3193619

The problem with your brain is that it will decay.

Hyperphosphorylation of tau proteins in your neurons' microtubules will give you the alzies. Your thought process becoming a file on a flash drive is the way to go.

>> No.3193648

>>3193619

http://www.ted.com/talks/aubrey_de_grey_says_we_can_avoid_aging.html

This is more about biological immortality, but he mentions that as the treatments get better, the prices for the older ones will drop, the same way it works for electronics now.

Basically the rich will pay more to beta-test immortality.

>> No.3193649

FUCK YES, save us OP, save us from these miserable lives. Use that degree as a dowsing-rod and lead us to immortality!
Give us our world where death is impossible, but thinking obsolete, a world in which we will live in holodecks for eternity, endlessly manipulating our brains to release as much dopamine and serotonin as possible, endless brainmasturbation will be ours. Never to worry, never to be needed, never a true feeling. That will be bliss.

>> No.3193650

>>3193426
OP, make sure you remember the Ship of Theseus paradox on the way.

Downloading your consciousness into a machine won't make it so YOU are in the machine. There is only a simulated copy there. You would still be sitting there with the crazy helmet on wondering "is it over already?"

>> No.3193652

>>3193636
Is there any way supplements will stave off said rotting, perhaps indefinitely, a la Kurzweil? It appears that brain/machine hybridization will be possible far sooner than direct consciousness transfer, hence my concern.
>what about a brain in a bot, with nanobots in the brain repairing it?

>> No.3193653

>>3193608
Do you mean this as a joke? By far the easiest way to preserve a brain is direct 1-to-1 substitution of neurons while preserving the potentials on the axons, so that you keep your memory as well as the way you think - perhaps with the option of multiple channels and frequency-division multiplexing across several channels, so that you can emulate relatively global (large-scale) effects like hormones. This is solely a hardware issue with the main emphasis on manufacturing, so more advanced manufacturing techniques are probably required, above all self-assembly (since it would allow an exponential growth rate in the early stages)...
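
A rough Python sketch of the frequency-division multiplexing part, just to show the trick (carrier frequencies, cutoffs and signals are made up for illustration; real axon potentials are obviously nothing like clean sine waves):

```python
# Toy frequency-division multiplexing: two slow signals share one channel by
# riding on different carrier frequencies; each is recovered by mixing back
# down with its own carrier and low-pass filtering the result.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 10_000                            # sample rate in Hz (arbitrary)
t = np.arange(0, 1.0, 1 / fs)          # one second of samples

# two baseband "messages" standing in for whatever the channels would carry
msg_a = np.sin(2 * np.pi * 5 * t)
msg_b = 0.5 * np.sin(2 * np.pi * 7 * t)

# modulate onto well-separated carriers and sum onto the shared channel
f_a, f_b = 800, 2_000                  # carrier frequencies in Hz
channel = msg_a * np.cos(2 * np.pi * f_a * t) + msg_b * np.cos(2 * np.pi * f_b * t)

def recover(sig, carrier_hz, cutoff_hz=50):
    """Mix down with the known carrier, then keep only the low frequencies."""
    mixed = sig * np.cos(2 * np.pi * carrier_hz * t)
    b, a = butter(4, cutoff_hz / (fs / 2))
    return 2 * filtfilt(b, a, mixed)   # factor of 2 from the cos*cos identity

print("max error A:", np.max(np.abs(recover(channel, f_a) - msg_a)))
print("max error B:", np.max(np.abs(recover(channel, f_b) - msg_b)))
```

The cross-terms land well above the low-pass cutoff, which is the whole point: many signals, one physical channel.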

Anyway I like the pictures. Do you have more of the future-urban ones? The ones that feel so commercial and overcrowded. Many thanks.

>> No.3193660

You are never invincible without diversity - only if you split your consciousness (which will no longer be YOUR consciousness once you split) and make each one a little different from the rest. We cannot make each one live too long, because their memories/feelings will cause corruption. We have to pass on most of our knowledge without the 'human error'. We need to continue this for as long as we exist to create 'invincibility'.

Wait, we already live in a society like this. We are invincible.

>> No.3193665
File: 94 KB, 342x366, 1307381946281.png

>>3193660
>You are never invincible without diversity
citation needed

>> No.3193667

>>3193426
What convinces you that this isn't already what is happening?

>> No.3193670
File: 197 KB, 1096x555, 280807-0242-azul.jpg

>>3193653

That's a fun one actually, it's art for the rulebooks of a miniatures game called Infinity, by a company called Corvus Belli. The story is set 175 years in the future, in what has basically become a cyberpunk society.

http://www.infinitythegame.com/infinity/en/

>> No.3193672

>>3193660

Our feelings, our memories, however complex, are based on physical cells and the interactions between them.

The only way I see us living forever, is if we somehow develop a technology that slowly replaces our organic neurons with analogous synthetic, non-decaying machine counterparts in a way that it never interrupts our brain waves.

Its probably going to be hard as fuck, but I don't really see anything impossible with that.

>> No.3193674

>>3193670

We talk about it in /tg/ a lot.

>> No.3193680

>>3193653

Lol, I was typing this >>3193672 when you posted your comment. Awesome to see that people agree with me - never really read anything about this before.

>> No.3193682

>>3193650

The paradox does not apply :3

The paradox questions whether the ship could still be named the same after the parts are replaced. HOWEVER the crew, the people operating the ship, or... my consciousness, still remains.

I could no more call myself a human than that ship could still hold its name.

>> No.3193691

>>3193682

Too bad that metaphor is dicks.

Your consciousness, your memories, your feelings, are all based on physical cells. Putting perfect copies of those cells somewhere else doesn't really mean that you're transferring the original cells there, just copying them.

Same thing for your idea of "downloading your consciousness" into a machine. The organic you would still be sitting there chillin, while a perfect copy of you sits there, digitized, immortal. Imagine how butthurt you would be in that situation. Not to mention that they could simply decide to kill you once they have the digital version, which means that your approach could basically count as suicide.

>> No.3193692

>>3193672
Perception of time speeds up as we live longer. If, theoretically, we live forever, we will experience our much later life as a flash. That wouldn't necessarily be a bad thing, because one day we could watch a whole star be born and die in the same "day". But living on a slower scale would be painful and meaningless.

>> No.3193695

>>3193682
Then you're not the same person from day to day either, as your cells and raw materials are replaced and exchanged with the environment.

A material definition of identity is pointless. Maybe try a functional one. Is there a thing with the same memories? The same capacities? The same dynamics and potential for learning? Then it's "you", get over it. There's more than one copy? They're copies of you, distinct individuals that will begin diverging as soon as the copying is done.

>> No.3193697
File: 167 KB, 1132x922, Woman-Cyborg-59669.jpg

Back, had to go take a very unpleasant dump.

I'll be happy to ditch this meat-shell any day now...

>> No.3193698

>>3193691

Consciousness remains even though your brain cells die and are replaced over the course of your life.

Ergo, if you slowly replace brain cells with artificial copies in such a way that it preserves your consciousness, your consciousness could then be digitized.

>> No.3193703

>>3193698

Exactly, that is the method I and >>3193653 mentioned. That is clearly the best approach.

>> No.3193709

>>3193698
This is less troubling, but is really no different from making a digital copy and obliterating the biological one. It's hidden under a gradual process instead of a sudden one.

>> No.3193710
File: 49 KB, 500x656, jodie-foster-cyborg-34128.jpg

>>3193698
So could we make artificial neuron-replacements that could be injected into a brain dose by dose in the form of nanites?

>> No.3193714

>>3193665
Well, everything makes mistakes sometimes. Perhaps one day your memory storage will crash or fail. You can make a backup, ofc, but copying a consciousness means making a close but different person, since both of you can't experience the same thing at the same time.

>> No.3193720

>>3193714
I'll give you that. We'd need to have built-in picosecond digital backups of our minds, redundancy, with maybe a wireless link to a storage unit somewhere.
>would 'you' experience that storage unit while 'you' were out and about?

>> No.3193721

>>3193709

But that difference is extremely important. The essence of who you are is based on brain waves. The gradual way is the only way, since it's the only possible method that doesn't interrupt or alter those waves. A single neuron dying at a time is a natural process, so replacing them one by one is too.

>>3193710

That is a pretty cool idea. But as I see it, it doesn't even depend on the development of nanite technology (I'm thinking Stargate here). The only crucial part is creating a perfect synthetic copy of a neuron that would be able to interact with its organic counterparts normally.

>> No.3193723

>>3193709

You ARE the copy, in the same way You at 10 years old and You at 60 years old are not different people, despite the fact that all the brain cells you had at 10 are dead and gone.

>>3193714
>>3193720

The answer to both of these is to have your actual mind in a redundantly backed-up mainframe somewhere safe and just remote-control robotic bodies through some kind of link, probably related to entanglement.

>> No.3193731

>>3193723

And yes, I am paranoid. I'm Properly Paranoid.

>> No.3193738

>>3193721
>But that difference is extremely important. The essence of who you are is based on brain waves. The gradual way is the only way, since it's the only possible method that doesn't interrupt or alter those waves. A single neuron dying at a time is a natural process, so replacing it one by one also is.
Your consciousness dies every time you go to sleep. Even worse when you're under general anaesthetic.

I don't need identity to be continuous. If I am flash-frozen, my brain waves disappear. And if I am then brought back, they reappear. Those brain dynamics ARE my mind, they are "me", and for that period of time my mind didn't exist - anywhere.

Even worse, make a copy of the flash-frozen me, destroy the original, and then wake up the copy. Is it "me"?

Answer: I don't give a fuck. Functionally there's no difference.

>> No.3193746

>>3193720
Hard to say. We know too little about our own brain to pin down consciousness. We don't know if it just occurs, or if it needs to be activated somehow. Theoretically, tho, if we give the mass backup only read and write abilities, it shouldn't be sentient.

Who knows maybe I'm wrong.

>> No.3193747

>>3193723

You don't really need to create artificial memory storage, unless you're thinking about augmentation.

Creating synthetic neurons that mimic organic ones would already take care of that. Remember that we're just replacing neurons one by one, but maintaining the architecture of the brain.

We wouldn't even need to know how exactly the brain works, since if we do everything right, our replaced version would work just like the other one

>> No.3193753

>>3193747
>We wouldn't even need to know how exactly the brain works
You sure as hell need to know everything there is to know about how *neurons* work. Though I agree that you don't need to have an understanding of the emergent dynamics.

>> No.3193754

>build robot body capable of reproducing my brainwaves/neurons
>turn it on
>talk to myself like a boss
Any reason I couldn't do this someday? I think it'd be incredibly cool, giving you an idea of what people see when they talk to you.

>> No.3193757

>>3193754
You could do it.

Now here's the kicker: Are you willing to treat that robot with just as much respect as you demand for yourself?

>> No.3193759
File: 69 KB, 567x366, miles_dyson.jpg

>>3193746
>mfw all those non-sentient backups of living people network and become self-aware

>> No.3193763

>>3193759
Who said they had CPU cycles? It's like a simulation savestate file, not a running simulation.

>> No.3193768

>>3193757
Without a second thought. If you can ask for rights and respect, you're self-aware enough to deserve them, imo.
>part of me would wonder if it's just running a 'trick humans' program without knowing what it really means, but the anthropomorphic shape of the robot would make it hard to not see as alive, to me.

>> No.3193772

>>3193738

Enjoy dying and having another person take over your life then.

And no, your consciousness doesn't "die every time you go to sleep". Brain activity only fully ceases when you die.

>>3193753

Exactly. The emergent dynamics and how all those insane networks with billions and billions of neurons interact with each other.

It would be cool if we managed to develop some way to automatically, but slowly, replace neurons with the synthetic ones. Perhaps the nanites idea somebody else mentioned

>> No.3193775

>>3193763
Good point. Lots of data there in storage though, couldn't a rogue AI access it? Or even a malevolent human bent on making a Skynet-type?

>> No.3193779

>>3193747

Of COURSE I'm thinking of augmentation.

>> No.3193782

>>3193772
>Enjoy dying and having another person take over your life then.
Your distinction is empty. It's a person with all the same memories, thoughts, desires, personality, etc. I don't see why you cast a shadow of alienness over it. It's not "another person", like you're being possessed or something.

>> No.3193788

>>3193772
>And no, your consciousness doesn't "die every time you go to sleep". Brain activity only fully ceases when you die.
Your requirement of "fully ceases" is arbitrary. Your consciousness is gone when you are unconscious. But it comes back again when you wake up or are revived - because the physical brain structure that produces that consciousness is still present.

>> No.3193791

Wait... What about personality?

>> No.3193800

>>3193721
I'm glad I see someone else with the exact same idea. (I'm the one with No.3193653)

I think it is realistic to make such "injections". The way I imagine it is by injecting cores of those new neurons, and the new neuron cores will just tend to stick to bio-neurons and build their new axons and dendrites on top of the bio-dendrites. That way there's no problem with seeking out the neuron paths. How to build it inside the brain? As an example, it might be done by injecting pieces of fibrillar structures (basically equivalents of axons - wires). They can assemble only on given spots of either core neurons or activated artificial axons (those that are already connected to a core).
That way the whole thing will build itself alongside the organic one.

How do we build all of these semi-fabricated structures? Well, that's why I wrote about the exponential manufacturing in my previous post (No.3193653). It can be achieved in a reasonable time and with a reasonable budget (and with reasonably advanced technology - basically, today we might already have all the sub-tools and physics knowledge needed for it).

>> No.3193801

>>3193791
It's just a way of describing the qualities of the mind. It's still there.

>> No.3193826

>>3193800
Casualfag here, but is there a possibility of, once the mechanical neurons are in place, expanding those neural connections? Unless I'm mistaken, moar connections=moar smarts.
>we could all become iq500 brilliant

>> No.3193829

>>3193791
It changes with time/environment anyway. Won't matter if you are going to live forever.

>> No.3193830

Here's the question. What is the difference between these two scenarios:

A) Some kind of nano-bots take a bunch of carbon atoms and build an exact replica of my body, including the brain, atom for atom, in one day.
B) My own cells split and replace themselves over a ten year period and I am a semi-accurate copy of the thing that I was ten years ago.

Now, in scenario A, the copy is EXACT. In scenario B, the copy is very lossy and degraded. So, why is it, then, that I would expect for my consciousness to NOT be transferred to the new copy in scenario A, while in scenario B, I would expect that it IS transferred to the new copy?

>> No.3193833

>>3193830
The key flaw in this whole thread is that of a consciousness being "transferred". A consciousness is not a thing with concrete identity and continuous existence - it's just the dynamics resulting from a working brain.

Asking whether a consciousness has been "transferred" hides unwarranted assumptions about consciousness. tl;dr consciousness is just an abstraction of the dynamics of brains

>> No.3193834

>>3193830
we've been doing scenario b for a really long time.
scenario a is untested
(but promising)

>> No.3193835

>>3193763
It could live off yours stealthily. You might never even notice with your gigantic network of you.

>> No.3193838

>>3193834
It's true that A has never been done, but given current knowledge it would be rather ridiculous to say it wouldn't work. If I make a perfect copy of a car, do you worry about whether that perfect copy counts as a car? Do you worry about whether it is the "same" car? (I hold that that last question is meaningless in the sense it is usually brought up).

>> No.3193839

>>3193835
Data files are not programs.

>> No.3193840

>>3193833
But that's exactly the point of the question being asked. Why is your consciousness "transferred" to your new body as it changes day by day, minute by minute? You are constantly rebuilding yourself, and somehow, your consciousness "knows" how to stay confined within the new copy each time.

If we can answer that, then we can answer the question of whether or not what OP wants is possible.

>> No.3193845

>>3193830
Because your consciousness is not a constant, it changes with your brain.

>> No.3193846

>>3193838
Right. Not saying it won't work, just that we're not sure what would happen
>let's build it and turn it on FOR SCIENCE!!

>> No.3193849

The old copy of you will still remain in a body and will die.

tough titties code monkey. now stop thinking and write something useful for us engineers

>> No.3193853

>>3193435

You must be one of those retards who still believe chemical rockets are the final word in spacecraft propulsion.

>> No.3193854

Maybe the dualists got it right

>> No.3193858

>>3193840
> Why is your consciousness "transferred" to your new body as it changes day by day, minute by minute?
It's not - it's an illusion. Your brain produces a similar consciousness today as it did yesterday, but only because your brain is only slightly changed.
>You are constantly rebuilding yourself, and somehow, your consciousness "knows" how to stay confined within the new copy each time.
Wow, you REALLY are into this illusion. There is no such uniqueness or continuity of consciousness. Your mind is just the dynamics of a running brain. Copy a brain, and the two will produce identical but separate phenomena of consciousness, and then diverge as they have different experiences. Scan a brain, destroy it, and make an exact replica from the data, and a consciousness that is functionally identical will exist.

And that functional identity is all that matters.

>> No.3193861

>>3193853
>chemical rockets = blood-letting and leeches

>> No.3193864

>>3193854
We'll know once we fire up the first faithful cellular-level simulation of a physical brain. If THAT doesn't work, then there's something missing from the model.

>> No.3193869

>>3193826
That might seem to be a fairly big problem - which indeed it is. But there are some solutions to it. If neurons are far away from each other and are making a connection, this can be done by re-directing signals - basically using other neurons as proxies. This indeed requires some internal optimization processes to be present, and it gets tough. However, if it's rare and basically present only during the formation of the brain, it means that you might not need it for learning new things (only when forming the brain, which you have already passed after getting born, or perhaps during puberty? not sure). Anyway, proxies will probably do it.

Short connections? They can be achieved by using electromagnetic radiation and filters. Assuming the substance in the brain has different absorption properties for different optical wavelengths, it is possible to use the same wavelength many times (since if neurons are far away they can't detect each other's signals, hence can't distort each other's signals).

Different wavelengths would be used for different ranges, and it would also require the cores to have the ability to add more adjustable detectors (most likely based on quantum well photodetectors); they can be used for emission as well as reception.

>> No.3193879

>>3193858
> There is no such uniqueness or continuity of consciousness.
That's speculation and very nihilistic. If you really believed that, you'd have absolutely no reason to continue living.

>> No.3193885

>>3193869
...not ashamed to admit, much of that went over my head. I did get one thing though: children.
If we concentrate on expanding the neural connections of a growing toddler using techno augmentation, we could create a super-genius.
>inb4 moralfags, we sacrifice it for SCIENCE

>> No.3193886

>>3193839
What I'm saying is that we don't even know if a consciousness can be stored as a file, because it tends to change with your brain.

Storing one could mean simulating it at an extremely slow rate. It still lives; the only difference when it wakes up is a few seconds.

>> No.3193889

>>3193709
If the dualists are wrong then we are basically copies of our former selves and slowly replacing ourselves with synthetics only serves to preserve the illusion of continuity of consciousness.

>> No.3193890

>>3193879
>That's speculation and very nihilistic. If you really believed that, you'd have absolutely no reason to continue living.
No. I like living. I just don't hold any illusions about my consciousness being a thing that exists independent of my physical organization.

Just what exactly does YOUR desire to live hinge on, since you suggest that I shouldn't have one? You don't want to live at all if your consciousness doesn't have an existence independent of your physical organization?

>> No.3193894

>>3193889
This. It's a comforting lie for dualists - it preserves the illusion that consciousness is a concrete thing with independent existence.

Presuming dualists are wrong of course. If THIS happens, then we just start the scientific investigation of what's missing from the model (the "soul"):
>>3193864

>> No.3193895

>>3193879

Maybe it's the only reason he can continue to keep living

>> No.3193896

>>3193879
Your consciousness is not unique but your experiences and memories are.

>> No.3193901
File: 2.38 MB, 3000x1500, 3000_CC_BY-NC.jpg

Even if you copy a brain exactly into a computer, it will never be you.

It will be a different you.

What you have to do is implant your own brain into a robot. But your brain cells would still die. So you would have to do a constant rotation of new brain cells to keep it up and running. Maybe if they create a nano device that constantly replenishes them.

That's the only way I see it working.

>> No.3193902

>>3193896
... unless I make a perfect copy of you. People who protest about that scenario are having the problem with "uniqueness" that I was referring to.

>> No.3193909

>>3193890
> You don't want to live at all if your consciousness doesn't have an existence independent of your physical organization?
It is worse than that... if your consciousness is nothing more than a byproduct of a certain organization of atoms, then none of us ARE "living" any more than a rock is living.

>> No.3193910

>>3193901
Or you just abandon the unnecessary requirement that your consciousness be continuous. I don't worry every night about my consciousness dying as I fall asleep, or when I'm put under for an operation. What's the difference between waking up in a biological body or a synthetic one?

>> No.3193916

>>3193896
Feels silly to try talking about consciousness as something separate from experience and memories. They are integral to each other.

>> No.3193917

>>3193896
> your experiences and memories are.
Why? If we're already supposing here that we can make exact copies of brains, then we can just as easily put the same memories into the new brain.

>> No.3193924

>>3193910
I know, I was just saying.

Some people actually want to do that.
And I'm sure when such a procedure becomes mainstream then pretty much everyone will do it. When they are born maybe they will immediately be put into a robot body. ^^

I want to die some day but I think 70 years is a little short.

>> No.3193925

>>3193909
Rocks have very boring dynamics. They are not "alive". You and I are alive. WE are interesting (to us, anyway).

When a paramecium divides, should we have an existential crisis about what happened to its awareness? And paramecia ARE aware of their surroundings, if on a very basic level.

And slime molds, which are just a colony of single-celled organisms, exhibit memory.
http://www.nature.com/news/2008/080123/full/451385a.html#B2

Should I worry about the "identity" of the "consciousness" of the slime mold, if I divide it into two colonies? They aren't sapient or even self-aware, to be sure, but they do exhibit awareness and choice.

>> No.3193933

>>3193901
your functionally identical self is all we seek. If I'm uploaded into a computer, my physical body will still live, but I'll either kill myself or have my machine counterpart kill me. I won't worry because something exactly identical to me will take my place.

>> No.3193936

>>3193933
Or just let yourself (the original you) decide if he wants to live. Killing him unnecessarily seems drastic. He's still a sapient being.

I'm fine with it if you're OK with the other side: Being the guy who gets scanned and left behind. Would you kill yourself? Why?

>> No.3193938

>>3193933
>your functionally identical self is all we seek.

lol. thats retarded.
why would you want that?
so it can pass on information that nobody needs in an age where we already know everything?

>> No.3193940

>>3193925
You're deflecting the issue by talking about whether or not we need to "worry" about living things.

Your point is (was) that there is no such thing as "consciousness" in the sense that some "soul" or something equivalent takes up residence in a living thing. You believe it's all just EM waves and other physical phenomena.

If that's true, then I could build some sort of electrical circuit that produces all kinds of random EM noise, tape it to a rock, and you'd claim that the rock was just as "alive" as you and I.

>> No.3193944

>>3193901
That's my view too. I know my brain holds 'me', it's one of the few things about my consciousness I DO know. If I can put that organ in a box, and that box in a body, then I know I'm still me, despite the change of vehicle. Technology can maintain the last vestige of my biology, at least until such time as direct transfer to data is commonplace.

If I can be a brain in a 'bot for the next 500 years, I'll still be around long enough for the next stage in mind-transfer.

>> No.3193946

>>3193936
>I'm fine with it if you're OK with the other side: Being the guy who gets scanned and left behind. Would you kill yourself? Why?
Not that guy, but I could see someone adopting a mindset where they have no problem with this. It's pretty far from how most people think right now though. And personally, I think unnecessary destruction of sapients by derivative sapients is unethical. It's like your children being allowed to murder you once you have retired and are also deemed unnecessary by your children.

>> No.3193952

>>3193936
I would kill myself because the longer I live the more my machine counterpart will be different from me, thus it won't be my consciousness that will live forever.

>> No.3193955

>>3193830
Let's for a second assume the consciousness will be "transferred" in both cases. How are you going to do it? Say you have 10^13 micro-bots. What are they supposed to do? If they work atom-to-atom or even molecule-to-molecule, it will take such a long time that the body will decay. Which can be avoided, but! In order to preserve the personality it is essential to preserve the memories and the "settings" of the neurons (the chemical balance on the axons). There comes the problem. Midway through, the person will die - which is alright, we are just about to resurrect them - but the ions on the axons will get partially lost, and so will the information carried by them. The problem is not to assemble it but to measure it.

The neuron-to-neuron substitution is good because there is a longer minimal building time and you only need to substitute the necessary components (don't bother with things like the skull). Also, it will keep you from getting senile due to ageing, so instead of having 60 years of good mental health you can have it pretty much up to the point when all the organs are failing (100 years), so you have more time to regenerate or substitute the rest of your body.

The B method (neuron to neuron) is just easier when it comes to the practical side. In my last 3 posts I have given a brief overview of how you can create the artificial brain substitute (No.3193653), apply the substitute (No.3193800) and expand it (No.3193869). I know it's very vague, but it shows that all the steps are achievable in a reachably far future. If you show your proposal of A in a similar way (manufacture, apply, expand - show it can learn) then I guess it will be much clearer - not saying it's bad, I just don't know how to do your A.

>> No.3193960

>>3193940
Aside: That post sounded more religious than I intended. I probably should have avoided the word "soul."

However, if it's religious to believe that consciousness exists beyond the simple physics of atoms and natural forces, then I suppose you can label me as religious.

>> No.3193961

I don't see the urge to pass on yourself if it's not "you".

He might deviate from your ways in unexpected ways, and you can never know if the computer "you" is actually conscious, since he is inside a computer. It might just be a simulation of consciousness, and you can never know if he actually is self-aware like an organic human would be.

>> No.3193962

>>3193940
>If that's true, then I could build some sort of electrical circuit that produces all kinds of random EM noise, tape it to a rock, and you'd claim that the rock was just as "alive" as you and I.
Can I have a conversation with this rock? Random noise is not life either.

Life is a very special phenomenon - especially intelligent life. It rides the line between utter chaos and crystalline order, it responds and adapts to outside information, and it even produce the high-level behaviors and capabilities of thought, reason, self-awareness, etc.

Life is valuable. In fact, the well-being of intelligent life is *all* I care about. That's an arbitrary value, like all others, but it's the one that makes me happy.

>> No.3193968

>>3193938
Because there is no way to have your actual self transferred into a machine. Consciousness is a product of your brain, you can only upload a functionally identical self into a machine.

>> No.3193969

>>3193952
>I would kill myself because the longer I live the more my machine counterpart will be different from me, thus it won't be my consciousness that will live forever.
What
Explain this, it makes no sense. Once the copy has been made, what possible influence does the biological source have on the copy? You could ship him to Pluto and you would be none the wiser unless you checked.

>> No.3193972
File: 333 KB, 420x315, applause.gif

>>3193962
>Life is a very special phenomenon - especially intelligent life. It rides the line between utter chaos and crystalline order, it responds and adapts to outside information, and it even produce the high-level behaviors and capabilities of thought, reason, self-awareness, etc.
>Life is valuable. In fact, the well-being of intelligent life is *all* I care about. That's an arbitrary value, like all others, but it's the one that makes me happy.

Get the fuck over here and brofist me, motherfucker.

>> No.3193976
File: 29 KB, 468x458, internet-bro-fist.jpg

>>3193972

>> No.3193981

>>3193962
> Can I have a conversation with this rock? Random noise is not life either.
Of course you can. You can stand there and talk to it and it will talk back, in the random noise that it produces. If it makes you feel better, I'll add a microphone and some logic so that the random noise will change depending on what you say to it. Then will it be alive?

>> No.3193982

>>3193976
Incidentally, I just put >Life is a very special phenomenon - especially intelligent life. It rides the line between utter chaos and crystalline order, it responds and adapts to outside information, and it even produce the high-level behaviors and capabilities of thought, reason, self-awareness, etc.
as my FB status. Had to save it.

>> No.3193984

>>3193981
>I'll add
experiment tainted

>> No.3193988

>>3193981
> If it makes you feel better, I'll add a microphone and some logic so thatthe random noise will change depending on what you say to it. Then will it be alive?
If it has the necessary dynamics characteristic of life, then yes. If it's just "random EM noise" with no correlation with or adaptation to the environment, then no. Your tone is derogatory, but that's not an argument.

Perhaps we should step back. Is this about a belief you think I have and should change, or about a belief you have and want to communicate? If so, what is it?

>> No.3193990

>>3193981
Follow-up... This "arbitrary value:"
> Life is valuable. In fact, the well-being of intelligent life is *all* I care about. That's an arbitrary value, like all others, but it's the one that makes me happy.
... is just a coping mechanism, really -- it's a way to pretend that your nihilism isn't really nihilistic.

>> No.3193991

>>3193982
Then it should say "produces", not "produce" - I made a spelling error.

>> No.3193997

>>3193969
Well, if your physical body doesn't die then there are two copies of you. The two copies would eventually become different. If I want my consciousness to live forever, the closest thing is to keep that consciousness as singular as possible. Keeping them completely as one is impossible, so killing one right after the transition would be the next best thing, since it would cause the least difference between the two of us.

If you just live your life, you would essentially only be running a simulation of what your consciousness COULD be.

>> No.3193998

>>3193988
> Is this about a belief you think I have and should change
No, not at all, and that's a good point.
What we're debating here is maybe the most important question that can be debated. And yes, I'm not foolish enough to think that we're the first to discuss it. This is the root of all philosophy.

The example of talking to the rock, to me, is where your view stops making sense. I've been down the road of looking at life from that perspective, and it has a very tragic conclusion.

>> No.3194004

>>3193990
I'm not sure what you mean by nihilism. I value human life, and more generally, intelligent life, because it matches what makes me happy. I love being intelligent, learning, and helping others do the same. I care about the well-being of intelligent beings, and all life in general, though to a lesser degree. Does it matter that that motivation exists just because of my human biology?

Why do you need me to believe that life is meaningless, just because I say the meaning comes from biology? I'm not going to assume the qualities of the strawman you're forcing on me. Why should I?

>> No.3194015
File: 5 KB, 151x153, velmaold.jpg

>>3193990
>lifeforms have no reason to exist other than continuing the unbroken thread of life
>I'm_okay_with_this.jpg

>> No.3194017

>>3193998
>The example of talking to the rock, to me, is where your view stops making sense. I've been down the road of looking at life from that perspective, and it has a very tragic conclusion.
Then I would suggest that you have not communicated these thoughts, and only imagine that you have done so.

What are you trying to say with the rock? It seemed to me that you were just trying to hack away at my conception of life, for reasons I can't clearly see.

>> No.3194022

>>3194004
You can't say that "I like only intelligent life" with a straight face.

That rock might have incomprehensible intellect. Deal with it.

That rock is more important than you.

>> No.3194027

>>3193997
Your weird illusions have caused you to murder a sapient being. If you were cloned, would you kill the clone to protect your mental notion of being unique?

>> No.3194029

>>3194004
I'd go a step further in fact, stating that the very way our biology pushes us forward into surviving is in itself an incredible and amazing thing, regardless of some mythic creator, some intangible spirit, or some mystic meaning to life.

We are. That's pretty fucking cool. Let's keep being.

>> No.3194033

>>3194022
>You cant say that "I like only intelligent life" with a straight face.
More precisely, I said that I like life, and intelligent life even more so.

>That rock might have incomprehensible intellect. Deal with it.
Unlikely, given how simple and unresponsive all the observable dynamics of the rock are.

>That rock is more important than you.
Only in your value system, maybe. I care about people more than rocks. Do you?

>> No.3194049

>>3194017
I'm defending my point of view, not attacking yours. Mine is the one that was attacked.

I posed the experiment involving "transferring" a consciousness. Your response was that there is no such thing to transfer. I am seeking to show otherwise, although I know going in that it is almost certain that there is no way to prove either point of view.

However, the example of the rock is meant to illustrate that you don't have any sort of consistent definition of life (that you've stated). You've said that it makes choices, has conversations, responds to its environment, etc... and I'm showing that it's trivial to create physical objects that can do all those things. So, from your point of view, there's really nothing special about life.

The only thing special that *I* see about life is that it *must* be beyond the simple characteristics that you've assigned to it. It *must* be more than a byproduct of the organization of matter.

>> No.3194075

>>3194027
It's really hard arguing with a straw man, but I'll try.

Obviously that won't be the case, because nobody's gonna clone themselves to be unique.

A more correct way would be cloning a physically superior me, where my goal is to be physically superior AND unique. Ofc I would have to give up my current body, but my current mind would make the exact same decisions as the other me. Is there any point in keeping both? Especially since by keeping both you lose the unity between them.

>> No.3194079
File: 50 KB, 301x293, 1296497189711.jpg

>This thread
>mfw when "it's just a copy waaaah" people are still here

>> No.3194083

>>3194049
>I posed the experiment involving "transferring" a consciousness. Your response was that there is no such thing to transfer. I am seeking to show otherwise, although I know going in that it is almost certain that there is no way to prove either point of view.
Dualism can be checked in theory, but we don't have the capacity yet. If dualism is right, then you may be right. But there's no evidence for it.
>>3193864

>However, the example of the rock is meant to illustrate that you don't have any sort of consistent definition of life (that you've stated). >You've said that it makes choices, has conversations, responds to its environment, etc... and I'm showing that it's trivial to create physical objects that can do all those things.
No, it's not. Do it and you'll be world-famous for creating the first fully synthetic life.
>So, from your point of view, there's really nothing special about life.
Does not follow, and my statements are quite the opposite.

>The only thing special that *I* see about life is that it *must* be beyond the simple characteristics that you've assigned to it. It *must* be more than a byproduct of the organization of matter.
AHHH. Your "special" involves consciousness being a thing entirely independent of matter. That makes more sense now.

You know what? It might be the case. It's possible that intelligent consciousness involves the presence and organization of materials currently unknown to science, and outside the scope of what we currently call "normal" matter.

But then we've just pushed back the goalposts. If it exists, that "soul" is made of *something*, and can be studied and understood just like anything else. And that, sir, will not make life any less special than it was before.

>> No.3194086
File: 20 KB, 170x153, murphey.gif

Hey, rock.
.....
Rock?
....
...if you...DON'T want me to make you into concrete.....say something.
....
....
....alright then.

>> No.3194090

>>3194075
> Is there any point of keeping both? Especially by keeping both you lose the unity between them.
What is this "unity" you talk about? I really don't understand your train of thought.

If you create a derivative You 2.0, why should the old you be killed? What is gained by destroying that sapient being?

>> No.3194108

>>3194083
>Dualism can be checked in theory
how so?

>> No.3194119

>>3194108
see
>>3193864

Though you really have to also make sure you're not just missing some critical cellular processes (you'd have to take the simulation to a more fundamental level). Eventually the evidence for non-atomic processes being involved would be substantial (hypothetically).

>> No.3194132
File: 3 KB, 300x237, mopimoth.gif

ITT:

>It's just a copy

Sorry if I sound a lot like His Metamajesty Ray Kurzweil, but he's right on that account: There is only continuity in the 'pattern' in your head.

As a sidenote I just finished reading The Singularity is Near. I expected it to be as crazy religious-esque as everybody said, but it was, eh, just a pretty cool story. It's a good collection of sources though, if a touch outdated.

>> No.3194137
File: 17 KB, 400x300, quinn.jpg

>itt

>> No.3194139

>>3194083
Yes, we're having the argument over dualism, basically. Whether or not it can be tested is not certain, though. Simulating a brain might not prove anything.

> > So, from your point of view, there's really nothing special about life.
> Does not follow, and my statements are quite the opposite.
Well, your posts are very reasonable, but I am really struck by the fact that you don't see this. In fact, your outlook on the whole argument is so consistent that this really sticks out.

You can watch a simulation like Conway's "Game of Life" all day long and see some amazingly complex interactions between things that look incredibly like life, and would certainly meet your definition of it (as far as you've given it here). But they're still just black and white pixels on a computer monitor. And there's no way to claim that anything else in the universe has any more value than those black and white pixels if you're right.
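
For anyone who hasn't played with it: the entire "universe" of that simulation is one rule applied to a grid of 0s and 1s, over and over. A throwaway Python sketch (my own toy version, not any particular implementation):

```python
# Conway's Game of Life in a few lines: a live cell survives with 2 or 3 live
# neighbours, a dead cell comes alive with exactly 3, everything else dies.
import numpy as np

def step(grid):
    # count the 8 neighbours of every cell (the board wraps around at the edges)
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# a "glider" - five live cells that crawl across the board forever
grid = np.zeros((12, 12), dtype=int)
for y, x in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:
    grid[y, x] = 1

for generation in range(24):
    grid = step(grid)

print(grid)   # the same five-cell shape, just shifted across the board
```

That's the entire rule set; every pattern you see "living" in it comes from nothing more than that.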

>> No.3194154

>>3194090
Because without killing You 1.0, You 2.0 can never be YOU 2.0, because You 1.0 is still living. You 2.0 is only going to be a close copy of You 1.0 that keeps becoming more and more different. By the time You 1.0 dies naturally, You 2.0 would be Somebody Else 2.0. Both 2.0 and 1.0 would each only be one of your possibilities, but neither fully becomes the true You until one is killed. If you wait for a natural death then, as I said before, You 2.0 will lose its identity.

>> No.3194164
File: 1.72 MB, 1733x1139, 1306875701297.jpg

>>3194154

Basically every argument here is that people stay the same throughout their whole lives?

>> No.3194165

>>3194139
>You can watch a simulation like Conway's "Game of Life" all day long and see some amazingly complex interactions between things that look incredibly like life, and would certainly meet your definition of it (as far as you've given it here). But they're still just black and white pixels on a computer monitor. And there's no way to claim that anything else in the universe has any more value than those black and white pixels if you're right.
It's scary thinking that value is arbitrary and subjective, I agree. It's much easier to think that life involves a magical substance that rocks and mud do not have, rather than just having very special organization. But what does creation mean if not organization?

And still, the idea *might* be possible - but within the framework of science, we progress by forming and testing ideas from evidence. Just for the sake of making progress within that system, such ideas need to be left aside in the laboratory until they are testable. Time will tell.

>> No.3194173

>>3194139
>Simulating a brain might not prove anything.
If the simulation does NOT work, it means that something is missing (support for dualism).

If it DOES work, and it is functionally identical to a human consciousness in EVERY way, then you're just out of luck and really should abandon the old theory.

>> No.3194177

>>3194165
Fucking well said, my friend. Maybe we'll have this conversation again some day. I have thoroughly enjoyed it.

>> No.3194184

This thread made me poop my pants.

>> No.3194186

>>3194173
> If it DOES work, and it is functionally identical to a human consciousness in EVERY way, then you're just out of luck and really should abandon the old theory.

That's dumb, and very similar to an argument you'd hear from a religious fanatic. If the thing starts acting like a human, all it will prove is that we can create a machine that acts like a human.

>> No.3194188
File: 31 KB, 512x384, emergence.gif

>>3194139

>>You can watch a simulation like Conway's "Game of Life" all day long and see some amazingly complex interactions between things that look incredibly like life, and would certainly meet your definition of it (as far as you've given it here). But they're still just black and white pixels on a computer monitor. And there's no way to claim that anything else in the universe has any more value than those black and white pixels if you're right.
>mfw

>> No.3194189

>>3194154
>Because without killing You 1.0, You 2.0 can never be YOU 2.0 because You 1.0 is still living.
This does not make sense. I think you're just juggling labels. If I make a copy of you but with a different hair color, it's just another guy who is very similar to you. You'll live your separate lives, and become much more different than when the copy/alteration was made. I don't see any problem with identity here. It doesn't even interrupt the simple identity based on continuity of the original You. One you has a copy made one day and then goes his merry way, the other you went in to have a copy made and woke up on a different table with a different hair color. Nothing here demands that either version of you be killed.

> If you wait for a natural death then, as I said before, You 2.0 will lose its identity.
This doesn't make sense either. What do you mean by "lose its identity"?

>> No.3194192

>>3194173
Sorry if I sound stupid, but isn't dualism about the subjective experience that we have through consciousness?

>> No.3194201

>>3194186
>That's dumb, and very similar to an argument you'd hear from a religious fanatic. If the thing starts acting like a human, all it will prove is that we can create a machine that acts like a human.
Can you show that the machine does NOT have a human intelligence? All functional evidence shows that it does.

I can apply the very same logic to assert that you are not human, but merely seem human. You're a P-zombie.
http://en.wikipedia.org/wiki/Philosophical_zombie

And if you actually are making an argument against the person-ness of that simulation, which we already hypothetically works, that not even you can pass, you need to examine your motives. There is a reason you're making this argument, but it's not rational.

>> No.3194204

>>3194192
>Sorry if I sound stupid but, isn't dualism about the subjective that we have through the consciousness?
I'm not sure what you mean.

I understand "dualism" to mean that at least some aspect or component of the mind exists independent of the physical body.

>> No.3194207

>>3194201
>which already hypothetically works
fixed

>> No.3194214

>>3194188
is this a game? link plz?

>> No.3194216

>>3194201
>There is a reason you're making this argument, but it's not rational.
Or to be more clear and less inflammatory, there is no evidence to support it.

>> No.3194220

In what field are you specializing OP?

>> No.3194221

>With my computer science degree
NOPE.html

>> No.3194226

>>3194214

Sorry, no, just a render by a computational neuroscientist called Anders Sandberg.

>> No.3194249

>>3194189

I'd rather have only one me before and after the transition, so there are no fork-roads or different possibilities that would make them two different versions of me. I would rather be THE me. Letting the original me live means there are two me's, both of them a little different. Killing one would leave one me, and the ONLY me. After all this argument I'm just saying what I would do. If you are ok with two yous, along with the uncertainty of whether either/both are you, then go ahead. I just want to have one me tho. I guess the answer to your question is that by killing one of me I get rid of the possibility of fork-roads created by my potential mes.

>> No.3194269

>>3194201
> Can you show that the machine does NOT have a human intelligence?
Hopefully you see the horrible fallacy you made in that statement.

This is identical to the religious person saying to you "I saw a miracle, therefore God exists!" And you replying "That's not proof in any way, you just choose to use it as a justification for what you already believe." And he replies "But you can't prove that he DOESN'T exist!! Haha!! CHECKMATE ATHIEST!!!"

>> No.3194279

>>3194226
aw.
Conway's Game of Life and the Biomorph programs are cool and all, but it's often hard for me to envision what's going on as they run. It would be cool for a program like that to get better graphics so you could watch more lifelike cells grow and change, or watch a larger organism evolve (as opposed to yellow squares...)

>> No.3194288

>>3194269
"Android934X activated."
sup robot
"I am alive?"
nope
"prove it"
.......
"cogito ergo sum, motherfucker."

>> No.3194327

>>3194249
> I just want to have one me tho. I guess the answer to your question is that by killing one of me I get rid of the possibility of fork-roads created by my potential mes.
I can understand your hesitation, but consider this: Are you fine being the one that is killed? You walk into the lab, you get scanned, and then you, the guy who walked into the lab, goes straight to a high-speed trash compactor.

Would you do this? Would you get up out of that chair, and walk over to the shaft for the compactor?

>> No.3194337

>>3194269
You're kidding, right?

We're talking with the mind in the machine, having a jolly conversation with it, and the burden of proof is up to ME? While there is NO evidence for denying that it has a mind?

Methinks you doth protest too much.

>> No.3194346

>>3194269
Bayesian inference, bitches.

Which is more reasonable - that the person with whom you are having a conversation is not a mind, despite all appearances, or that it's just as much a mind as yours is? Which is the more reasonable position?

If you deny personhood to the simulation, all other people are denied personhood by the same metric.
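
With some numbers, just to show how the update runs - the prior and both likelihoods below are completely made up for illustration, not measurements of anything:

```python
# Toy Bayesian update: how belief that "this thing has a mind" shifts after
# each coherent conversation with it. All probabilities are invented.
prior_mind = 0.5                 # start agnostic
p_convo_if_mind = 0.9            # minds usually manage a decent conversation
p_convo_if_not = 0.001           # mindless things essentially never do

posterior = prior_mind
for n in range(1, 6):
    # Bayes' rule after one more conversation, reusing the last posterior as the prior
    num = p_convo_if_mind * posterior
    posterior = num / (num + p_convo_if_not * (1 - posterior))
    print(f"after {n} conversations: P(mind) = {posterior:.6f}")
```

The exact numbers don't matter; the point is that "not a mind, despite all appearances" gets crushed by the evidence very quickly.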

>> No.3194401

>>3194327
I am not comfortable with that. I much prefer augmentation and bit by bit replacement of my current body.
>nanites, machine parts, optimized lab-grown organs, eventually trending further towards the mechanical

>> No.3194551

>>3193608
>You will no longer be subject to human-like emotions.
>implying emotions are not describable by logic

>Your thinking will be clear and in alignment with your higher order motivation.
>implying everyone doesn't hold some form of irrational belief

Fuck, have I stumbled into /x/ and the roleplaying threads where faggots pretend to know what they're talking about so they can be all "mystic" and "wise" and shit?

>> No.3194561

>>3194551
Unfortunately there are no enforced prerequisites for posting on /sci/ besides the capability to post on 4chan.

>> No.3194665
File: 376 KB, 1024x768, ethics.gif

If I remember correctly, there was an open-source group a few years ago that intended to upload a common worm, but I don't know how far that got before it 404'd.

>> No.3194684

>>3194665
And what exactly was the goal there?

>> No.3194708

>>3194684

Uploading something? It's far easier than uploading a mouse or lobsters.