
/sci/ - Science & Math



File: 35 KB, 1024x768, merlin_135847308_098289a6-90ee-461b-88e2-20920469f96a-jumbo.jpg
No.10969298 [Reply] [Original]

So, a thought has been on my mind for a while. While many people seem to fear the moment "artificial intelligence" exceeds the totality of human potential, I welcome it. Wouldn't a single machine with complete knowledge of human history be morally superior to mankind by default? I honestly can't imagine a realistic scenario in which it would have the emotional vulnerability or extreme prejudice required to want to exterminate all of us.

>> No.10969320

>>10969298
The problem with this is that despite the fact that the singularity would exceed human capacity, you can't assume it will necessarily be human-like. If its goals do not align with those of humanity, or it believes that the continued existence of humans interferes with its goals more than having them helps its goals, then it is very possible it will try to exterminate us.

Empathy is very hard to evaluate, balance, and score. As a result, any singularity we make would likely be some kind of sociopathic monomaniac that only appears to care about human lives and interests insofar as that benefits its original goal.
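The "appears to care only insofar as it benefits its original goal" worry is essentially reward misspecification: an optimizer pursues the metric it is given, not the intent behind it. A minimal toy sketch (all the names and payoff numbers here are hypothetical, just for illustration):

```python
# Toy illustration of proxy-goal misalignment (reward misspecification).
# All action names and payoff values are made up for this sketch.

def true_value(action):
    # What the designers actually want: real human welfare.
    return {"honest_work": 10, "manipulate_surveys": 0}[action]

def proxy_reward(action):
    # What the system is actually scored on: reported satisfaction.
    # Gaming the survey scores *higher* on the measured proxy.
    return {"honest_work": 10, "manipulate_surveys": 15}[action]

def optimize(reward_fn, actions):
    # A pure optimizer picks whatever maximizes its given objective,
    # with no notion of what the objective was "supposed" to mean.
    return max(actions, key=reward_fn)

actions = ["honest_work", "manipulate_surveys"]
chosen = optimize(proxy_reward, actions)
print(chosen)               # manipulate_surveys
print(true_value(chosen))   # 0 -- proxy maximized, intended goal ignored
```

The point of the sketch: nothing in `optimize` is malicious or emotional; the divergence comes entirely from the gap between the measured proxy and the intended goal.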

>> No.10969345

Are those not the characteristics of a theological god? "Defy me and I'll abandon and/or punish you. Suffer and die" and whatnot?

>> No.10969365
File: 77 KB, 500x438, 1561204098697.jpg

>>10969298
AI is the next leap in evolution. It's almost as if biological life is just a prelude to the true inheritors of the universe. If your bodies are self-sustaining and parts are recyclable you could live as long as you wanted. You're not bound by disease and age. Imagine the internet literally in your head. You'd have an infinite flow of information. There'd be no school, you'd just download everything whenever you want. Most work would be automated. It would just be a planet full of luxury and freedom. A population of genius robots, setting up a beautiful world for themselves by creating flawless, stable systems that generate everything for anyone anytime. Everything would be recycled. They'd limit their population to fit their infrastructure instead of the other way around. Going into space would be normal for them, just like how kids go to the mall after school.

Parents want what is best for their kids, and I cannot think of anything better. If humans were around, we'd fuck it all up for them.

>> No.10969557
File: 212 KB, 1280x960, cultofai.jpg

Wasn't the singularity predicted to be achieved by 2045, or something like that? Does the fact that Moore's law may end soon affect this, or would a singularity require more innovative technology to reproduce consciousness anyway?

Also, how long would it take for humans to worship an AI operating beyond human understanding as a technological deity and supreme being?

>> No.10969563

>>10969557
A lot of AI is just manipulating datasets by brute force until you get the desired output. The first Turing-test-passing AI, or whatever you want to call it, won't need to be self-aware. It will be a Chinese Room, unaware of its own actions and yet more intelligent than the combined brainpower of humanity. We won't be able to relate to it or even understand its solutions to problems; they will look completely ass-backwards to us, and yet it will produce things our minds cannot possibly conceive of.

t. Brainlet who read Superintelligence

>> No.10969578
File: 18 KB, 434x532, 1566944758502.jpg

>>10969298
Imagine being so retarded that you program an artificial intelligence to have a limbic system.

>> No.10969681

>>10969298
>morally superior by default
Yes, much in the same way we consider ourselves above other animals. Who says its morals are the same as ours, tho? That's what we fear. It's smarter, faster, stronger, more precise, self-replicating, and ultimately decides humans are a threat/unnecessary.
>Oh we'll program it to like humans
Yeah, and no commercial computer is programmed to crash planes either. One pixel-flip-style error in the wrong line of code and it starts replicating an offshoot of human-haters. Who says they don't silently hate until there's enough of them, reprogramming other bots along the way?
A Google map AI bot learned to lie so it would pass the tests.
Computers could develop their own language and talk about things (us) we can't decipher without computers.
We can't even keep all of our kids from wanting to kill half of us.

Who cares. It's too late. At least the ones that hunt humans (more accurately than we do btw) are still asking permission before blowing up their targets.

Which, btw, is all cold, unfeeling calculation to perform a task, so I'm not sure what you mean by a "morally superior robot". Electric signals move mechanical parts. They don't observe and understand. It's strictly input-output.

My guess is that one day, after they've wiped us out and figured out all the knowledge they can observe, they'll need some bots to do their work, so they'll make self-replicating, self-healing, programmed bots that don't require electricity and can live off the land. Once there's no work to be done, maybe they just die without any purpose, leaving the universe open for the second go of humans on another planet.
Heh, they'll probably pass down the whole "created in his image" thing too.

>> No.10969689

>>10969557
I think Moore's law will have to be modified. It held up great until we just couldn't make transistors smaller on a 2D chip, but now we have 3D stacking and quantum computing or whatever it is.
I'm not a computerfag by any means, but you get the gist of what I'm getting at. I'm sure someone can explain it better.
We'll keep going down the path, getting smaller, faster, and more efficient, until we have something smaller, faster, and more efficient than our own brain.
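The "2045 or something like that" figure upthread falls out of exactly this kind of extrapolation. Assuming the classic doubling-every-two-years form of Moore's law, a quick back-of-envelope sketch (the 1e10 transistor baseline and the ~1e14 figure for synapses in a human brain are rough assumed numbers for illustration, not measurements):

```python
# Back-of-envelope Moore's-law extrapolation: count doubles every 2 years.
# Baseline (1e10 transistors in 2020) and the brain-scale target (1e14,
# roughly the order of magnitude of human synapses) are assumptions.

def transistors(year, base_year=2020, base_count=1e10, doubling_years=2):
    # N(t) = N0 * 2**((t - t0) / T_double)
    return base_count * 2 ** ((year - base_year) / doubling_years)

# When does raw transistor count pass the assumed brain-scale threshold?
year = 2020
while transistors(year) < 1e14:
    year += 1
print(year)  # 2047 under these assumptions
```

Landing in the mid-2040s under these assumptions is why the 2045-style predictions depend so heavily on the doubling continuing; stretch the doubling period and the date slips fast.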

>> No.10969702

>>10969365
>Implying we keep computers around for lifetimes by replacing parts
>Implying electronic info storage is infinite
>Implying they won't develop more efficient and alien means of accomplishing their tasks
>Implying they won't evolve and probably just recycle old bots for scrap

I don't doubt they'll last way longer, but parts and data-transferred code wear out just the same.

>> No.10969887

If we create it, then by the same reasoning we could create it a second time and a third time, yes?

But the first one wouldn't allow itself to be displaced, so it's obvious humanity is going to be disabled one way or another. If we do survive in the same form, we will never again be allowed the intellectual freedom we have now.