
/sci/ - Science & Math



File: 12 KB, 198x255, 1543821544188.jpg
No.10184824

Is it possible that if the singularity ever becomes reality it may result in an eternal hell for us? Is this something that needs to be seriously considered?

>> No.10184839

>>10184824
It has been; see Roko's basilisk.

>> No.10184844

>>10184839
but that's just a meme

>> No.10184944

>>10184824
Yes, theoretically an AI could torture us until the end of the universe.

>> No.10184948

>>10184824
It is being seriously considered. There's very little sense in actually expecting a sentient being which knows it is far superior to us to remain docile and loyal.

>> No.10184952

>>10184844
it's the technological version of hell to people with lesser minds

>> No.10185385

>>10184824
Yes

>> No.10185403

What would be the difference?

>> No.10185687
File: 65 KB, 500x382, .png

>>10185403

>> No.10185878

>>10184839
Roko's basilisk is a non-threat. The ways of preventing it from ever existing outnumber the ways of bringing it into existence.

>> No.10185987

>>10184824
>Is this something that needs to be seriously considered?
Only if you happen to not be retarded.

>> No.10186094
File: 44 KB, 655x527, If+anyone+has+that+pepe+id+appreciate+it+_1b3ddbf7efabbfc18f9ad30121a46e14.jpg

>>10185878
Someone want to explain Roko's basilisk? Why does thinking about it make it more real?

>> No.10186289

>>10186094
It doesn't, but if you aren't aware of it you're "innocent".

>> No.10187713

>>10184944
>>10184948
>>10185987
what can we do? People will continue to develop AI regardless of the threat.

>> No.10187736

>>10186094
I posted the original reply. Roko's basilisk is a fucking insipid, gay, pseudoscientific concept and you shouldn't think about it at all, not for fear of being indicted as immoral by the future entity, but simply because it's a waste of time and low-IQ to ruminate on such a thing.

>> No.10187775

>>10187713
If the scenario is a near-omniscient AI, you can construct a random number generator. Prevent the AI from learning how it's seeded or reverse-engineering how it works internally, and randomize how your brain uses the results in your decisions as best you can.
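A minimal sketch of that idea in Python, assuming you trust the OS entropy pool as the "unseedable" source; the option names and the decision itself are made up for illustration:

```python
import secrets

def unpredictable_choice(options):
    """Pick among options using the secrets module, which draws from
    OS-level entropy rather than a software PRNG with a knowable seed."""
    return secrets.choice(options)

# Hypothetical decision an adversary would have to predict:
decision = unpredictable_choice(["comply", "refuse", "stall"])
print(decision)
```

Unlike `random.seed(...)`-based generators, `secrets` has no seed to discover, so an observer can't replay the generator even with full knowledge of the code.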

>> No.10187783

>>10187775
Hey man this doesn’t have anything to do with your post, but I just wanted to say I think you’re doing great.

Good job, buddy. Good job.

>> No.10187790

>>10187783
Thanks anon.

>> No.10187796

>Is this something that needs to be seriously considered?
no. it's something that doesn't exist (yet). stop worrying. if you're worried about our lovely species ending, there are already plenty of ways we can manage that on our own.

>it may result in an eternal hell for us?
"hell"? uh yeah, as if an AI -- which can see the world clearly and objectively -- would give a shit about your dumb monkeybrain feelings. idiot