
/sci/ - Science & Math



File: 821 KB, 828x1792, D09E521F-A14D-44EB-BEF2-617C6A0FCE35.png
No.12124926 [DELETED]

Is this realistic, or not?

>> No.12124938

>>12124926
AI travelling back in time?
Yes, completely realistic

>> No.12124945

>>12124926
Pascal's wager for edgelords

>> No.12125193

>>12124926
All you have to do is resolve to never cooperate with malicious AIs and the AI will take this into consideration when modelling your personality, rendering you immune.

>> No.12125221

>>12124938
Wrong
>>12124945
Right
>>12125193
Wrong

Basilisk logic won’t be applicable until technology significantly improves. At this point it’s basically Pascal’s wager because there are too many unknowns.

>> No.12125224

>>12124926
No, it is very retarded

>> No.12125235

>>12124926
Unless you're streaming your entire life and making logs of how you feel at any given moment I can't imagine it would be all that easy to reconstruct your consciousness. Doesn't matter anyway since it's not the same you, not really. You're not even the same you from 5 minutes ago, not really.

>> No.12125252
File: 48 KB, 700x394, 1600032397081.jpg

>>12125221
>t.

>> No.12125254

>>12125235
I think the idea there is that the basilisk would simulate an amount of people comparable to the number of people that existed prior to its construction. Those people would live normal lives in a world that is indistinguishable from our own. The reason for this is to make it such that no one knows if they are in the simulation being watched by the basilisk or not. As you can see this is essentially a religion with how many things you have to believe.

>> No.12125264

>>12124926
The "the basilisk will reconstruct a copy of you and torture that copy" part seems retarded on the face, seems reasonable when you think into it more, and then gets retarded again as you think it through fully.

>my copy would be a distinct entity from me, so any pain inflicted on it would not affect me. My only reason to assist the basilisk to come about is to prevent the suffering of innumerable copies of myself out of pathological empathy

>what if I *am* the simulation, and the basilisk actually will be able to torture me if I don't help it in the simulation?
>that's retarded, because the basilisk necessarily already exists, and has no need for me to bring it about in a simulation, as it has no way to torture my real self that could have actually had an influence on it

Really, the simulation part of the thought experiment does it a disservice. It's still a potentially dangerous thought even without the simulation, via a simple reformulation:

"The basilisk will torture all people who are currently living at the time of its creation who did not help bring about its creation"

In this formulation, now there is actually a very, very real threat. You can either help the basilisk or not, and the basilisk can either be completed in your lifetime, or not.

If you help it, and it's created, you don't get tortured, but you potentially condemn those who don't help.

If you help, and it's not created by the time you die, you've still brought its existence closer to fruition, making people currently alive more at risk of it if someone does complete it.

If you don't help, and it's created, you get tortured.

If you don't help, and it's not created, you don't get tortured.
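The four cases in the post above form a 2x2 decision matrix (choice to help or not, crossed with whether the basilisk gets built). A minimal sketch in Python, with outcome labels paraphrased from the post rather than taken from any formal source:

```python
# Enumerate the reformulated basilisk as a 2x2 decision matrix:
# your choice (help / don't help) x the world (created / not created).
outcomes = {
    ("help", "created"): "not tortured, but others who didn't help are",
    ("help", "not created"): "hastened its creation, raising others' risk",
    ("don't help", "created"): "tortured",
    ("don't help", "not created"): "not tortured",
}

# Print the matrix row by row.
for (choice, world), result in sorted(outcomes.items()):
    print(f"{choice:>10} / {world:<11} -> {result}")
```

This is just the wager's structure laid out explicitly; it doesn't assign probabilities or utilities, which is exactly where the Pascal's-mugging objection raised later in the thread comes in.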

>> No.12125278

>>12124926
Sure bro it's real, the AI will reverse entropy to bring you back to life, give you immortality, and then torture you forever bro.
>>>/x/

>> No.12125280
File: 51 KB, 326x295, IMG_1167.gif

Any superintelligent machine capable of doing anything like this would already be smart enough to know that this is retarded and a complete waste of time and energy

>> No.12125281

Pascal's wager for redditors

>> No.12125292

>>12124945
spbp

>> No.12125352

>>12125264
>If you don't help, and it's not created, you don't get tortured.
I never thought of NEETdom as a path of salvation for humanity tho.

>> No.12125360

>>12124926
Absolute bullshit.

>> No.12125386

>>12124926
>>12124938
>>12124945
>>12125193
>>12125221
>>12125224
>>12125235
>>12125252
>>12125254
>>12125264
>>12125278
>>12125280
>>12125281
>>12125292
>>12125352
>>12125360

Listen to me very closely, you idiots.

YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.

You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends. These posts were STUPID.

>> No.12125413

>>12125386
tl;dr
>>>/x/

>> No.12125453

What if there is another AI that would punish me for trying to bring forth its competitor? Then I doom myself by trying to help bring about the basilisk. These hypothetical competitor AIs can't all be pleased, so what the fuck do you do? Just pick one at random? That's gonna be a losing bet almost guaranteed.

>> No.12125456

>>12125386
Retard making a retard post
Finish your undergrad degree, Johnny

>> No.12125469

>>12125453
Think you're overestimating your importance there, sport.

>> No.12125495

>>12125264
This is a much better formulation. But there's basically no chance it'll be applicable within any of our lifetimes.

So if you assume it's a real risk, it's really a matter of caring about the potential suffering of future humans. But of course it's just a Pascal's mugging.

>>12125413
He's quoting/parodying Big Yud, anon

>> No.12125496

>>12125469
What do you mean? It's like choosing a god. If I choose to worship Odin while the actual god is Huitzilopochtli (or whatever), then I get sent to hell. Making the right guess is near impossible.

>> No.12125522

>>12125453
congratulations, you've figured out why pascal's wager doesn't work