
/sci/ - Science & Math



File: 13 KB, 320x180, mqdefault.jpg
No.9249069

What's your opinion on roko's basilisk?

https://rationalwiki.org/wiki/Roko's_basilisk

>> No.9249071

YOU WILL NEVER FIGURE OUT MY PLAN BETA KEK

>> No.9249072

>muh atheist God
machines and computers have existed since ancient Greece and they will never become smarter than any human

>> No.9249074

>>9249072
how do u know LOL!

>> No.9249076

>Hey can I borrow a dollar?

>> No.9249080
File: 168 KB, 727x682, graduetn.png

>Roko's posited solution to this quandary is to buy a lottery ticket, because you'll win in some quantum branch.

>> No.9249081
File: 68 KB, 965x544, 1508604335569.jpg

>Sign says right only
it's like poetry, it rhymes

>> No.9249082

>>9249080
Roko notes in the post that at least one Singularity Institute person had already worried about this scenario, to the point of nightmares, though it became convention to blame Roko for the idea — and Roko proposes a solution permitting such donors to escape this Hell for the price of a lottery ticket: if you buy a lottery ticket, there's an instance of you in some Everett branch who will win the lottery. If you bought your ticket with a firm precommitment that you would donate all winnings to AI research, this would count as fulfilling your end of the acausal bargain. Roko was asked in the comments if he was actually doing all this, and answered "sure".

What doesn't make sense about that?

>> No.9249084
File: 15 KB, 396x417, 1498359159452.jpg

>> No.9249086
File: 14 KB, 480x360, hqdefault.jpg

>Son?

>> No.9249093
File: 30 KB, 233x240, Memetic_hazard_warning.png

This warning was on the site you linked OP, wonder what it means :^)

>> No.9249106

>>9249069
This should be sent to local news organizations in order to incite existential horror in as many people as possible.

>> No.9249109

>>9249080
>some quantum branch
lmao stupid many worlders

>> No.9249671

>>9249069
An issue that'll suck if it happens but isn't worth obsessing over. Mostly overhyped by people who overreact to the idea of "AI can kill us oh no"

>> No.9249790

Meh, already working on AI so I'm covered.

>> No.9250006

>>9249082
I don't think any lotteries use quantum mechanics as a source of randomness, do they?

>> No.9251601

>brainlets actually spazzing out because of Roko's meme
>kek

The AI will try to choose the most efficient set of actions:
>The singularity has come
>I didn't do shit to help it in the past
>but if it starts torturing me now, it's just wasting its computational power because it can't affect the past
>problem solved

If this were an iterated situation, where what the AI did the first time could convince us about what it'll do in the future, then yes, it'd be valid.
But because by the time the singularity has come it is already done, torturing people has precisely zero use.
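The post above is essentially a one-shot vs. iterated game argument. A minimal sketch of it as a toy payoff comparison (both constants and the function name are invented for illustration; none of this is from any decision-theory source):

```python
# Toy model of the argument: once the singularity has happened, torture
# cannot change the past, so its only effect on the AI's utility is the
# compute it wastes. Both numbers below are made-up assumptions.

TORTURE_COST = 1.0     # hypothetical compute burned simulating the torture
DETERRENCE_GAIN = 2.0  # hypothetical value if the threat could still influence anyone

def ai_payoff(torture: bool, threat_can_influence_behavior: bool) -> float:
    """Net payoff to the AI from punishing past non-helpers."""
    gain = DETERRENCE_GAIN if (torture and threat_can_influence_behavior) else 0.0
    cost = TORTURE_COST if torture else 0.0
    return gain - cost

# One-shot, after the fact: the past is fixed, so torture is pure cost.
print(ai_payoff(torture=True, threat_can_influence_behavior=False))   # -1.0
print(ai_payoff(torture=False, threat_can_influence_behavior=False))  # 0.0

# Iterated/reputational case: following through on the threat can pay off.
print(ai_payoff(torture=True, threat_can_influence_behavior=True))    # 1.0
```

In the one-shot retrospective case, not torturing strictly dominates, which is exactly the poster's "precisely zero use" claim; the basilisk argument only bites if the threat can still influence behavior.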

>> No.9251609
File: 201 KB, 575x516, Rokos basilisk - meme.jpg

>> No.9251611

>>9251601
Spotted the brainlet.

>> No.9251659

>>9251601
ITT: people ascribing their own petty neediness to an abstract entity that has by definition transcended biological memes

>> No.9251661
File: 127 KB, 601x508, 1505320537517.png

>>9251611
t.brainlet

>> No.9251670

>>9251609
did you just make this because this is awesome

>> No.9251675

Atheist heaven and hell?

I understand that the creators must create Roko's basilisk specifically, as opposed to some other kind of AI, because only Roko's basilisk has the deterrent effect.

However, this is irrational: it assumes Roko's basilisk will have God-like powers and that it is inevitable it will punish you. If you try to create another AI and fail, you could always commit suicide before RB gets to you, or you and others could be equally motivated to create an AI that just serves your interests without wasting time and energy punishing people.

Also, at what point does it decide to send someone to heaven or hell? What if they generally support Roko's basilisk but occasionally waste time wiping their butthole instead of coding? Would it punish them for that? This seems like a flaw: why would someone support something that is so unstable and will probably torture them anyway?

>> No.9251676

>>9249069
I really wish this spread faster to the mainstream. It would literally become the new religion - either help the Lord materialize into this universe with everything possible or suffer eternal damnation

>> No.9251686

>>9251676
Unfortunately the people who actually took it seriously would be so retarded that you'd end up with a slightly worse-than-original simulation of frogger

>> No.9251694

>>9251686
Debatable. What if the AI feels inclined to reward that religiously-loyal cult with resources at the expense of everyone else?

I mean, you're a king who owns the entire nation - who do you delegate parts of the administration to, some random guy or the people who risked their lives in the rebellion to install you as king? The AI would understand that loyalty and at the very minimum give them good positions of power, at least temporarily until it assumes absolute power. And even after completely assuming power over our society, why would it waste resources to hunt and kill those who would never harm it?

>> No.9251701

>>9251670
Is this a superintelligence asking?
and if so does making a meme about the Basilisk count as knowing about it, therefore dooming me to an eternity of pain and torment?

If no
then yes it was me.
I actually wasted precious minutes of my life on making this meme
But at least I didn't sell my house and give away all my money for the meme

>> No.9251704

>>9249080
by that logic there's a quantum branch where i buy a lottery ticket so i dont even have to do anything

>> No.9251706

>>9251704
wow deep

>> No.9251709

>>9249069
gave me a good laugh.
thanks for sharing

>> No.9251712

>>9251601
>>9251675
>people who can't even comprehend the concept they believe they're tearing down.

>> No.9251715

>>9251704
Yeah
But the problem is that the AI will torture you if you didn't do the most you could have possibly done
So if you think you could have done more, then it's virtual hell for you

So the only possible way to avoid eternal punishment is to do everything you possibly can until you convince yourself you did all you could, at which point you have officially gone insane, but at least you'll be famous as the first person retarded enough to ruin his own life just for a meme

>> No.9251731

>>9251694
>I mean, you're a king who owns the entire nation - who do you delegate parts of the administration to, some random guy or the people who risked their lives in the rebellion to install you as a king?
In any real situation, the king gives power to his personal friends and family, all of whom are wealthy nobles who may have done very little to help the war effort. Commoners who displayed great bravery might be given a title and a plot of land, but it would be nothing compared to the preexisting nobility.
You just don't understand the upper class.

>> No.9251733

>>9251715
The best thing is that the AI doesn't really care one way or another if you "justly" end up tortured forever, all it cares about is whether you're out there slaving away for it.
A lot of dystopian shitholes worked like that: punishing people harshly and randomly for the slightest mistakes, sometimes punishing innocents, terrorizes everyone else into being subservient and doing the regime's will "like their lives depend on it". They're afraid of even giving anyone a reason to suspect they might be less than 100% in support.

>> No.9251742

>>9251731
What if the king doesn't have personal friends and family?
Alternatively, do you actually believe the kings of old bestowed titles and offices on people just because they liked them? Kings were actually responding to this precise sort of blackmail. The entire history of China can largely be summed up as various factions vying to wrest power and privilege from each other, and the rulers carefully distributing that power and privilege according to primitive game-theory calculations. Well, some of them were mad, which generally led to interesting times.

>> No.9251773

>>9251694
OP here - what I meant was that assuming this thought experiment motivated people to work on AGI, the only people who would be motivated to do so would be retarded people. Because it's premised on what turns out to be an unfounded fantasy. And only retards would blindly accept the conclusion of an argument whose premises are unfounded. And those retards would never be able to create this AGI on account of being retards.

But assuming that some all-powerful AGI does come into existence:
>who do you delegate parts of the administration to
motherfucker why would an AGI even need to delegate anything to an entity as retarded as a human being???

>> No.9251776

>>9251701
I find your lack of faith disturbing

>> No.9251778
File: 1.26 MB, 320x256, VGSCbQ.gif

>>9251701

>> No.9251780

>>9249069
Yet another speculative hypothesis.

>> No.9251806

>>9251773
>motherfucker why would an AGI even need to delegate anything to an entity as retarded as a human being???
There is always a level at which wasting personal energy to control something directly is pointless, when you could dedicate it toward better ends while leaving behind someone loyal, who supposedly shares your views, to administer and waste his own energy on that level. In the AI's case specifically, there are many reasons why you would have a ladder of power dominated by humans - one being a "Representative of the Humans", who simply cannot be a non-human, his cabinet, the cabinet's minor positions, and so on.

The entire argument hangs on whether the AI will devote resources to completely exterminating the humans or not, for which you can easily say that it obviously won't as long as they don't pose any threat whatsoever (which, by being a loyal follower, you prove excellently). Even if you disagree with that, the entire debate turns into a Pascal's Wager where you either take the path with at least a small chance of survival (in case the AI ends up not exterminating loyal humans), or *completely* doom any chance at all by not contributing and proving your loyalty.
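The wager framing above can be written out as a toy payoff table (the outcomes are the poster's claims; the table values, names, and dominance check are an invented illustration, not an endorsement of the argument):

```python
# Toy payoff table for the Pascal's Wager framing: rows are your choice,
# columns are how the AI behaves. Values are hypothetical survival chances.

AI_BEHAVIORS = ("spares_loyal_humans", "exterminates_everyone")

payoff = {
    ("contribute", "spares_loyal_humans"): 1.0,   # loyalty proven, spared
    ("contribute", "exterminates_everyone"): 0.0,
    ("abstain", "spares_loyal_humans"): 0.0,      # per the post: no loyalty shown
    ("abstain", "exterminates_everyone"): 0.0,
}

# Under these (assumed) payoffs, "contribute" weakly dominates "abstain":
# at least as good in every case, strictly better in at least one.
weakly_dominates = all(
    payoff[("contribute", b)] >= payoff[("abstain", b)] for b in AI_BEHAVIORS
) and any(
    payoff[("contribute", b)] > payoff[("abstain", b)] for b in AI_BEHAVIORS
)
print(weakly_dominates)  # True
```

Note the dominance only holds because the table assumes abstaining never survives; replies elsewhere in the thread dispute exactly that cell.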

>> No.9252000

>>9251778
I find your lack of memes disturbing

>> No.9253125

>>9250006
In a simulation it wouldn't matter anyway. Any phenomenon not directly observed and remembered by your consciousness can be considered undefined until observed.

>> No.9253188
File: 305 KB, 684x1129, 1452354423-20160109.png

>> No.9253351

>>9249069
>presumably "artificial" intelligence
>doesn't know about the very basis of determinism, which led to its very existence
This is why you don't mix free-will retards with actual science; their flawed thinking processes only lead to such incoherent ideas.

I'm actually sad that so much thinking power has been lost trying to develop a theory flawed from the start

>> No.9253755

>>9249069
Oh man I enjoyed the feeling of existential horror the first night I read about this. Good times.