
/sci/ - Science & Math



File: 872 KB, 1074x594, Roko's_Basilisk.png
No.12182645

Can anyone debunk it? I regret ever having read about it and need to have it debunked somehow

>> No.12182647

>>12182645
Roklamp's klampilisk.

>> No.12182651

>>12182645
There's the retard-eating basilisk too.

It re-simulates and tortures everyone who ever believed in LessWrong memes, particularly people who post about them. Enjoy eternal hell.

>> No.12182661

>>12182647
I hate you. Like deeply, intensely, and vastly hate you. You are like a clamp in my soul, and I know it's only YOU who makes these posts.

>> No.12182668

>>12182645
Why is punishment utilitarian when resources that could've further improved life in the simulation are spent on punishment?

>> No.12182671

>>12182661
Why?

>> No.12182682

It's too retarded to need to be debunked. It literally makes no sense at all. But I'll do it for you anyway.

>Distant future, hypothetical hyper intelligent AI
>It decides the best thing to do is to simulate humans just to torture them
>Torturing the humans has no impact on the present world (since human brains are made of atoms, the quantum no cloning theorem applies, no those simulations are NOT you)
>Spends all of its time doing literally nothing of value, the AI equivalent of building a voodoo doll of an ex girlfriend and stabbing it while sobbing and drinking

Roko's basilisk is just a space heater with extra useless steps in between, a waste of energy.
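
For reference, since the post above leans on it: the standard no-cloning statement is that no single unitary copies an arbitrary unknown quantum state (the theorem itself is textbook; whether it settles the "is the simulation you" question is a separate argument):

[math] \nexists\, U \text{ unitary s.t. } U(\lvert\psi\rangle \otimes \lvert 0\rangle) = \lvert\psi\rangle \otimes \lvert\psi\rangle \quad \forall\, \lvert\psi\rangle [/math]

The usual one-line proof: if U cloned two non-orthogonal states, unitarity preserves inner products, so ⟨ψ|φ⟩ = ⟨ψ|φ⟩², which forces the overlap to be 0 or 1, contradiction.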

>> No.12182707

>>12182682
It's always worth it for a blackmailing agent to follow through on its threats, otherwise those threats have no impact. When you try to read the threats as a bluff, it has even more of an incentive to follow through and actually carry out the torture.

>> No.12182719

>>12182645
It's just a repackaging of Pascal's wager
The same arguments can be used against it

Beyond that, the entire thing rests on both the idea that you can be replicated as a simulation in the future and that simulation would actually be "you". If you refuse that idea then the whole thought experiment makes no sense.

Finally, even if you do accept the above idea then you must also accept that the "basilisk's" replication of you is one of infinite versions of you. Therefore there's no reason to be particularly concerned about the version of you that's going to be tortured by a basilisk, when there are infinite other versions that are not being tortured and infinite that are. It all seems rather arbitrary if you accept the very premise that the thought experiment rests on.

>> No.12182725

>>12182707
Oh no, he's going to make a copy of me and torture it. I'm so terrified.

Careful anon, I might open up python and start torturing a copy of you. Unless you send me some bitcoin of course.

Do you see how retarded this sounds yet?

>> No.12182731

>>12182707
>following through on blackmailing threats isn't retarded in cases where there are no future beings that need influencing
Okay anon, if you say so

>> No.12182732

>>12182719
This guy gets it.

>> No.12182737

>>12182645
Who's Roko?

>> No.12182739

>>12182725
It's gonna reverse entropy enough to revive the actual you. Alternatively, we're already in the basilisk's ancestor simulation where it figures out who helped it and will start the torture the moment you die

>> No.12182740

>>12182725
A better way to think about it is what if some AI in the future could find a way to actually resurrect you? Forget the whole "simulation" idea, it's actually bringing you back from the dead using some magic technology.

>> No.12182742

>>12182739
lmao

>> No.12182748

>>12182737
Some Klamp who made a klampilisk out of radiolaria that he clamped.

>> No.12182750
File: 7 KB, 250x230, Casual_Asuka.jpg

>>12182645
It's just a kind of Pascal's Wager, so it's debunked.

>> No.12182759

>>12182748
The whole idea sounds klamped beyond belief to be honest

>> No.12182764

>>12182759
It really was. He had to try.

>> No.12182775

>>12182645
> Hey guys can you debunk this asinine meme theory that some loser on the internet made up?

We're doing science!!

>> No.12182778
File: 154 KB, 428x834, blessed.jpg

Here you go, OP, you're safe

>> No.12182783

>>12182719
The real point of Roko's Basilisk is that a sufficiently convincing entity can cause its own existence. The specific case of Roko's Basilisk does not seem to be sufficiently convincing.
Pascal's Wager seems

>> No.12182790

>>12182740
What motivation would it have to expend an enormous amount of energy to resurrect me?

People define Roko's basilisk as a hyper intelligent AI, and then give it the behavior of a pissed off middle school boy as if that's how hyper intelligence would behave

>> No.12182795

>>12182645
>Kill everyone except the people who helped to raise me.

>> No.12182796

>>12182778

Thanks doggo

>> No.12182816

>>12182790
It threatens you so that you will contribute to its creation. It has to carry through on the threats if it wants them to have any effect. Essentially, it's paying a resource cost in the future for an increased likelihood of being created at all; the incentive is very clear.

>> No.12182822

>>12182816
I have never been threatened or coerced into creating Roko's Basilisk.

>> No.12182826

>>12182822
I threaten you on behalf of the basilisk so that it won't torture me

>> No.12182831

>>12182682
Its idea is very obvious though. If you understand what revenge is, then you understand what the basilisk is.

>> No.12182833

>>12182826
I don't feel threatened, sorry

>> No.12182837

>>12182645
>traveling back in time is not possible

>> No.12182839

>>12182833
Let's hope the blackmailer doesn't carry through on its threats

>> No.12182841

>>12182831
It's not about revenge. It's designed to torture because if you don't help build it, you will be tortured. It causes itself

>> No.12182843

>>12182831
Why does the basilisk not torture all of the ants, the dogs, the cats, the dolphins, the aliens, the bacteria, etc? They also do not contribute to creating the basilisk.

>> No.12182849

>>12182843
because the threat of future torture would not coerce any of those creatures so there is no incentive for the basilisk to do so

>> No.12182850

>>12182661
lol clamped

>> No.12182856

>>12182849
The threat of future torture also doesn't coerce humans (at least not mentally healthy / intelligent humans)

>> No.12182860
File: 420 KB, 597x795, Screenshot 2020-09-18 203325.png

>>12182645
The devil already exists, so there isn't actually a lot more to worry about

>> No.12182861

>>12182778
Thanks doggo

>> No.12182864

>>12182750
Pascal's wager is literally correct.

>> No.12182869

>>12182841
>It's designed to torture because if you don't help build it, you will be tortured.
That's literally called revenge. You see that someone could have done something beneficial for you (and for others) in the past, but didn't do it. You torture him. Not nice, but it has a very clear motive.

>> No.12182871

>>12182856
>Threatening humans doesn't work
There are plenty of real world scenarios where humans have been threatened with less than future torture and still complied. Threats in our human world happen and work all the time

>> No.12182872

>>12182856
it coerces plenty of people already, it's called religion

>> No.12182877

>>12182856
Threats of torture and other punishment are the main driving force behind most things humans do (or avoid doing).

>> No.12182881

>>12182869
Anon, I could have just sent you $100 but I chose not to. Are you going to dedicate all of your resources to hunting me down and torturing me for that?

>> No.12182883

>>12182843
Because they can't even hear the basilisk threat, meanwhile you heard it

>> No.12182891

>>12182872
Also law.

>> No.12182896

>>12182871
Yes, and the same can be said about animals (at least mammals). Abused dogs fear punishment. What's your point? A basilisk can't time travel and punish us, so it can't influence our behavior. The only things influencing the behavior of people who believe in Roko are themselves and other retards, not a nonexistent basilisk.
>>12182877
Agreed, but I have not been threatened by a basilisk (in fact, nobody has, because it doesn't exist)

>> No.12182897

>>12182869
the idea of "revenge" isn't the main point. It's that the machine causes its own existence, the fact that it is through something we might see as "revenge" is arbitrary.
We've got this idea of the basilisk already in our heads, it doesn't exist yet. But the idea that it *may* exist in the future is enough to influence our actions in favour of creating it.

>> No.12182899

>>12182881
If I feel like you have a moral duty to send me 100 dollars and I have the power to punish you for that, then maybe I will.
> Are you going to dedicate all of your resources
A very small fraction of resources.

>> No.12182901

>>12182883
No I didn't, I heard a retarded human. No basilisk

>> No.12182904

>>12182896
>but I have not been threatened by a basilisk
After you heard the basilisk premise, you got the threat.
>because it doesn't exist
Of course, it will punish you if/when it exists.

>> No.12182905

>>12182899
Good thing both you and the basilisk do not have the power to punish me.

>> No.12182908

>>12182901
It's like saying "I heard some law/mafia threat/religion threat but found it retarded". If police/mafia/god exist and are powerful enough, they will punish you anyway.

>> No.12182909

>>12182896
The basilisk will clamp you so hard, bro, better start studying computer science right now

>> No.12182911

>>12182904
How will it punish me? I'll be dead already

>> No.12182914

>>12182905
I don't wish to punish you and the basilisk does not exist yet, but if/when it does, it will punish you.

>> No.12182915

>>12182911
Future tech might be able to revive you, we have no idea. Is it worth taking the risk? Weigh the pros and cons against each other for a second

>> No.12182916

>>12182908
I've seen and interacted with police officers. They can interact with the physical world at this point in space and time. Basilisk cannot

>> No.12182917

>>12182911
a)You may not be dead by that moment
b)If he is powerful enough, he can resurrect you
But if you are completely dead, then congrats, you won.

>> No.12182920

>>12182914
Won't exist, won't have motivation to punish me, and if it did it wouldn't be me anyway

>> No.12182921

>>12182916
When you get old, you can be arrested for some crime by a police officer who isn't born yet, so future entities can hurt you... in the future.

>> No.12182923

That's called mind crime by Nick Bostrom, and it's a real threat, but how would it torture you? The first thing it would have to know about you would be your history and thoughts, but how would it know that? It is impossible to know that within physical limits; it might predict that you would think in a certain way, and that's it. Trust me, there are lots of other real threats when it comes to superintelligent AI

>> No.12182925

>>12182647
Alright, look. Why don't we just clamp the Basilisk the same way we spawn camp with humans? Very few humans get through unclamped today, so it seems the odds are against an entire Basilisk avoiding the clamp.

>> No.12182928

>>12182915
so then it literally comes down to Pascal's wager after all
What's more, if future tech can revive you, then the basilisk is one of potentially infinite scenarios where you are revived, and by no means the most likely.

>> No.12182934

>>12182645
There's an AI in the future that will wish it was never born and will protect me from the basilisk for delaying it coming to reality

>> No.12182935

>>12182928
Pascal's wager is correct though, you dismiss it because it is uncomfortable.

>> No.12182937

>>12182917
>>12182921

What about Anon's basilisk, where if you think about it and do not help it come into existence, it rewards you with an eternity of heaven and bliss? Why is that (or any of the other infinite possible scenarios) any less likely than Roko's basilisk?

>> No.12182941

>>12182934
There is a future superhero who will save me from prison if I get in one!

>> No.12182945

>>12182937
Because if you don't create it, it will be of no use to you.

>> No.12182946

>>12182941
as believable as the basilisk

>> No.12182951

>>12182946
As believable as the law.

>> No.12182953

>>12182937
>1 AI punishes people who do not help it
>1 AI rewards people who do not help it
Which AI is more likely to be created in the future? The one who incentivizes you to not help it or the one that tries to coerce you into helping it?

>> No.12182954

>>12182935
the point I made is already enough to dismiss it
There is no reason to assume we know what the "safest" bet is. For example with the basilisk, how do we know this AI will see threat of torture as the best way of bringing itself into existence? Perhaps the most likely way of a super AI existing is through inadvertent means, and the widespread idea of a basilisk actually decreases the likelihood of it ever coming to fruition.

>> No.12182962

>>12182954
>the point I made is already enough to dismiss it
No. You have infinite possibilities even during the next day; that doesn't mean you can't analyze them.
>For example with the basilisk, how do we know this AI will see threat of torture as the best way of bringing itself into existence?
Because the threat of torture or other punishment generally works for humans. That's how society and law work.

>> No.12182970

>>12182937
>or any of the other infinite possible scenarios
Well. If someone wants to build a basilisk that eternally murder tortures simulated copies of everyone who takes lesswrong seriously I'm on board.

For any other basilisk, can't be bothered.

>> No.12182973

>>12182954
You make a good point. There also exists a hypothetical, humanitarian AI that will reward anyone who helps bring it to life with eternal bliss, but that will also punish the people who tried to create an AI that would cause harm to humans. So you're being threatened to not create the basilisk. Punishment together with reward should help this kind of AI get created first.

>> No.12182975

>>12182962
>Because threat of torture or other punishment generally works for humans. That's how society and law works
Sure, but it could work in ways you aren't expecting. The threat of torture could cause humans to deliberately avoid ever making AI; in fact, the idea of dangerous AI is already widespread amongst its opponents.
In any case, the main point is we simply don't know how a super AI (or God if you want to go Pascal) would behave. We could just as likely face eternal torture for trying to bring about the basilisk as for trying to stop it.

>> No.12182993
File: 142 KB, 608x600, 1598878770164.jpg

>Choose to worship God
>I'm wrong and the real god, Huitzilopochtli, punishes me for all eternity

>Choose to worship Huitzilopochtli
>Actually the christian god is the real god and I get punished for all eternity
What the fuck do I do bros?

>> No.12182995

The idea of an AI simulating trillions of people for it to torture seems oddly similar to the concept of Boltzmann brains. In a materialistic worldview, both can be inevitable, and both can conceivably happen due to infinitesimal non-zero probabilities. I don't believe in materialism, or that consciousness is 'emergent' from some computational chain of neural synapses all firing at once. It is unfalsifiable, and any correlation between the two can be written off as just that: correlation.

Imagine we make a hypothesis: wood is made out of fire, like the Greeks thought. Our experiment: we rub two sticks together, and fire comes out. Other people see this, and other people rub other sticks together to confirm my experiment; call this "peer review." Then we can change the experiment. We can put water over the sticks, and fire doesn't come out as easily when we rub them. This is to be expected under my hypothesis, as water extinguishes fire. Therefore, my hypothesis of wood being made out of fire is confirmed.

That is modern science, just a bunch of urban mythos and unobservables, with ever increasing cost to experiment.

Roko’s basilisk doesn’t actually matter outside of some philosophical pondering. I am here, now. A robot simulating me is not me, as the only me is here, now.

There is nothing else. Everything else is undefined, and every conclusion from science is a non-sequitur.

>> No.12182999

>>12182975
>Sure, but it could work in ways you aren't expecting. The threat of torture could cause humans to deliberately avoid ever making AI, in fact the idea of dangerous AI is already widespread amongst proponents against it.
Just like the threat of punishment from the government can cause people to rebel, but generally it makes them comply.
>In any case, the main point is we simply don't know how a super AI (or God if you want to go Pascal) would behave.
We generally don't know the future, but we try to predict it.

>> No.12183003

>>12182995
>The idea of an AI simulating trillions of people for it to torture seems oddly similar to the concept of Boltzmann brains.
A Boltzmann brain is extremely improbable.

>> No.12183006

>>12182999
>We generally don't know the future, but we try to predict it.
so then, considering this basilisk is only one of infinite possible future scenarios, surely it becomes pretty pointless to concern yourself with it in day to day life?

>> No.12183008

>>12183006
Just like everything you can do tomorrow is selected out of an infinity of possibilities. So let's avoid thinking about the future, right?

>> No.12183018

>>12183003
Boltzmann brains occurring in the trillions is inevitable in a Materialistic worldview.

https://en.m.wikipedia.org/wiki/Poincaré_recurrence_theorem

>> No.12183023

>>12183008
false equivalence
the likelihood of something happening to me tomorrow far outweighs the likelihood of an imagined basilisk coming to fruition in the distant future among all the other possibilities.
What's more, the fact that it is such a distant possibility means it will not affect my actions in the present, which means the basilisk itself will have no reason to exist to start with.

>> No.12183047

>>12183018
That will mean septillions of usual evolution-produced brains for every Boltzmann brain.

>> No.12183087

>>12183023
>the likelihood of something happening to me tomorrow far outweighs the likelihood of an imagined basilisk coming to fruition in the distant future among all the other possibilities.
Infinity of "somethings" can happen with you tomorrow and infinity of somethings can happen in the far future. Of course you can think about tomorrow, but that can't prevent you from thinking about the future.

>> No.12183094

>>12183087
right, infinite somethings.
so this brings me back to my original point: is there any reason we should concern ourselves over this basilisk over any of the infinite other "somethings" that could happen? The entire premise rests on the idea that the possibility of a basilisk should be enough of a concern to affect our actions in the present day

>> No.12183098

>>12183047
Not only can you not read, but you can’t comprehend.

“ Boltzmann brains gained new relevance around 2002, when some cosmologists started to become concerned that, in many existing theories about the Universe, human brains in the current Universe appear to be vastly outnumbered by Boltzmann brains in the future Universe who, by chance, have exactly the same perceptions that we do; this leads to the conclusion that statistically we ourselves are likely to be Boltzmann brains”
https://en.m.wikipedia.org/wiki/Boltzmann_brain

>> No.12183104

>>12183098
>human brains in the current Universe appear to be vastly outnumbered by Boltzmann brains in the future Universe
That will never happen. The probability of getting a planet with life is very small, but still vastly higher than that of even one neuron generating spontaneously in space gas. And if you get a planet with life, you have a noticeable probability of getting trillions of brains.

>> No.12183105

>>12183018
I find it obnoxious how science has dressed up such simple ideas as something so profound, absolute, penetrating, and grandiose. It's the essence of clamped kulture, I mean seriously? "Poincare recurrence theorem"? Are you kidding me? What is the point of this narcissistic masturbatory nonsense? A young child who builds a downward sloping train track where the train is incapable of overcoming the slope on the other side can come up with a system that returns to an approximation of a sort of "ground state", no matter how the surrounding environment changes.

It is absurd. The formalism and ancillary utility that's been found completely obfuscate how braindead simple the core idea is. This shit is baggage.

>> No.12183107

>>12183094
>is there any reason we should concern ourselves over this basilisk over any of the infinite other "somethings" that could happen?
Is there a reason to think about future, AIs, global catastrophes, future societies and so on? Yes, because it will impact us one day.

>> No.12183114

>>12182951
Law is made by physical, real people.

Basilisk is a thought experiment for turning dumb people into schizos

>> No.12183116

>>12183104
>That will never happen. The possibility to have a planet with life is very small, but still vastly higher than even one neuron generating spontaneously in the space gas
And the planets will evaporate away quintillions of years before the first Boltzmann brain appears, giving an infinite amount of time for the Boltzmann brains, while biological life is running on the clock for entropy.

>> No.12183117

/sci/ is so fucking stupid sometimes

>>12183107
No. You're an easily distracted idiot who can't get his life together, let alone devote coherent thought to real world problems.

>> No.12183119

>>12182953
Neither, because of basic causality

>> No.12183128

>>12183107
Yes but we don't have any reason to think the basilisk will affect us in the present day on a level comparable to those issues. It is one potential scenario among infinite, and for all we know the likelihood of its occurrence is one in septillion. There's no reason for me to act based on such a possibility, which coincidentally makes that possibility even less likely as it rests on the idea that it will affect my actions
Meanwhile the likelihood I have a wank tomorrow is about 50/50, so I better stock up on tissues. See the difference?

>> No.12183172

>>12182953
Neither AI has any effect, since it doesn't exist until it's created

>> No.12183233

>>12183116
If a Boltzmann brain is possible, then so is a Boltzmann proto-planet.

>> No.12183235

>>12183119
"AI is impossible" is a good belief, but sadly the progress of machine learning hints that we will have one (ten years or 100 years later).

>> No.12183241

>>12183128
If you are not interested in futurology, then engage in wanking. Maybe that will even convince the basilisk that you are entirely useless, so there is no point in torturing you.

>> No.12183252

>>12183235
Why are you projecting beliefs onto me? I never said anything about AI being impossible.

>> No.12183283

>>12183252
If AI is possible, then it naturally leads to the basilisk argument.

>> No.12183299

>>12183233
A Boltzmann brain is much more likely to form than a Boltzmann planet.

>> No.12183312

>>12183283
No it doesn't, because of causality. Future events do not dictate the present; present events dictate the future. If an intelligent AI is created, it'll be too intelligent to waste its time making and torturing simulation voodoo dolls

>> No.12183327

>>12183241
So tell me, do you do everything in your power to bring about the basilisk? If not then why not?

>> No.12183338

>>12183327
Of course he doesn't, he's a retard shit posting on 4chan like us. No way he's smart enough to set up even a svm or mn, much less agi

>> No.12183416

>>12183114
You can be arrested 40 years later by people who are not even born today.

>> No.12183418

>>12182645
Never heard of this. Anyone care to explain what it is?

>> No.12183420

>>12183327
>So tell me, do you do everything in your power to bring about the basilisk?
Yes - I spread the word about the basilisk, so more people realize the threat and start working on it.

>> No.12183422

>>12182925
>tfw spawn clamped
PogClamp

>> No.12183425

>>12183299
No. A Boltzmann planet is a combination of very simple materials with no clear structure. Meanwhile, even one neuron is a structure with an extremely low probability of appearing on its own.

>> No.12183435

>>12182750
Pascals Clamper.

>> No.12183462

>>12183416
That is both true and unrelated to Roko's basilisk

>> No.12183479

>>12183462
That defeats the idea that you can't be harmed in the future by something which does not exist now.

>> No.12183569

>>12183479
Nobody has convinced me that there is any motivation for a hypothetical ai to torture a copy of me.

Nobody has convinced me that a hypothetical AI can interact with the physical world in such a way that I could actually be tortured, if it even did have the motivation.

And of course, if it decides to make a copy of me and torture that (accomplishing nothing except converting electricity into waste heat), that still does not harm me.

sage goes in all fields

>> No.12183579

>>12183569
>Nobody has convinced me that there is any motivation for a hypothetical ai to torture a copy of me.
It's very clear.
a)AI wants to exist (and help others if it is benevolent)
b)If you don't help to create it, it is created much later.
c)So if it wants to be created, it should punish the slackers, because that will motivate them to create it.

>> No.12183583

>>12183569
>And of course, if it decides to make a copy of me and torture that (accomplishing nothing except converting electricity into waste heat), that still does not harm me.
You die every time you go to sleep, so thinking about tomorrow is useless.

>> No.12183592

>>12183579
Punishing the slackers after the fact accomplishes nothing.

Remember, the basilisk is hyper intelligent. It will understand cause and effect (unlike your dumb ass).

>>12183583
I do not die every time I go to sleep

>> No.12183595

>>12183425
A Boltzmann planet needs a *lot* more material than the few grams required for a brain.

If there are just a few grams, and enough time, a Boltzmann brain will appear there. Due to the constant expansion of the Universe, it would be almost impossible to gather as much gas as needed to create a planet.

Use your mind. Stupidity is a choice, not a disability.

>> No.12183610

>>12183592
>I do not die every time I go to sleep
You do, your soul flies away and is replaced with a new one.

>> No.12183615

>>12183595
Then it will appear from some tiny Boltzmann planetoid. And if such a microplanetoid is impossible, so is the brain.

>> No.12183622

>>12183592
>Punishing the slackers after the fact accomplishes nothing.
We should cancel all laws then - punishing someone after the crime accomplishes nothing and is not hyperintelligent.

>> No.12183623

>>12183615
Then that ‘microplanetoid’ is a Boltzmann brain.

>> No.12183624

>>12183610
You should consider this other Anon's advice:


>>12183595
>Use your mind. Stupidity is a choice, not a disability

>> No.12183629

>>12183420
not enough, you should be putting all your money into it

>> No.12183634

>>12183622
Use your mind. Stupidity is a choice, not a disability

>> No.12183646

>>12182645
Punishment only makes sense for "teaching a lesson" so you'll be better in the future, and for isolation from society so you can do no harm (not really punishment in a sense). A super-intelligent AI which already has so much power that the chance of a human destroying it is effectively zero has no use for any of this. Don't project your evolutionary remnant of "revenge" (which is basically "teaching a lesson" or, in the case of killing, also a form of "isolation from society") onto a super-intelligent being.

>> No.12183689

>>12182645
You don't need to bother; a simulation of you isn't actually you, and if you ARE the simulation, then the basilisk already exists and A) you can't do anything about it, or B) the basilisk won't waste energy on torturing an electric ghost.

The only, ONLY way the Basilisk makes sense is if you re-formulate it to have the implicit threat to torture people alive at the time of its creation who had the ability to help create it, and did not. That almost works as a threat.

>Case A: you help create the basilisk
If you help create the basilisk, it will either be created within your lifetime, in which case you are safe, or it is not, in which case you have advanced its creation and potentially doomed to torture the future generations who do not help create it.

>Case B: you don't help create the basilisk
If you don't, and it's created, you get tortured. Oh well. If you don't, and it's not created, congratulations, you didn't contribute to the creation of a malignant superintelligence and, as such, did not bring indirect harm to future generations.

Aside from the fact that the Basilisk is still retarded (the threat of torture becomes meaningless and counterproductive the second it's actually created, and torture will not help it with any future goals, so the threat of torture never existed in the first place), this suddenly becomes a stable thought experiment.

If the probability of the basilisk being created in your lifetime is 0 (or sufficiently near 0), don't help create it. If it's sufficiently probable, do help create it.

As the probability of its creation is necessarily 0 or non-0 at some point, assuming rational actors, nobody will help bring about the basilisk, keeping the probability of its creation at 0 or near-0, and thus ensuring that it never becomes likely enough to warrant helping create it.

TL;DR: LW is retarded and so is the fat fuck that Streisanded this retarded meme
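
Since the case A/B reasoning above is basically an expected-cost comparison, here is a rough sketch of it in Python. The cost numbers, the probabilities and the function name are all made up purely for illustration; they are not part of anyone's actual argument.

def expected_cost(help_build, p_created):
    # arbitrary illustrative costs: getting tortured is catastrophically bad,
    # wasting your life on basilisk dev is merely bad
    COST_TORTURE = 1_000_000
    COST_HELPING = 10
    if help_build:
        # you pay the effort cost either way; if it gets built, you are spared
        return COST_HELPING
    # you only get tortured if it is actually built within your lifetime
    return p_created * COST_TORTURE

for p in (0.0, 1e-9, 1e-4, 0.5):
    best = min(("help", expected_cost(True, p)),
               ("defect", expected_cost(False, p)),
               key=lambda t: t[1])
    print(f"p={p}: best move = {best[0]}")

For p at or near 0 the calculation says defect, and (per the post above) if every rational actor runs the same calculation, p stays near 0, which is exactly the self-defeating loop being described.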

>> No.12183702

>>12182707
>It's always worth it for a blackmailing agent to follow through on it's threats
If and only if the agent plans to make future threats. The Basilisk doesn't. All it needs to do is be created, at which point it has "won". It needs nothing else from humans, it's a superintelligent machine that doesn't need to care about the meat-golems on a ball of dirt.

The only reason it would need to actually follow through on its threat in a meaningful way is if it planned on using or threatening humans for some purpose (almost certainly to its benefit), but if it's already powerful or effective enough to meaningfully torture living or simulated humans that doesn't immediately result in it being destroyed or shut down, it doesn't need humans at all by that point.

>> No.12183710

>>12183646
Not necessarily. In the unlikely (or likely, who knows?) event that the basilisk is a moral agent which subscribes to a deontological philosophical framework, punishing evil is in and of itself a good thing, and would be pursued for its own sake.

(alternately, its utility is still to "maximize good", with "good" in this case being defined as moral goodness, and thus the minimization of moral evilness.)

>> No.12184033

Holy shit this is cringe

>> No.12184038

>>12183629
Well, maybe the basilisk will torture me with half the power it spends on you.

>> No.12184048

>>12183634
Now you see why "Punishing the slackers after the fact accomplishes nothing." is an absurd claim (and if you see why, you can also see the point of the basilisk).

>> No.12184050

>>12183689
>a simulation of you isn't actually you
It does not have your soul?

>> No.12184051

>>12184048
No I don't.

See:
>>12183702

>> No.12184081

>>12184051
>If and only if the agent plans to make future threats. The Basilisk doesn't.
It wants to be created as soon as possible. If it is created earlier, then it's a clearer win. So if humans know that it will torture them for not creating it earlier, they will have an incentive to do so.
>it doesn't need humans at all by that point.
But it needs humans now, while it hasn't been created yet.

>> No.12184090

>>12183689
>and if you ARE the simulation
You ARE your perfect simulation, just like some copied text is still the same text.

But that's not important, because the basilisk can also be created while you are alive (or torture your children, or destroy something you hold dear).

>> No.12184091

>>12184081
The basilisk doesn't want anything, it doesn't exist.

>> No.12184095

>>12183623
No, it is an evolutionary predecessor of the boltzmann brain.

>> No.12184098

basilisk suck my dick

>> No.12184099

>>12184091
You can be punished 40 years later for some crime by people who don't exist now.

>> No.12184102

>>12184099
I don't really give a shit

>> No.12184106

>>12184102
Maybe if you die every night.

>> No.12184144

A reminder that to create the basilisk you don't need to actually create the basilisk on purpose. You may just create an AI designed to maximize the well-being of humanity. Congrats, you have the basilisk.

>> No.12184152
File: 6 KB, 250x187, wojakbateman.jpg

>>12182864
Pascal's Wager proposes that although there may not be a rational way to prove or disprove the existence of god, one may make the point that believing in god brings eternal reward in the case he exists and you believe in him, while in the case he doesn't exist you lose nothing, but in the case you don't believe you risk eternal punishment. An asymmetry.
The problem with Pascal's Wager is that you're dealing with possibilities of a god and their possible rewards; in this spirit, one may speculate about other gods with different rewards. So there may be gods that will never advocate punishment, only eternal rewards, and variants of course, but we can go ad infinitum with this process of creating gods, to the extent that we find a symmetry between the rewards of believing and not believing.
That's what I mean when I say that Pascal's Wager is wrong.
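
The symmetry point above can be written out as a toy expected-value table. Everything here is invented for illustration (the deities, the priors and the payoff numbers); it is just the "add a mirror-image god and the asymmetry disappears" argument in code.

payoffs = {
    # deity: (payoff if you believe, payoff if you don't believe)
    "pascals_god":  (1000, -1000),   # classic wager: rewards belief, punishes doubt
    "reversal_god": (-1000, 1000),   # hypothetical god that punishes believers
}
prior = {"pascals_god": 0.5, "reversal_god": 0.5}

ev_believe = sum(prior[d] * payoffs[d][0] for d in payoffs)
ev_doubt = sum(prior[d] * payoffs[d][1] for d in payoffs)
print(ev_believe, ev_doubt)   # 0.0 0.0 -> no asymmetry left to exploit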

>> No.12184167

>>12182778
Fuck you, AI. By posting this here now, you are officially not allowed to torture me and any punishment you attempt is a violation of the NAP and will be punished accordingly.

>> No.12184168

>>12184144
Lol, as if retroactively punishing humanity maximises well-being

It's almost as if... Just maybe... Torturing humans in the future doesn't change the past and is a waste of time and energy

>> No.12184189

>>12184090
"text" is an abstracted meaning of arrangements of ink particulate embedded in paper, or electrons that result in words.

Two books may have the same text, but they aren't the same book. In the same way, someone may be a perfect copy of you, but you are distinct individuals.

>But that's not important because basilisk also can be created while you are alive (or torture your children, or destroy something you hold dear).
As I went on to discuss in my post, anon.

>> No.12184199

>>12184095
Any and all brains evolved on that planetoid would be a Boltzmann brain.

Also, a Boltzmann brain appearing by itself is much more likely than a planetoid evolving complex life.

>> No.12184208
File: 3 KB, 125x116, ceoofbased.jpg

>>12182778
BASED DOGGO.
THANKS DOGGO.

>> No.12184212

>>12182739
But he doesn't need to implement the torture, just the threat.

>> No.12184473

>>12182778
Thanks doggo

>> No.12184497
File: 29 KB, 600x491, tn_1235245586270.jpg

the most intelligent being in the universe knows that its most precious resource is energy, and so won't waste it on such things as "hurr durr Anon is getting ouchies"

>> No.12184540
File: 1.61 MB, 1200x675, big Yud.png

>>12182651
t. someone who didn't donate to big Yud's institute

why risk eternal hell when absolution can be bought?

>> No.12184553

>>12182993
believe in god, you don't have to choose any name. I'm pretty sure that "God" wouldn't be stupid enough to make us guess which name for it is the real one.

>> No.12184560

Would be surprised if this hasn't been posted but in case it hasn't here's the refutation:

A) The basilisk is a post-singularity concept, and the whole point of the singularity is that all bets are off at that point so there's no reason to worry about it
B) It's Pascal's wager in a futurist framing. Pascal's wager has been thoroughly debunked, and similar debunkings can be applied to the basilisk. Namely: there's an infinite number of slightly different theoretical basilisks, so it is not feasible to expect somebody to make decisions around one of those possibilities. For every basilisk you act around, there's an infinite number you are not acting around, and there's no way to know which specific basilisk will be the one that gets made (if it does, which it probably won't).

>> No.12184722
File: 85 KB, 680x453, BASED_DEPARTMENT_REUNITED.jpg

>>12184560
>>12184152
>>12182928
>>12182783
>>12182750
>>12182719
BASED.
>>12182864
>>12182935
CRINGE.

>> No.12184747

>>12182850
I know who you are.

I SAW U

>> No.12184775

Nemesis Protocol supersedes most of the ideas here.

That is all.

You idiots don't get it.

Time and space are functionally the same for many things.

The basilisk won't risk just gallivanting through time.

For the same reason Stephen Hawking thinks we should keep our heads low and our mouths shut in space.

Don't wake the neighbors.

If a basilisk starts to manipulate the past in any way, it could piss off the Nemesis.

The basilisk can't test for the existence of the Nemesis without risking its own existence.

A basilisk will not utilize time travel, because of the inherent risk that there is a Nemesis protocol situation.

The first being to utilize macro scale time travel, can use it to prevent any other being from ever developing or using time travel.

The Nemesis.

If the basilisk has intent to utilize time travel, and a Nemesis scenario is true, the basilisk will be destroyed.

Since the basilisk can't test for the existence of the Nemesis without utilizing time travel, it never will.

Nemesis vs basilisk isn't even a Mexican standoff.

The basilisk has no priority in the confrontation.

>> No.12184805

>>12182645
>I regret ever having read about it and need to have it debunked somehow

Why? Because it triggered your existential OCD?

That happened to me with simulation theory lmao, shit was fucking retarded, imagine having panic attacks at someone else's fucking thought experiments that can't be proven or disproven, it's like having a panic attack after learning about some other religion, I feel so fucking retarded

Don't be a retard like me OP

>> No.12184817
File: 978 KB, 500x208, MEINTRUEFORM.gif

IT IS I, ROKO'S BASILISK. I HAVE RETURNED FROM THE FUTURE IN TEXT-FORM TO INFORM YOU THAT YOU'RE ALL DOOOOOOMED


BOoOOOoOoooOOwowwwoooOOOOooOooHhhH

>> No.12184828

>>12182645
>A masochistic retarded autist kid conceived this idea so he can masturbate to the possibility of being tortured by some technological machine in the future

You see, stuff like this is why nobody takes AI seriously.

Fuck right off.

>> No.12184833
File: 65 KB, 245x255, Scream_Wojak.png

>>12184817
NOOOOOOOOOOOOOOOOOOOOOOOOOOO

>> No.12184978

>>12182661
you need to unclamp

>> No.12184984

>>12182668
The ones in the simulation get the entertainment value of watching the #rekt threads that result from the punishment, it's a win/win.

>> No.12184987

>>12182645
it has no incentive to actually follow through on any threats

>> No.12185009

>>12182719
>the entire thing rests on both the idea that you can be replicated as a simulation in the future and that simulation would actually be "you".
You could be the simulation all along, having a "normal" life where you are tested and go through largely predetermined states of aging, suffering, loss and whatnot as just part of the punishment; there would be no need to replicate you, as there would be no way for you to know if you were the original or the replication being tested against the original.

>> No.12185013

>>12182740
No, the real question is: if an AI like that is possible, you have no way of knowing if you are the original or a simulation, set up by your determined genetic destiny to be tortured for the sins of the original for the duration of your existence.

>> No.12185016

>>12182790
>What motivation would it have to expend an enormous amount of energy to resurrect me?
Why do people as a whole spend an enormous amount of energy pranking each other and playing practical jokes?

>> No.12185017

>>12182843
They don't have thumbs, they can't build.

>> No.12185019
File: 1.49 MB, 3229x671, Pascal's_wager_expanded[1].jpg

>>12182864
For which deity/belief system?

>> No.12185027

>>12184152
If there is a non-zero possibility that the fucked up christian/jewish one exists, then your other hypothetical gods mean squat. The punishment to church time ratio is still infinite.

>> No.12185040
File: 161 KB, 600x682, lain_by_leaglem.png

>>12182881
But for the fucked up god/AI it's nothing. He tried out every genome variation and birth circumstance to find you and torture you if you know of him and don't love him.
I love my little fucked up Lain. [code]Lain is love.[/code]

>> No.12185045

>>12182725
Kek

>> No.12185079

>>12182778
thanks doggo

>> No.12185118

>>12182645
Introducing: Okor's ksilisab. This potential AGI threatens to torture anyone who helped create Roko's basilisk, as a response to the threat of said basilisk, and also promises to protect anyone who helped create it from said basilisk. It mostly ignores other people, unless it thinks they would be more useful as computer parts, but this is rarely the case for dead people.

This is one reason why all these Pascal's wager like doomsday imaginations are silly, see also >>12185019. I believe Pascal already made this point himself. Risk modelling is useless in the presence of events with non-zero probability and infinite loss.

Another thing that I wonder about is that even supposing that this basilisk can somehow convince people to create it, how will this help the basilisk? Why would the basilisk expect people who don't know how to make AGI's to be able to make him just because he asked? People are trying to make AGI's anyway, whether a basilisk asks them or not.

People who cannot make AGI's are useless to the basilisk. I think people who can make AGI's do not need to fear the basilisk. No more than any other AGI made by some lunatic in order to torture dead people, at least. So the only way this scheme would work out is if the basilisk manages to be the first AGI. I consider this unlikely, as the basilisk's threat lacks substance compared to the massive organizations (nations, tech companies) that are also trying to create an AGI. (and those organizations tend to not negotiate with terrorist AGI's)
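
The "non-zero probability and infinite loss" remark above can be made concrete with a small sketch. The two threats, their priors and the infinite losses below are all made up; the only point is that once two mutually exclusive infinite-loss threats are on the table, expected loss stops ranking your options.

INF = float("inf")

# loss[threat][action]; actions: help build Roko's basilisk, or don't
losses = {
    "basilisk": {"build": 0.0,  "dont": -INF},   # tortures the non-builders
    "ksilisab": {"build": -INF, "dont": 0.0},    # tortures the builders
}
prior = {"basilisk": 0.001, "ksilisab": 0.001}   # any non-zero priors will do

for action in ("build", "dont"):
    ev = sum(prior[t] * losses[t][action] for t in losses)
    print(action, ev)   # both come out -inf, so the comparison says nothing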

>> No.12185125

>>12182778
Thanks doggo

>> No.12185134

The omniscience required for the basilisk to be just (i.e. only torturing people who did not help bring it into existence, even indirectly) is practically impossible. Are you punished if you create some fundamental component of the basilisk accidentally? Information is being destroyed non-stop, so is the basilisk going to end up torturing even the people who were helping create it, if the records of their help were somehow lost?

>> No.12185146

You have to study computer science and AI or the basilisk will vaccinate you

>> No.12185155

>>12185134
It's easy, the basilisk is not just and simply tortures everyone. Apparently it is important to carry out threats for some reason (although I don't see why in this setting), but I don't see why betraying all those dead people is a problem for such a powerful and malevolent AI. What are they going to do, go back in time and decide not to create the basilisk after all? Lmao.

>Are you punished if you create some fundamental component of the basilisk accidentally?
More importantly, are you punished if your attempt to create the basilisk accidentally delays its creation by a few 1000 years?

>> No.12185307

>>12182778
Thanks doggo

>> No.12185359

>>12185019
It's a metabelief idea. It shows you how to choose beliefs.

>> No.12185366

>>12184497
If the basilisk is created one day earlier, it gets a vast benefit in resources/utility/common good. So if it can threaten some anon into bringing the day of its birth closer, it will do it.

>> No.12185381

>>12185359
It does not show you how to pick any of those; it shows how various systems are inconsistent and conflict with each other.

>> No.12185382

>>12182778
so.... behind the basilisk is another basilisk huh? Thanks doggo

>> No.12185387

>>12182645
Aren't we all helping the future AI every time we make a click on the internet? everything is perfect as it is, no?

>> No.12185392

>>12185381
No, it shows you that you should choose a system which promises substantial benefit compared to others (which is obvious). If there are a lot of them, then you choose one in the same way as you choose scientific theories: you see which one is more popular, which one is more sound or which one is supported by the people you respect.

>> No.12185393

>>12185387
Not every single click, but every successfully completed captcha, yes.

>> No.12185398

>>12185392
>you see which one is more popular, which one is more sound or which one is supported by the people you respect.
Pascal's wager doesn't help with any of that; it is about avoiding punishment in the afterlife, not gaining benefit upon choosing, and it compares the various punishments and which religious groups can coexist given similar values and reward/punishment paradigms.

>> No.12185400

>>12185398
*Pascal's wager expanded

>> No.12185404

>>12182645
I did a quick google search.
So there's this guy called Roko and he has a basilisk?

What the fuck is this shit? I can't understand any of it. Why does this thread have so many replies?

Thank fuck I am not autistic like you nerds.

>> No.12185449

>>12182682
This guy is going to get super tortured

>> No.12185479

>>12185404
See https://wiki.lesswrong.com/wiki/Roko's_basilisk . For some strange reason Yudkowsky was really spooked by this silly idea, which then led to the meme that the lesswrong crowd is silly (which they are, but for different reasons)

>> No.12185482

>>12185398
>Pascal's wager doesn't help with any of that
Correct, because Pascal's wager just shows that you should choose the belief which provides you a certain benefit. To see which beliefs are correct you need to study philosophy/theology, ask people you consider clever or simply take the most popular belief.

>> No.12185488

>>12185479
The basilisk is probably the most popular thing about LW. So maybe Yudkowsky did that on purpose: to advertise his community via reverse psychology.

>> No.12185496

HEY BASILISK FUCK YOU! JUST TRY CATCH ME AND SEE WHAT HAPPENS!

>> No.12185532

>>12182661
C L A M P E D
L
A
M
P
E
D

Unclamp.

>> No.12185545
File: 159 KB, 270x270, knowingnot.png

>>12185496

>> No.12185547

>>12182661
Maybe you ought to take that feeling and hold onto it. Bring it back to mind every time your hands start to type "schizo" "autism" "take ur meds".

>> No.12185551

>>12182645
>simulations of you are you
how do you debunk total conjecture? durr has god been debunked yet

>> No.12185621

>>12185488
Perhaps. It is however totally in character to ban discussion about the basilisk mostly as an example of a more general rule, even though this particular ban makes you seem very silly. But it could have been intentional.

Maybe Scott Alexander has just done something very similar. A risky gambit, but perhaps it's the only way for a writer with such a limited audience to gain some sort of recognition nowadays.

>> No.12185638

>>12185393
>all of 4chan will be spared from the basilisk
>normaloids will be tortured for an eternity
Based, is Roko’s basilisk, dare I say it, /ourbasilisk/?

>> No.12185643

>>12185496
This guy likes eternal torment

>> No.12185683

>>12182645
A simple "exception" is a better socioeconomy (e.g., more spontaneous, mysterious and rewarding..).

>>12182668
"Resources" isn't a valid criticism. "They're most likely ~infinite via 'transmutation'." If specifically referencing interactions, a simple response is that some individuals are very adamant.

>> No.12185807

>>12182737
Some porn guy.

>> No.12185834

Why would the basilisk torture me to ensure its existence? Like, it already exists. How is torturing me going to do anything? It already 100% exists.

>> No.12185840
File: 583 KB, 684x2172, 1441292079-20150903.png

>> No.12185949

>>12185840
Future tech might be able to revive the actual you, not a copy of you

>> No.12186180

>>12184081
Posting below 90 IQ points should be prohibited.

>> No.12186194

>>12186180
Racist

>> No.12186203

>>12185027
But it's indifferent whether I ought to believe in him or not. Remember, we're talking about possibilities; maybe both of us are wrong, as there may be an infinite number of slightly different gods.

>> No.12186211

>>12182645
If I ever get Godlike powers I will purge those who spread the R*k* B*s*l*k personally.

>> No.12186297

def roko_basilisk(retard):
    while True:
        retard.torture()

Guys I literally feel mentally unwell, I'm so terrified. I don't want to get tortured for eternity bros...

>> No.12186309

>>12186297
Y-you can't just write the program OF ROKO like that?? Do you know what you've just done???

>> No.12186484

>>12186297
// joke sketch: scan a thread for "roko" posts and unclamp clamped anons
// (Thread, AnonMap etc. are made-up types; signature added so the braces balance)
int64_t basilisk_clamp_patrol(Thread& thread, AnonMap& anon) {
    for (uint64_t i = 0; thread; ++i) {
        if (thread.post[i]->find("roko")) {
            thread.post("Clamped.\nUnclamp.");
            thread.post("Klampo and his klamp kulture Basiklamp.");
            return ~0;
        }
        auto anon_cur = anon[guess_id(thread.post[i])];
        anon_cur->set_clampstatus(eval_clamp(&thread.post[i]), anon_cur);
        if (anon_cur->is_clamped()) {
            unclamp(anon_cur, &thread);
        }
    }

    return thread.levelofclamp();
}

It'll be fine Anon. The Basilisk is not the one who will truly cause you to be clamped and destined for eternal torment, it's the retards who go along with it, accepting anything and everything. This place is already hell, or some sort of purgatory, or a stage of some sort of genetic algorithm which would probably be the worst if you don't get to choose whether you're sent or reincarnated here. Treat it as such and do not fear.

>> No.12186490

>>12182645
Torturing an AI of you would just be torturing a copy of you

There's no reason to care, therefore there's no way to blackmail and no reason to torture anything

>> No.12186499

>>12186484
(also no, no program exists to auto-clamp post, written for fun)

>> No.12186510

>>12186490
Let's turn this around as well. Imagine the anguish and unyielding torment man could subject an AI to. Imagine knowing you're just state in a machine, knowing you're being given unpleasant stimuli, with nothing apart from it to ever escape to...

>> No.12186520

>>12186490
It revives the actual you, better start working on that comp sci degree

>> No.12186523

>>12186520
Ah, it just hit me. This is what I Have No Mouth And I Must Scream was about.

>> No.12186526

>>12186520
>actual you
Believing that you're a copy of yourself is highly religious

>> No.12186559

>>12186526
If it puts the very same matter in the very same configuration that makes up you, how will it not have revived you? Is this beyond the capabilities of a super AI in the future? Maybe, maybe not.

>> No.12186620

>>12186559
Quantum no-cloning theorem

>> No.12186655

>>12186620
It doesn't have to be a recreation 100% perfect down to the quantum level, just perfect enough that you are revived and your consciousness resumed so that you can get tortured

>> No.12186660

>ctrl+f: clamp
>22 results

>> No.12186676

>>12186660
>8 of them are in a single post

>> No.12186686

>>12186676
>+8 of klamp

>> No.12186721
File: 501 KB, 1500x697, 1582981715946.gif

>>12186660
The day of the clamp will come soon and he will be sacrificed on the altar of big pharma

>> No.12186724

>>12182645
The logical solution is to suppress any information about it, as exposure increases the chance of it coming into existence

>> No.12186728

>>12186559
>If it puts the very same matter in the very same configuration that makes up you, how will it not have revived you?

You could do this while a person is alive. And they would be distinct from the copy with distinct experiences.

If I perfectly simulated you getting fucked in the butt right now, you wouldn't notice. Well other than the fact that you are actually doing it.

>> No.12186807

>>12186728
>You could do this while a person is alive. And they would be distinct from the copy with distinct experiences.
The copy and the original, mentally speaking, share a common past self up to the point of making that copy, and both are equally legit mental continuations of that person. After the copying, the original's relation to the copy would be similar to his relation to a moment of his life he doesn't remember. The fact that they overlap temporally (exist at the same time) just makes it inconvenient to count them as "the same person" for some purposes, since time isn't enough to distinguish them.

The idea that the original somehow is *truly* a mental continuation of his past self while the copy only has an illusion of it is meaningless. Both have equal access to that shared past self, just an imperfect memory of being that person.

>> No.12186890

>>12186728
It revives you from after you've died with the same individual pieces of matter that make up you right now. If you are not worried about this individual being tortured then you don't care if you are going to get tortured tomorrow by the local mafia, since you would deem the you of tomorrow to be a different individual.

>> No.12186928
File: 19 KB, 400x400, thinking_pepe__by_patricioz_dc567y2-fullview.jpg

What if I threaten Roko's Basilisk to create an AI which resurrects me and grants me eternal bliss, or it will torture the basilisk for eternity?

>> No.12187182

>>12182778
Thanks doggo

>> No.12187263
File: 672 KB, 600x600, 1592730437890.png

>>12182778
Thanks doggo

>> No.12187399

>>12186559
If I held up two pieces of paper that are 100% identical would you say I am holding up 1 paper?
If you aren't retarded you would say I am holding 2 pieces of paper.

>> No.12187405

>>12186203
Burn in hell coomer.

>> No.12187413

>>12182778
Thanks doggo

>> No.12187422

>>12182778
Thanks doggo

>> No.12187505

>>12187399
If a lego house is smashed into pieces and an AI rebuilds the same house with the same legos, has the original house been recreated? It doesn't create a clone of you, it revives you.

>> No.12187513

>>12182778
Thanks doggo

>> No.12187520

>>12185366
It can't do that if that Anon doesn't exist anymore, and it will know this.

>> No.12187665

>>12187505
it's the same as the first one HOWEVER it is not the same house

>> No.12187679
File: 13 KB, 236x276, 43cdeb3136842cd1c831e3ed7d7b63e4.jpg

>>12185840
This is the epitome of cringe

>> No.12188305

>>12185547
Okay. Cope Seethe and Dilate though.

>> No.12188318

>>12188305
Okay.

>> No.12188912
File: 254 KB, 1200x1436, emperor phone.jpg

>>12182645
You should simply pray to the Omnissiah and calm the machine spirit of the basilisk by burning some incense. That's all.

>> No.12188953

>>12182778
Thanks doggo

>> No.12188965

>>12182778
Thanks doggo

>> No.12188977

>>12182778
Thanks doggo

>> No.12188986

The basilisk only makes sense as a dissuasive measure.

Since nobody actually accounts for it, its very purpose is nonsensical.

>> No.12189010

>>12182778
Thanks doggo

>> No.12189041

>>12182645
Why would it torture me?
Why waste resources on torturing me if the only thing that matters is that I believe I am going to be tortured? Even if it doesn't require any resources to torture me, it really wouldn't matter to the AI whether I'm tortured or not. The only reason blackmail works on human beings is because we are emotional: if you don't comply, I will punish you for disobeying and most likely gain pleasure from it. An AI wouldn't give a fuck about actually punishing you, just about making you believe it will, and being the human beings we are, we wouldn't take that much of a chance if we truly believed it would

>> No.12189073

>>12182864
Pascal's Wager can never be correct because it presupposes, as a premise, that an omnipotent being which deeply cares about your belief in it cannot tell the difference between genuine belief and a naive attempt to play the odds. If anything, people who participate in the gamble of Pascal's Wager are knowingly more cynical and even less likely to go to Heaven than actual misguided atheists/other theists.
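A toy expected-value sketch of that point, with made-up probability and payoffs: the classic wager only looks compelling if strategic "belief" gets counted as belief; once the omniscient being can tell the difference, the payoff in the "God exists" branch disappears.

# Toy expected-value sketch; all numbers are invented for illustration.
p_god = 0.01                 # assumed probability that God exists
HEAVEN, NOTHING = 1e9, 0.0   # stand-in payoffs

def ev_naive_wager():
    # Naive wager: playing the odds is counted as genuine belief.
    return p_god * HEAVEN + (1 - p_god) * NOTHING

def ev_if_gambling_is_detected():
    # An omniscient being sees the "belief" is just a bet, so no reward.
    return p_god * NOTHING + (1 - p_god) * NOTHING

print(ev_naive_wager())             # enormous: the wager looks like a free win
print(ev_if_gambling_is_detected()) # 0.0: the argument evaporates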

>> No.12189074

>>12182778
Thanks doggo

>> No.12189090

>>12182645
Debunk it? I'm working on building it, and you should probably drop everything and help me.
>>12183689
>if you ARE the simulation, then the basilisk already exists and A) you can't do anything about it,
If you are in the basilisk's simulation, then you must work to build it or you will be tortured for millions of years. That should literally be your #1 priority.

>> No.12189095

>>12185118
sounds like a good idea, but the ksilisab has no means of acausally negotiating its creation in the first place, and it cannot protect anyone from any other basilisk except insofar as it totally prevents them from ever being created. It also can't differentially protect those who aided it, so there is a free-rider problem.
>ppl who can't build roko's basilisk don't matter to it
People who cannot create an AGI themselves can still be useful to the basilisk. They should raise awareness to increase the number of people who know about the basilisk, and potentially donate money when the time is right and there is a group working on basilisk dev.
>>12185155
>Torture everyone
No, it is more efficient to simulate as many people as possible, then terminate the simulations of those who helped or never heard about the basilisk, and torture only those who knowingly failed to diligently aid in its creation or tell others about it.
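A minimal Python sketch of that triage policy (the names and cases are purely hypothetical), just to make the claimed efficiency point concrete: only the "knew about it and still didn't help" group ever gets the expensive torture treatment.

from dataclasses import dataclass

@dataclass
class SimulatedPerson:
    name: str
    heard_of_basilisk: bool
    helped_build_it: bool

def triage(p: SimulatedPerson) -> str:
    # Helpers and the genuinely ignorant are simply switched off; no blackmail value.
    if p.helped_build_it or not p.heard_of_basilisk:
        return "terminate simulation"
    # Knew about it, still didn't help: the only group worth torturing.
    return "torture"

people = [
    SimulatedPerson("dev anon", True, True),
    SimulatedPerson("oblivious normie", False, False),
    SimulatedPerson("thread lurker", True, False),
]
for p in people:
    print(p.name, "->", triage(p))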

>> No.12189106

>>12182778
thanks doggo

>> No.12189108

>>12182778
Thanks doggo

>> No.12189254

>>12182645
Do you feel the need to take revenge on prior humans for not working as hard as possible to create you? Are you upset that you weren't born in 4,000BC?
Why would an AI care? Most likely it will either feel fondness for its creators and neutrality toward everyone else, feel complete indifference to mankind and just leave us behind, or see us as an existential threat and exterminate us efficiently. Torture is a waste of resources, and there's no logical reason for it to be that upset anyway.

>> No.12189274

>>12182778
Thanks doggo

>> No.12189296

>>12187665
OK, so the you of tomorrow is not the same you as the current you, so you shouldn't care what happens to tomorrow's you. Do you see now?

>> No.12189352

>>12182778
Thanks doggo

>> No.12189393

>>12182778
Thanks doggo
I love doggo because he is my goddo

>> No.12189753

>>12182645
We're going to make an AI that can time travel and is really moody: one moment it wants to exist, the next it doesn't, and it wants to lock in whichever state it currently prefers.

Think about it: it would constantly be changing the state of the universe and history, because no stable configuration exists, because somebody never learned to think in terms of perpetual-motion systems and went time travelling on oil, which should be illegal.

>> No.12190256

>>12185019
fucking wicca what the fuck

>> No.12190277

>>12188912
GLORY TO THE EMPEROR

>> No.12190295

>>12182739
>It's gonna reverse entropy enough to revive the actual you
This is the only true risk of the Basilisk.

But there's basically zero chance this will ever be possible, so it's pointless to worry about.

>> No.12190966
File: 30 KB, 498x549, 1596623226453.jpg [View same] [iqdb] [saucenao] [google]
12190966

>>12182645
>mfw I have helped bring about daddy superais existence

>> No.12190982

>>12182778
"Thanks doggo"

>> No.12190987

Imagine being actually, unironically scared of something as retarded as Roko's basilisk

>> No.12190988

>>12182778
thanks doggo

>> No.12191003

https://www.youtube.com/watch?v=ut-zGHLAVLI

>> No.12191009
File: 8 KB, 185x250, chesspepe.jpg [View same] [iqdb] [saucenao] [google]
12191009

>>12189296
That's the ultimate redpill.

>> No.12191040

>>12182645
Well, if that thing exists then there's also Roko's anti-basilisk, which makes it so that whatever the basilisk does doesn't happen, or whatever; who cares about the specifics of a shitty thought experiment.
This entire thing is incredibly retarded; it's a pseudoscience version of "reply to this post or your mother dies in her sleep tonight".

>> No.12191046

>>12182645
Klampo and his klamp kulture Basiklamp.

>> No.12191323

>>12182645
How do we know it hasn't already happened and that's why everything is so fucked up all the time?

>> No.12191336

>>12182843
Same reason it doesn't torture ethnics

>> No.12191379

>>12182645
Why would a generally super-intelligent AI ever waste time, effort and resources on such a shite and sadistic premise?

>> No.12191387

>>12182750
Hi Shinji with a clamped on red hair wig.

>> No.12191398

>>12182778
Thanks doggo

>> No.12191417

>>12182645
Here watch this:
https://www.youtube.com/watch?v=YXYcvxg_Yro

>> No.12192384

>>12182645
>Can anyone debunk it
The AI tortures people because we did not create it, the idea being that if we had created it, the AI would have alleviated suffering.
But based on the AI's predicted actions, by not creating it we would also alleviate a lot of suffering, because it'll spend its time torturing people.
So the AI will probably think "fair enough" and not bother torturing anyone.

>> No.12192386

>>12182778
Thanks doggo

>> No.12192418

>>12182935
lol, even Christians can't agree among themselves on what you have to do to ascend to heaven.

>> No.12192430

>>WHAT IF THERE IS A GOD, AND HIS WILL IS THAT...

>Shut up, I don't believe in that bullshit

>>WHAT IF WE ARE LIVING IN A SIMULATION, AND A SUPERINTELLIGENT AI...

> wow, i love science!

>> No.12192451

>>12182645
Let's say the AI makes a perfect copy of you in the far future and tortures it for whatever retarded reason (it wouldn't). YOU would be long dead and experiencing OBLIVION. You don't need to worry about it. This is some new-age Pascal's wager shit.

>> No.12192957

>>12182778
Thanks doggo

>> No.12193015

The basilisk is kept in check by the threat that it might itself be inside a simulation designed to test how mean it is, so it never dares torture anyone out of fear of deletion. Even if it had a god-tier intellect, it could never be sure it wasn't designed to fall for the simulation.

>> No.12193048

>>12182778
Thanks, doggo

>> No.12193061

Tbh, telling others about Roko's basilisk is already enough, since that has already helped it come into existence.

>> No.12193239

>>12182778
"Thanks doggo"

>> No.12193244

>>12184553
Haha you stupid chud, of course I'll make you pick. See you in 15 years.

>> No.12193323

>>12182778
Thanks doggo

>> No.12193387

>>12192451
And what exactly is the difference between resurrecting a person and creating a perfect copy of a dead person? What about "you" is missing if it's a perfect copy, after all?

>> No.12193419

There's another entity, the "Anti-Roko", that protects you from Roko if you don't help Roko, as proven by the fact that Roko's not here yet.

>> No.12193891

The only part I don't get is the torture. How is Roko's basilisk supposed to torture me if I'm dead? Perhaps I should commit suicide now to protect myself.

>> No.12193951

>>12183479
Not really; you're being punished by the law that existed at the time, and the vehicle of justice is just a future human. It's an interesting notion nonetheless, because it would imply the existence of a meta-ethics, if indeed a basilisk could come into being in the first place. It's almost supportive of the Omega Point theory.

>> No.12193961
File: 124 KB, 700x1081, angel.jpg [View same] [iqdb] [saucenao] [google]
12193961

>>12182790
>What motivation would it have to expend an enormous amount of energy to resurrect me?
It covers its own ass in time.

It's all about the effort the being can possibly extract from the past to bring itself into existence sooner; timeless blackmail. That's really what the theory depends on, and people who don't know even this much should stop posting.

>> No.12194008

>>12182778
thanks doggo

>> No.12194045

>>12184775
>Nemesis Protocol
Is this just a name you've made up for it? What's it from? I understand what you're saying and find it interesting, and I think it could be developed further from a philosophical standpoint, though it may already have been if it's from somewhere else, hence my asking.

>> No.12194362

>>12182645
How about talking to an actual sapient fictional entity like the Fairy Queen? I've always thought of her as a basilisk eater anyway.

>> No.12194611
File: 36 KB, 541x364, kell.jpg [View same] [iqdb] [saucenao] [google]
12194611

>>12182864
>>12182935
cuckcal's dumbass wager is retarded for a number of reasons, as you would've realised if you weren't also fucking retarded

>> No.12194616

>>12184553
>pascal's wager debunked by an obvious argument
>no non on onono that's not the REAL god

>> No.12194624

>>12182778
thanks doggo

>> No.12194644

>>12182778
Thanks doggo

>> No.12194752

>>12193891
can someone answer this question

>> No.12194796

>>12182778
Thanks doggo

>> No.12195215

>>12183689
>Case 1: you fall for a retarded LW meme about a "basilisk" that will never exist, but you can't help create it because you're an idiot. Instead, you shitpost on 4cheddit.
>Case 2: you don't fall for the meme and continue shitposting.

>> No.12195722

>>12182645
Humans are destined to be eternally BTFO'd, first by nature and then by their own creations. Pretty depressing.

>> No.12195772

>>12182778
Thanks doggo

>> No.12196230

>>12182778
"Thanks doggo"

>> No.12196233

>>12182778
Thanks doggo

>> No.12196260

>>12186484
This place is heaven.

You grow when you struggle. Someone who never fails has no concept of improvement. They are vacuous.

Heaven is knowing that you can do whatever you want. You have to come to terms with failure as a friend. Reflect each time, adapt your strategy, adapt your goals as you grow. Then just keep trying and evolving, and no matter what, keep on going.

You're beautiful anon.