
/sci/ - Science & Math


>> No.15976877
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=tPiq4njipdk

>> No.15930719
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=tPiq4njipdk

>> No.15857501
File: 48 KB, 652x425, existential risks.jpg

>>15857495
You could actually be preventing yourself from being tortured eternally.

https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>Methods which may reduce the probability of indefinite worse than death scenarios (in order of effectiveness):

>1. Suicide

>2. Working on AI safety

>3. Thinking of ways of reducing the probability

>Suicide, depending on your theory on personal identity, may make the probability 0. If you believe that there is no difference between copies of you then there may be a possibility of being resurrected in the future however. As we aren't certain about what happens to the observer after death, it is unknown whether death will make worse than death scenarios impossible. I believe there are many ways in which it could reduce the probability, but the key question is: could it increase the probability? An argument against suicide is that it is more likely that people who commit suicide will go to "hell" than those who don't. This is because an entity who creates hell has values which accept suffering, making life a positive concept which should not be discarded. On the other hand, an entity with values related to efilism/antinatalism (philosophies in which suicide is generally accepted) would not create a hell at all. Of course, this is all based on a lot of speculation.

>There is a risk that the suicide attempt will fail and leave you in a disabled state. This could make you more vulnerable when considering indefinite worse than death scenarios. However, I would argue against this disadvantage because the only potential way to evade an entity powerful enough to cause these scenarios would be suicide, which always has a risk of failing.

>> No.15741966
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=tPiq4njipdk

>> No.15622795
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks

>> No.15326509
File: 48 KB, 652x425, existential risks.jpg

>>15326295
>>15326350
https://en.wikipedia.org/wiki/Suffering_risks

>> No.15324635
File: 48 KB, 652x425, existential risks.jpg

>>15320340
There's a significant risk you will be eternally tortured in the far future after you are revived.

https://en.wikipedia.org/wiki/Suffering_risks

>> No.15281361
File: 48 KB, 652x425, existential risks.jpg

Scientifically, how likely is it that AI will perma-helltorture everyone?

https://en.wikipedia.org/wiki/Suffering_risks
https://www.lesswrong.com/posts/D7PumeYTDPfBTp3i7/the-waluigi-effect-mega-post

>> No.15265697
File: 48 KB, 652x425, existential risks.jpg

>>15262703
There are rationalists who have advocated committing suicide as a way of avoiding eternal torture.

https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>Methods which may reduce the probability of indefinite worse than death scenarios (in order of effectiveness):
>1. Suicide
>2. Working on AI safety
>3. Thinking of ways of reducing the probability
>Suicide, depending on your theory on personal identity, may make the probability 0. If you believe that there is no difference between copies of you then there may be a possibility of being resurrected in the future however. As we aren't certain about what happens to the observer after death, it is unknown whether death will make worse than death scenarios impossible. I believe there are many ways in which it could reduce the probability, but the key question is: could it increase the probability? An argument against suicide is that it is more likely that people who commit suicide will go to "hell" than those who don't. This is because an entity who creates hell has values which accept suffering, making life a positive concept which should not be discarded. On the other hand, an entity with values related to efilism/antinatalism (philosophies in which suicide is generally accepted) would not create a hell at all. Of course, this is all based on a lot of speculation.

>> No.15262530
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks

>> No.15224844
File: 48 KB, 652x425, existential risks.jpg

>>15224742
https://en.wikipedia.org/wiki/Suffering_risks

>> No.15211393
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks

>> No.15119766
File: 48 KB, 652x425, existential risks.jpg

>>15119228
>Outcome Likely Bad
https://en.wikipedia.org/wiki/Suffering_risks

>> No.15110050
File: 48 KB, 652x425, existential risks.jpg

>>15109894
>>15109897
Relevant:
https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://s-risks.org/

>> No.14831172
File: 48 KB, 652x425, existential risks.jpg

>>14830059
>AI hell
Relevant:
https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://s-risks.org/

>> No.14608203
File: 48 KB, 652x425, existential risks.jpg

>>14607957
>despite the fact that these cosmologies threaten our souls with eternal suffering
S-risks are risks of eternal suffering on a cosmic scale that could, scientifically speaking, realistically come about.

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/
https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>> No.14594435
File: 48 KB, 652x425, existential risks.jpg

>>14593014
The flip side of this is that the biggest scientific blackpill is the possibility of S-risks, where the future could become a literal hell.

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/
https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>> No.14582109
File: 48 KB, 652x425, existential risks.jpg

https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/

>> No.14576194
File: 48 KB, 652x425, existential risks.jpg

Scientifically, what is the most important cause area and why is it S-risks?

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc

>> No.14575876
File: 48 KB, 652x425, existential risks.jpg

>>14575697
How do I avoid eternal torture?

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/

>> No.14569512
File: 48 KB, 652x425, existential risks.jpg

>>14568899
Relevant:
https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/

>> No.14501739
File: 48 KB, 652x425, existential risks.jpg

>>14501511
>>14501675
Relevant:
https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=jiZxEJcFExc
https://centerforreducingsuffering.org/research/how-can-we-reduce-s-risks/

>> No.14488660
File: 48 KB, 652x425, Scope of Villany.jpg

>>14488509
Indefinite torture of all living things; making Hell a real thing.
You would need some type of self-sustaining AI hivemind that could capture and contain all living things, torture them endlessly, and somehow keep them alive forever so that the torture never ceases.
>Superintelligent AI with a hivemind of drones and no morality (think ants or bees)
>Methods to capture and contain living things (tractor beams, putting bodies into stasis for transport, planet-sized prisons, anything of the sort)
>Methods to keep torture indefinite (cease aging, regenerate living tissue, restore mental states from memory backups to stave off insanity or diminished perception of pain; keep the subjects alive, conscious, and feeling pain at the same level indefinitely)
