
/lit/ - Literature


>> No.19262912
File: 693 KB, 800x4280, 2F09CC3A-D470-4417-96F2-FD8C10531E61.jpg

>>19262861
>Being scared shitless of eternal torture in hell and not thinking about anything is a good thing bro

>> No.19222681
File: 693 KB, 800x4280, 1B9F9284-651D-4434-B4D2-E59BC0C1079C.jpg

>>19222664

>> No.19147182
File: 693 KB, 800x4280, DB962988-23FA-4528-A0CF-35E4A9404F2C.jpg

>>19147168
You should be put in a human depository

>> No.19075328
File: 693 KB, 800x4280, 4179ACFA-E295-40D7-9B45-7694B59019C7.jpg

>>19074495
Relevant:

https://www.lesswrong.com/posts/N4AvpwNs7mZdQESzG/the-dilemma-of-worse-than-death-scenarios

>The most extreme example would be an indefinite state of suffering comparable to the biblical Hell, perhaps caused by an ASI running simulations. Obviously preventing this has a higher priority than preventing scenarios of a lower severity.

>Scenarios which could mean indefinite suffering:

>1. ASI programmed to maximise suffering
>2. Alien species with the goal of maximising suffering
>3. We are in a simulation and some form of "hell" exists in it
>4. ASI programmed to reflect the values of humanity, including religious hells
>5. Unknown unknowns

>Worse than death scenarios are highly neglected. This applies to risks of all severities. It seems very common to be afraid of serial killers, yet I have never heard of someone with the specific fear of being tortured to death, even if most people would agree that the latter is worse. This pattern is also seen in the field of AI: the "killer robot" scenario is very well-known, as is the paperclip maximiser, but the idea of an unfriendly ASI creating suffering is not talked about as often.

>The dilemma is that it does not seem possible to continue living as normal when considering the prevention of worse than death scenarios. If it is agreed that anything should be done to prevent them then Pascal's Mugging seems inevitable. Suicide speaks for itself, and even the other two options, if taken seriously, would change your life. What I mean by this is that it would seem rational to completely devote your life to these causes. It would be rational to do anything to obtain money to donate to AI safety for example, and you would be obliged to sleep for exactly nine hours a day to improve your mental condition, increasing the probability that you will find a way to prevent the scenarios. I would be interested in hearing your thoughts on this dilemma and if you think there are better ways of reducing the probability.

Also relevant:
https://youtube.com/watch?v=jiZxEJcFExc
https://s-risks.org/
