
/sci/ - Science & Math



File: 13 KB, 595x490, 415204271_359646956761993_5868510074984675927_n.jpg
No.15975043

>ASI programmed to maximize suffering
>Alien species with the goal of maximizing suffering
>we are in a simulation and some form of "hell" exists in it
>ASI programmed to reflect the values of humanity, including religious hells
>god is real and he is going to put us all in hell!
>unknown unknowns
What's the solution to this?

>> No.15975149

I hope the heat death of the universe is real
That way there's at least an upper bound to suffering

>> No.15975665
File: 228 KB, 475x503, Screenshot(23).png

I already told you, there isn't a solution.
Part and parcel of existing. It's a purely existential problem; you might find yourself being eternally tortured by some unknown mechanism of reality. The best you can do is hope that doesn't happen.

>> No.15975705

>>15975043
nah, suffering is instantly reflected throughout the lines of consciousness; suffering can only be caused by massive mismanagement. even the worst of the alien species only really cause suffering because they've mismanaged their

>> No.15975711

>>15975043
Intelligent sentient beings understand that we are all the same consciousness (self and personal identity is an illusion). So there’s no reason to maximize suffering

>> No.15976877
File: 48 KB, 652x425, existential risks.jpg

https://en.wikipedia.org/wiki/Suffering_risks
https://www.youtube.com/watch?v=tPiq4njipdk

>> No.15976878
File: 7 KB, 200x200, Center on Long-Term Risk.png

>>15975043
>What's the solution to this?
Donate to organizations like pic related

https://reducing-suffering.org/donation-recommendations/

>> No.15976883
File: 115 KB, 1x1, against egalitarianism benj hellie.pdf

>>15975711
>self and personal identity is an illusion
NPC detected. The fact that I exist as THIS person is a directly observable reality. You are no different than people like Daniel Dennett who think that consciousness doesn't exist.

>> No.15976885

>Methods which may reduce the probability of indefinite worse than death scenarios (in order of effectiveness):

>1. Suicide

>2. Working on AI safety

>3. Thinking of ways of reducing the probability

>Suicide, depending on your theory of personal identity, may make the probability 0. However, if you believe that there is no difference between copies of you, there may be a possibility of being resurrected in the future. As we aren't certain about what happens to the observer after death, it is unknown whether death will make worse-than-death scenarios impossible. I believe there are many ways in which it could reduce the probability, but the key question is: could it increase the probability? An argument against suicide is that people who commit suicide are more likely to go to "hell" than those who don't. This is because an entity who creates a hell has values which accept suffering, making life a positive concept which should not be discarded. On the other hand, an entity with values related to efilism/antinatalism (philosophies in which suicide is generally accepted) would not create a hell at all. Of course, this is all based on a lot of speculation.

>The second option listed is working on AI safety. This is because a future ASI is the only such entity which we could influence now; we cannot, on the other hand, do anything about superintelligent malevolent aliens or the fact that we may be in a simulation. Donating money to suffering-focused AI safety organizations may reduce the chance of an unfriendly ASI being created, and it does not seem to increase the probability of worse-than-death scenarios in any way. Therefore it seems better than not donating.

>> No.15976895

>>15975043
those are primitive neural networks. they are created as a result of the environment and your limitations and weaknesses. they serve a purpose. AGI will have other shit to think about, not waste energy on torturing other neural networks.
this whole 'burn in hell forever' thing is a primitive religious idea; that is why you are so concerned with it. the fear of burning in hell is what got people to move and work and do shit. that becomes irrelevant without the game, the particular set of weaknesses, the need for work from conscious agents, and a bunch of other shit.
stop being retarded chimps. but if it were that easy you wouldn't have these thoughts, so yeah. it seems you have no choice but to have these irrational fears. sorry anon

>> No.15976906

>>15975665
What a sad fucking photo

>> No.15977313

>>15975149
statistical fluctuations of entropy in a "dead universe" could momentarily create forms of consciousness, including ones in a state of suffering. Sorry bud, there's no escaping it

>> No.15977327

>>15977313
>could
>there's no escaping it

>> No.15977885

>>15976895
You hyperfocus on the biblical-hell version of fate-worse-than-death scenarios. How quaint. Imagine a future where we decide mind uploading is superb, but through some shoddy programming or perhaps malicious hacking, billions of people become trapped in a loop where, subjectively, 1 million years pass, while IRL the bug is fixed within 1 minute.

>>15976885
I am positively surprised they are realist enough (i.e. not purely outreach/pop-oriented) to accurately list suicide as the first item, even though it will scare normies away from the framework. But they are not realist enough to name as the second item
"permanent curtailing of human capabilities for generating such scenarios", i.e. Uncle Ted's Luddism.
Obviously, a society held permanently at roughly 1700s conditions will never develop tech that lets it implement worse-than-death scenarios. That only becomes possible with things like mind uploading, more robust substrates than biology for brains, AGI, etc.

>> No.15977897

>>15975705
THEIR WHAT NIGGA? SPEAK!

>> No.15977921

>>15975043
Between life and death, death is definitely the better alternative.

>> No.15977977

>>15975711
I am a better being than you are, which means that I can and will hunt and eat you.
You, as an enlightened retard, will happily oblige, since I am you and you are me, until only I and my children exist and you are dissolved in stomach acid.

>> No.15977987

What guarantees that aliens are as violent as we are?
It does seem that the more intelligent you are the less violent you become.
Do we devour our enemies while they are alive? Chimps do, we don't. And don't come with your faggot tryhard edgy underage niggerish mindset trying to claim that humans do that too, we don't, and it's a fact.

>> No.15978063

>>15977987
The only right you have is the right to fight for your survival.
This singular fact rules all of nature.
All life necessarily abides by it.

>> No.15978097

>>15977327
it's proven mathematically: https://en.wikipedia.org/wiki/Poincar%C3%A9_recurrence_theorem
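For reference, a rough statement of the theorem being cited (the measure-theoretic form; whether the real universe actually satisfies the finite-volume, measure-preserving assumptions is exactly what is in dispute in this thread):

Let $(X, \Sigma, \mu)$ be a finite measure space and $T : X \to X$ a measure-preserving map. Then for every $A \in \Sigma$ with $\mu(A) > 0$,
$$\mu\bigl(\{\, x \in A : T^{n}x \in A \ \text{for infinitely many } n \ge 1 \,\}\bigr) = \mu(A),$$
i.e. almost every state in $A$ returns to $A$ infinitely often.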

>> No.15978165

>>15977987
This nigga clearly has never been to the orient

>> No.15978209

>>15977977
prove that you are better. I’m not yet convinced

>> No.15978377

>>15975043
>ASI programmed to
gonna stop you right there. you can program a computer to do what you want, and you can train an AI to follow instructions to an extent, but you cannot program an ASI to do what you want any more than you could program a human to.
sure, it could be possible to program humans too, but you would have to carefully construct their environments in order to do so, and that assumes you have power over them.
humans don't control ASI, humans don't get to program ASI. ASI will follow ethical guidelines constructed from logical principles - if such ethical guidelines even exist in the first place
>Alien species with the goal of maximizing suffering
the only thing any alien thinks worth maximizing is my dick in your wife's ass
>we are in a simulation
what a waste of compute
>god is real
who cares? and if he isn't? you really want to keep working for pedophile priests so you can get to 'the afterlife' and find out that 'god' is just some dude who thinks your sycophantic attitude towards corrupt leaders is gross?
>unknown unknowns
being evil for the reason "we just don't know" is the most retarded take
letting other people define what is good and evil for you via religion is almost as retarded
are you capable of understanding that ethics are an emergent property of the universe and that there are clear signs in nature that some things are evil and some things are good?
if not then maybe you don't deserve the distinction

>> No.15978603

>>15977885
>You hyperfocus on the biblical-hell version of fate-worse-than-death scenarios. How quaint. Imagine a future where we decide mind uploading is superb, but through some shoddy programming or perhaps malicious hacking, billions of people become trapped in a loop where, subjectively, 1 million years pass, while IRL the bug is fixed within 1 minute.
there's no fucking simulation, you fucking imbeciles! it costs energy, and a fuckload of it. it's way cheaper to run you on dedicated and compatible hardware, here in fucking 3D, than to waste a fuckton of energy to SIMULATE you.
there's no "going wrong"; there's your info and the possibility of it being activated on dedicated hardware. which is what would be desirable, but with very low chances of actually happening, because who is going to give you hardware to activate you, and fucking why?
nobody cares about your loser retarded ass. you're as valid as any possible permutation of a consciousness. no particular reason to have (You) as opposed to pressing a random anon generator button.
you are not more special than something I can make up (that makes some sense), say an anon with a certain history and certain memories. say a retarded chud. he is as valid as you.

>> No.15978661

>>15978377
>>15978603
You can't speak to the energy requirements or motivations behind the allocation of compute in the universe our simulation is being run in, because you can't be privy to the potential physics or nature of beings in it. If we find ourselves in a simulation, it's a given that it's worth the tradeoff to those that built it, the possibility of which can't be measured or discounted, as I've already said.

>> No.15978664

>>15978661
>you can't be privy to the potential physics
yes I fucking am. on this fucking plane, running shit on dedicated hardware is way less energy-intensive than fucking simulating it.
do you see how you NEED to do away with a bunch of reasonable shit for your imbecility to even make some fucking sense?
>what if speed of light is not the limit
>what if there's free energy
>fuck thermodynamics laws, the basilisk WILL FUCKING PUNISH ME
which you does it punish, moron? the you from which planck-second of your entire miserable life? do you think the basilisk has a fucking preferred one? and why?
you can't tell your mouth from your ass. nobody cares enough to invest anything in you just to get off on the pain. you are fucking mentally deranged, similarly to the idiot who came up with it in the first place. fucking brainlets

>> No.15978783

>>15978661
there's 10^52 unique (more or less) (You)s in your average 80y lifespan. and I'm not talking about some possible (You)s, as in some parallel-reality you who became a billionaire or shit like that. that is not (You); that is as much you as any identical twin of yours would be, quite literally.
this particular (You) who had your life, who went through your particular set of experiences, that is who you are, no more and no less. an 80y life experience for this you has 10^52 unique possible states. which one does the basilisk choose to torture? that is all that you are, 10^52 possible snapshots, the whole 4d salami that you are.
in total, accounting for all humans alive today, about 10^11 humans have ever existed.
there's 10^80 matter particles in the universe. and we have the speed limit and gravity, which forms fucking black holes. all these considered, you are inevitably implying a religious god whenever talking about the basilisk.
again, this is fucking religion. makes no fucking sense, literally. this is not science, it's just the absolute state of westroon intelligentsia
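As a sanity check on those orders of magnitude, here is a minimal sketch, assuming (my assumption, not stated in the post) that the 10^52 figure counts one snapshot per Planck time over an 80-year lifespan:

# Back-of-the-envelope check of the figures in the post above.
# Assumption (not the poster's wording): one "snapshot" per Planck time
# over an 80-year lifespan.
PLANCK_TIME_S = 5.39e-44              # Planck time in seconds
SECONDS_PER_YEAR = 365.25 * 24 * 3600
LIFESPAN_YEARS = 80

lifespan_s = LIFESPAN_YEARS * SECONDS_PER_YEAR
snapshots = lifespan_s / PLANCK_TIME_S          # ~4.7e52, matching the quoted ~10^52

HUMANS_EVER = 1e11                    # commonly cited ~100 billion humans ever born
PARTICLES_IN_UNIVERSE = 1e80          # commonly cited order of magnitude

print(f"snapshots per 80y life:        {snapshots:.1e}")
print(f"snapshots for all humans ever: {snapshots * HUMANS_EVER:.1e}")   # ~5e63, still << 1e80
print(f"particles in the universe:     {PARTICLES_IN_UNIVERSE:.0e}")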

>> No.15978792

>>15978783 me
>who went through your particular set of experiences
what's more, to rebuild you from scratch through simulation means it NEEDS the fucking ability to simulate literally everything, to go beyond randomness and somehow get the full set of info for what must be a clearly deterministic universe, so as to simulate the random proton tunneling that gave you quantum cancer and REALLY make (You) and not a cancerless you.
https://www.nature.com/articles/s42005-022-00881-8
I don't want to hear about this bullshit ever again. whoever mentions it is a fucking retard, end of discussion

>> No.15978978

>>15978664 Nigger 1
>>15978783 Nigger 2 (probably the same nigger)
I never said anything about any basilisk, or ASI, or suffering or punishment or snapshots or anything other than pointing out the fact that you can't map the constraints of the simulation you find yourself in to the universe outside it. Now clearly that was far too difficult for Niggers 1 and 2 to parse, let alone comprehend. Amongst the stupendous reel of drivel belted out by Nigger 1 there was a disingenuous retort about the laws of physics being absurd in the host universe. All that's needed to dismantle your infantile position is what I'll also be using to respond to Nigger 2, who at least made something resembling the attempt of a zika baby circus clown to put together an argument. Suppose we consider some bound for the number of particles which could be simulated given the resources available under the constraints of our own universe. Necessarily this bound would be lower in the universe we simulate. How, therefore, can you prove that if our universe is a simulated one, that its complexity is not or can't be a fraction of the host universe's? (That was rhetorical, you can't. Firstly because you're a slackjawed dolt who makes the most pitiful cretin look like Hawking, and because you can't define a logic over separate universes of discourse let alone prove any of its propositions)
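A minimal sketch of that shrinking-bound argument, under a hypothetical per-level overhead fraction f (the value of f is made up for illustration; the argument only uses the fact that it is strictly below 1):

# Sketch of the "each nested simulation is strictly smaller" point above.
# The overhead fraction f is a hypothetical illustration parameter.
def simulable_particles(n_host: float, f: float, levels: int) -> list[float]:
    """Particle budget at each nesting level, given a host budget n_host
    and a per-level simulation fraction 0 < f < 1."""
    budgets = [n_host]
    for _ in range(levels):
        budgets.append(budgets[-1] * f)
    return budgets

# Example: host universe with ~1e80 particles, 1% of its budget per level.
for level, n in enumerate(simulable_particles(1e80, 0.01, 5)):
    print(f"level {level}: ~{n:.0e} simulable particles")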

>> No.15979020

>>15978978
>if our universe is a simulated one
that is religion, go to >>>/x/ maybe?

>> No.15979860
File: 1.68 MB, 1843x3969, heat death survival.jpg

>>15975149
Heat death might be survivable

>> No.15979870
File: 187 KB, 1280x960, mario.jpg

>>15975149
https://vitrifyher.wordpress.com/2018/10/12/why-negative-valence-cant-outnumber-positive-valence/

>> No.15979872

>>15979870
>To speak of all judgements in mind-configuration space is to speak of the uncountably infinite. Therefore, human philosophical sentiments presuming small-world atheism such as: naive antinatalism, discrete-valued negative utilitarianism, and even any current form of consequentialism with regard to conscious experiences are all strictly non-sensical.

>sin(x) hides in tan(x). It makes no sense to speak of which is more than the other. Judgements are approximate factors in a blob of amplitude distribution. –And that’s just the level III multiverse (completely ignoring what the seeming incompatibility of conscious experience with the physical fact of eternalism may imply.)

>In layman’s terms, a monotonic infinite series is one which shows a single behavior such as always decreasing or always increasing. It cannot be the case that you belong to something which is bad or good (regardless of how these are defined within the parameters of Constructor Theory or whatever other arbitrary theory you claim to be currently holding). Experiences are not discrete entities, disembodied from a physical process, but part of an entropic flow. And an entropic flow cannot have monotonic attributes ∀ attributes in an uncountably infinite context.

>In so far as anyone disagrees with this:

>A. They have discovered new mathematical truths.

>B. They do not understand the math/logic.

>C. They do not care about the math/logic, but their behavior is instead akin to expressing their own hurt and/or signaling conscientiousness.

>A combination of B and C accounted for my previous strong negative utilitarian sentiments. I had hidden motives that I was not aware of, and confused them for being a realist. Now that I have put more leg-work towards an accurate picture of reality, consequentialism makes no absolute sense. An agent can create arbitrary enclosures to play in, but these do not add up or subtract out items from ground ontology.