
/sci/ - Science & Math


File: 92 KB, 420x432, 1320114684337.jpg
No.4159103

You are given control of 10 people.

Option A) Execute the 10 people
Option B) Let them go free. However, if you let them go free there will be a 1 in 100000 chance of the world ending

(Note that this will be the only decision of this consequence that you will ever make)

>> No.4159110

>>4159103
Useless hypothetical is useless.

>> No.4159122

>>4159103
People die every day. Killing those 10 would hardly matter.

>> No.4159125

I make a decent PRNG and ask it for a number between 0 and 1. If that number is above 1/100000, I kill them; otherwise, I let them go.
Basic statistician/algorithmician/game-theorist procedure.
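A minimal Python sketch of that randomized procedure, reading the post literally; the threshold, the uniform draw, and the function name are assumptions of mine, not anything the post spells out:

```python
import random

def decide(p_world_end=1e-5, rng=random.random):
    """Draw u ~ Uniform[0, 1) and pick an option, as described above."""
    u = rng()
    if u > p_world_end:          # happens with probability ~0.99999
        return "A: execute the 10"
    return "B: let them go"      # happens with probability ~1/100000

print(decide())
```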

>> No.4159131

A) Kills 10 people.

B) World population = 6,840,507,000.
Chance of killing them all is 1/100000.
Expected deaths: 68,405.07.

A is the clear choice. Kill fewer people.
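For what it's worth, the expected-value arithmetic above checks out; a quick sketch using the population figure quoted in the post:

```python
population = 6_840_507_000
p_world_end = 1 / 100_000

expected_deaths_A = 10
expected_deaths_B = p_world_end * population

print(expected_deaths_B)                      # 68405.07
print(expected_deaths_A < expected_deaths_B)  # True: A kills fewer in expectation
```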

>> No.4159139

>>4159122

This.

Do we know who the people are? Are they close to you (self bias) or important (culture bias)?

The chance is lower than everyday risks, and far more than ten people die per day anyway; but due to human cognitive bias one must assume that he will spare the ones he loves (so I would let them go free rather than kill them, which, from my perspective, would be worse than the world ending, because either way they would die. The problem then is deciding this indefinitely.)

The problem with not killing them = you see the world solipsistically. If they die and you love them, it sucks ass. If they don't and the world ends, it is irrelevant.

Otherwise, if we have knowledge of who is there, perspective bias says that the value of 7,000,000,000 > 1. Kill.

Making the decision is harder than saying it here.

en.wikipedia.org/wiki/Cognitive_Bias
en.wikipedia.org/wiki/Ethics

>> No.4159142

B)

I am not a fucking murderer.

>> No.4159151

>>4159131
Yeah, but you will only make this choice once, so it is overwhelmingly more likely that B will kill fewer people.

>> No.4159152

>>4159139
>If they don't and the world ends, it is irrelevant.

Why is the world ending bad?

>> No.4159153

Depends on the people.

Are they close friends/relatives? B. Do I have no idea who they are? A.

Deal with it.

>> No.4159154

It is also the biggest decision of your life. The integral of the question depends on the function, or input. Therefore we require more information to make a choice that is in accord with human emotional, social, and mental needs.

Of course, mathematically, we assign an economic value to the importance of those people relative to the self and to the world, and compare them. Otherwise it is simply better to kill 10 than to possibly kill 7,000,000,000. Other factors include: what is the chance of the world ending otherwise, do people know, what stressors are you under in the situation, et cetera.

en.wikipedia.org/wiki/Trolley_problem

Is possibly killing them, by not killing them directly, as bad as killing them directly? Is not killing = killing in ethics and probability?

>> No.4159156

>>4159152
To clarify, I wasn't challenging the quoted poster, only bringing up his point.

>> No.4159163

>>4159142

Option A: Kill a homeless crackhead.

Option B: Everyone on this planet is burnt alive and tortured, humanity is wiped out, etc. Including your family and especially your dog, which you must watch.

>> No.4159171

>>4159163

OP simply stated that it ends. You may or may not witness it; it may not be in accord with human logic, and it has multiple variables.

>> No.4159174

>>4159171
I'm not OP ya daft cunt

>> No.4159178

>>4159103
as long as they are different races I'd pick A

>> No.4159194
File: 34 KB, 499x499, 1322260153101.jpg

Let them go. I would have to have some shit luck to never win the lotto but end up destroying the world on a 1/100k chance.

>> No.4159185

>>4159174

I know.

>> No.4159186

There is an EQQQQQQQQUUUUUUUUILLLLIBRIUMMMM ECON 101 LOLZOR HAHAHAHAHAHAHHA


anyways, yeah, obviously A, shitheads.

The interesting question, though, is this: if we keep increasing the number killed in A and decreasing the probability of the world ending in B, when do you become INDIFFERENT between choosing A or B?


(anyone choosing B right off the bat is a fucking retard, and if their lives were my decision, then I would see to it that they had very painful deaths)
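A sketch of where that indifference point lands under plain expected-value accounting; the linear valuation (every death weighted equally, no risk aversion) is an assumption of mine, not something the post commits to:

```python
population = 6_840_507_000

def indifference_probability(killed_in_a):
    # The p at which expected deaths match: killed_in_a == p * population
    return killed_in_a / population

print(indifference_probability(10))        # ~1.46e-9, roughly 1 in 684 million
print(indifference_probability(68405.07))  # 1e-5, the original 1 in 100000
```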

>> No.4159197

>>4159185

I know ya know ya bloody wanka

>> No.4159207

Let them go. Show some confidence in the world, OP.

>> No.4159205

>>4159186

A. Execute, with a 100% chance of humanity's survival. Your life is made a living hell, and you may have killed ones you love or killed those who would determine the future.

B. Spare them and make them happy. Make everyone happy. They may or may not do something stupid. There is neutrality either way. The world is wiped out; you cease to exist. No big deal. It's not like it's apocalyptic, devastating suffering. It's simply gone. OR they go free.

>> No.4159213

>>4159205
For B, OP just said the world ends; he didn't say how.
The method could be all of humanity simultaneously burning to death as the earth slowly gets scorched by fire.

...just saiyan.

>> No.4159216

>>4159213

I'm aware.

>> No.4159219

>>4159186
You're a fucking moron. I hate pompous assholes like you. Go fuck yourself, dipshit.

>> No.4159221

>>4159219
NOOOOOO YOUU

>> No.4159233
File: 6 KB, 158x204, images.jpg

Option B is the obvious choice. If nothing happens, you are a hero who just saved 10 lives. If the world ends, no one will be left to blame you.

>> No.4159242

B

1/10000 = Definitely not going to happen unless you make the decision several times.

>> No.4159251

>>4159242
>clear misuse of the word 'definitely'

>> No.4159266
File: 11 KB, 250x250, 1251919276122.jpg

NEVER TELL ME THE ODDS!!

B.

>> No.4159279

>>4159251

Exaggeration.

>> No.4159287

Force one to kill the other nine. Then kill him.

My conscience is relatively clearer.
Goodnight.

>> No.4159493

Man, I'm not gonna let the chance that I can end the world go for the sake of just killing 10 people. Obviously I choose B.

>> No.4159498

>>4159103
A obviously, 10 deaths is literally not even noise in the data.

>> No.4159538

1. What the fuck is the point of this?
2. What does this have to do with science and math?

>> No.4159540

B. There is no possible way for 10 people to cause the world to end.
Also very retarded question.

>> No.4159566

B. Gives me a 10^-5 chance that all future suffering is prevented.

>> No.4159576

>>4159242
>cites the wrong probability
>writes "Definitely not going to happen" while acknowledging that probability > 0

>> No.4159622

>>4159233
FLAWLESS VICTORY

>>4159154
if we're using 7bil, a 1/700,000,000 probability of the world ending has the same expected deaths as killing the 10. Answers to the question would reveal whether people are risk averse or risk loving w.r.t. other people's deaths.

OP:
world ending=assume 100% fatality?
also, testing for zero probability fallacy?

New Q:
kill 10 people, or subject 20 people to death on a coin toss? Is the answer different from: kill 10 people, or subject humanity to a 1/700,000,000 probability of extinction?
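A quick check of the equivalence claimed above, assuming a flat 7 billion population and plain expected-value accounting (risk preferences deliberately ignored, since the post treats those as the interesting part):

```python
population = 7_000_000_000
certain_deaths = 10

# Extinction probability with the same expected deaths as killing 10 outright.
p_equivalent = certain_deaths / population
print(p_equivalent)               # ~1.43e-09, i.e. 1 in 700,000,000

# The coin-toss variant: 20 people at 50% also yields 10 expected deaths.
print(0.5 * 20)                   # 10.0
```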

>> No.4159774

>>4159622
You forget all future humans and posthumans (including after space colonization). They won't exist if the world ends now. This could be a good thing or a bad thing, but there are a lot more than 7 billion lives at stake.

>> No.4159858

B is the only decent choice...
at least then there will be a 1/100000 chance