
/sci/ - Science & Math



File: 98 KB, 1042x804, philosophy.png
No.4056036

An interesting moral conundrum

>> No.4056056

0/3 is bad

>> No.4056065

Tell the brain to flip a fucking coin.

>> No.4056083

>>4056065
Where does it say the brain is capable of flipping coins?

>> No.4056085

>>4056065
The brain doesn't know coins.

>> No.4056089

I'm this brain.

And the best choice is clear...

OP is a fag. Take your philosophical bullshit somewhere else. This isn't /sci/ related.

>> No.4056199

"Hoooooly shit."
My reaction upon reading the pic.

Well. First of all, the Cartesian demon is irrelevant, since its existence is unverifiable. Second of all, the knowledge that the brain will serve as a model to other brains is also irrelevant, since we should choose the right course of action no matter what (and since the situation is unlikely to be duplicated even once in the next ten thousand years, considering how many unlikely factors are present). So. Let's ignore those two little tidbits.

Now, both Rightie and Leftie are going to kill the same five men, one of whom is going to kill thirty orphans. So that's also discountable. We can also ignore that two of the five men are the man who put the brain at the controls and the man who wrote the example, since, again, their deaths are inevitable. (Wow, that really breaks the fourth wall, doesn't it?) The only difference between Rightie and Leftie is that Rightie is going to kill the five men intentionally, whereas Leftie is going to kill them unintentionally. Since the brain is apparently clairvoyant, and *knows* outcomes, intentions are irrelevant. If the brain didn't *know* outcomes, it would be a different story. Intentions would mean everything.

Now, killing Leftie also kills ten heart transplant donees. However, if they die, then twenty kidney transplant donees get to live. The fact that one of the kidney transplant donees is Hitler is irrelevant, because another one cures cancer, and the number of people cured of cancer over any sufficiently great period of time will be significantly greater than the number of people killed on the orders of a single man in less than one lifetime.

Therefore, the brain should kill Leftie. The five men get killed, the thirty orphans get saved, the ten heart transplant patients die, and the twenty kidney transplant patients live.
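The tally above can be sketched as a toy body count. Everything here is a restatement of the post's premises: each life counts as one, the thirty orphans are saved on either track (their would-be killer is among the five men who die either way), and the reading that the twenty kidney donees die unless Leftie is killed is an inference from "if they die, then twenty kidney transplant donees get to live."

```python
def deaths(kill_leftie: bool) -> int:
    """Total deaths under each option, per the scenario's premises."""
    d = 5  # the five men die on either track
    # the thirty orphans are saved either way, so they add nothing
    if kill_leftie:
        d += 10  # the ten heart-transplant donees die
        # the twenty kidney-transplant donees get to live
    else:
        d += 20  # assumed: the twenty kidney donees die without their organs
    return d

# Per these premises, killing Leftie minimizes deaths:
assert deaths(kill_leftie=True) < deaths(kill_leftie=False)
```

The Hitler/cancer-cure wrinkle is deliberately left out of the count, since the post argues it nets out in favor of the cure over any sufficiently long period.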

>> No.4056277

>>4056199
If you assume someone else could also find the cure for cancer, then what matters is how much faster the cure would be found if the guy needing the kidney were saved, and whether the number of patients saved by finding the cure faster is greater than the number of people killed by Hitler.

>> No.4056292

>>4056277
>If you assume someone else can also find the cure for cancer then the thing that matters is how much faster would a cure be found if the guy needing the kidney was saved

Oof. I hadn't thought of that. In that case, the question cannot be answered unless the brain already knows the answer to that one.

>> No.4056294

Don't do anything and allow nature to take its course. Sometimes the best choice is nothing at all.

>> No.4056353

>>4056199
war crimes

sage because stupid

>> No.4056489

I need quantification of the tyrant's negative actions, Hitler's negative actions, the number of war crimes, and also the definition of a war crime, to make a utilitarian decision.

>Furthermore, there is an intermittently active Cartesian demon deceiving the brain in such a manner that the brain is never sure if it is being deceived

OH FUCK THIS

>> No.4056620
File: 46 KB, 446x388, 1279891806797.jpg

OP's image is a fairly good deconstruction of Utilitarianism

>> No.4058405

>>4056036
The Brain would short out, and go whichever way the track is already set because it's:
>Casually hooked up.
Also,
>implying the robo-brain has any compassion left for its ex-brethren who use it now as nothing more than a shitty basic computer.

>> No.4058421

>>4056620
And other ethical systems could solve this problem *so* much better, right?

>> No.4058432

Both choices are equally devoid of merit.

>> No.4058445

>>4056620
> 2011
> does not know utilitarianism is right by definition

>> No.4058464
File: 1.35 MB, 799x796, corsair_himself.png

Running over the hearts seems like it'd be pretty metal; let's do that.

>> No.4058465
File: 14 KB, 298x241, penis-anatomy.jpg

>>4058445
My ethical theory, absolute penisarianism, is also right by definition. It is centered around including as many penises in as many things as possible. Penis!

>> No.4058510

I love those types of elaborate and complex hypothetical scenarios. Does anyone have any more of these?

>> No.4058529

>>4058510
think of me being you in a different context of spacetime.
think. and please don't post

>> No.4058547

Is this a trick question and am I being trolled, or do the orphans need to die?
Right (the direction)?

>> No.4058612

>>4058445
as OP's image demonstrates, it's a crock of shit because you are never able to foresee the consequences of your actions

>> No.4059051

bump

>> No.4060353

I've already considered so many of those thought experiments. Any more?

>> No.4060379

A little over the top. If you cut it to half its current size, it would be good.

>> No.4060401

>>4056620
>Unclear questions mutilated and stitched together until they're barely recognizable shapes of their original form
>Deconstruction of Utilitarianism

Really?