
/sci/ - Science & Math



File: 79 KB, 600x338, TDTESS4.jpg
No.2102480

/sci/ where do you stand on Newcomb's paradox?


A super-intelligence from another galaxy, whom we shall call Omega, comes to Earth and sets about playing a strange little game. In this game, Omega selects a human being, sets down two boxes in front of them, and teleports out.

Box A is transparent and contains a thousand dollars.
Box B is opaque.

You can take both boxes, or take only box B.


The twist: if Omega has predicted that you will take ONLY box B, it put a million dollars in box B. If Omega predicted you'll take both, it put nothing in box B. And Omega has been correct on each of 100 observed occasions so far.

Before you make your choice, Omega has scanned you & moved on to its next game. Box B is either already empty or already full.


So do you decide
a) I want my million. Give me only box B.
or
b) the boxes are already fixed. Why not take both?
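The two answers can be compared numerically; a minimal expected-value sketch (the predictor accuracy p is a modeling assumption, not part of the problem statement):

```python
# Expected payoffs in Newcomb's problem, assuming Omega predicts your
# choice correctly with probability p (an assumption; the thread's Omega
# is merely 100 for 100 so far).
def expected_one_box(p):
    """Payoff if you take only box B and Omega predicted you with probability p."""
    return p * 1_000_000

def expected_two_box(p):
    """You always keep box A's $1,000; box B pays out only when Omega guessed wrong."""
    return 1_000 + (1 - p) * 1_000_000

# Against a coin-flip-quality predictor the two-boxer is ahead...
print(expected_one_box(0.5), expected_two_box(0.5))     # 500000.0 501000.0
# ...but anything much better than chance flips the ranking.
print(expected_one_box(0.99) > expected_two_box(0.99))  # True
```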

>> No.2102489

I love you

>> No.2102493

>>2102489

I love you too. Now answer the question :p

>> No.2102495
File: 125 KB, 803x499, 1289461629152.jpg

>> No.2102498

It's not my money. Therefore, I do not take it.

Seriously though, this is actually the only answer.

>> No.2102505

There's no correct answer, that's why it's a paradox. But most people choose a side really fast & stick with it forever.

>> No.2102506

I flip a coin, heads I take both, tails I take just B.

>> No.2102513
File: 50 KB, 345x345, i_dont_think_so_tim.jpg

>>2102506
with a million dollars at stake?

>> No.2102515

>>2102506
That's retarded. Both boxes already have the money in them.

>> No.2102523

I'll just take B. $1000 is nice, but it isn't a huge deal.

>> No.2102530
File: 48 KB, 533x594, n725075089_288918_2774.jpg

>>2102515
>betting against the alien superintelligence.

>> No.2102532

it seemed like a paradox at first but now i don't see it.

i pick box b, what happens? i get my million because it would've predicted i picked it?

>> No.2102533

>>2102498
>I'm against the concept of gifts.
Omega bases his actions on what kind of person I am.
Am I a person who will take only one box, or one that will take both boxes?
I want the million, so I make sure I'm a person that only opens one box the only way I know how: I actually only open one box.

Yeah, I give up 1,000 monies. That won't even dent the shitfaced grin on my face.

>> No.2102537

>>2102515

But it was predicted that you would take both, so you only get $1000.

>> No.2102544

If we're creating a question which involves a being that can predict human action, then we really can't speak of humans as even HAVING a choice... From a deterministic standpoint, our choice has already been made.

>> No.2102547

>>2102523
I'd say it's a matter of principle. Let's say your buddy goes backstage and verifies that the boxes are very much set already. Are you just gonna be a coward & decide to pick B?

>> No.2102551

>>2102515
Retarded is thinking you've won this game by giving up the million instead of the 1,000. You change Omega's superintelligent guess about what you'll do by changing what you'll actually do.

>> No.2102556

>>2102532
It's a violation of causality. You think that your choice can alter the contents of the boxes, when in fact it cannot.

>> No.2102559

>>2102544
The only thing that's already determined is the stuff in the boxes. I can pick whatever I want.

>> No.2102562

>>2102551
Well, it's retarded for you. The boxes already have money in them. Flipping a coin isn't going to change that.

>> No.2102563

>>2102480
I would make a quantum observation and determine my results based on that.
How is this a paradox again?

>> No.2102564

>>2102547
When you're changing the premise you're removing the paradox part. And your principle is... what exactly? It's more rational to lose because your intuition leads you wrong?

>> No.2102573

>>2102563
Dohoho

>> No.2102578

To perfectly predict a human's future actions, Omega must know the human perfectly down to the atomic level, and the entirety of the human's light cone as well. For example, if the human decides to flip a coin, Omega should have already known not only that, but also which way the wind would blow at that time, and which way every single air molecule would fly and hit the coin when it's flipped. If the human decides to call a friend in Australia and ask him to flip a coin instead, Omega should have also known the pathways of the air molecules in Australia, and everywhere else as well.

Now, if we are sure that Omega's claim of infallibility is true (and assume that the laws of physics are not broken - if they are broken, nothing has to ever make sense again), we conclude that Omega can obtain atomic-level information on you (and the rest of the space up to a several light-minutes away) and run some kind of algorithm to predict which box this assemblage of atoms will "decide" to open. The most straight-forward algorithm would of course be a perfect virtual physical simulation of you and your surroundings.

>> No.2102579

>>2102562
Yes, but Omega knows what kind of choice you'd make in such a situation.
You have two possibilities here:
Be the kind of person who takes both boxes and gets 1000, or be the kind that takes one box and gets a thousand times that amount.

It's about signaling precommitment. Only Omega is smart enough for functional mind-reading and doesn't need actual signals, just that you are a guy who genuinely doesn't take both boxes.

>> No.2102581

>>2102563
>The problem is called a paradox because two strategies that both sound intuitively logical give conflicting answers to the question of what choice maximizes the player's payout.

wikipedia

Also, I really doubt anyone's stupid enough to just flip a coin with that much money on the line

>> No.2102583

>>2102578
I assert that a person in a perfect simulation has self-awareness indistinguishable from that of a person in the real world, since the "real world" is indistinguishable from a perfect simulation, by the definition of "perfect". The consequence of that assertion is that when you are standing there, looking at the two boxes, you have no way of finding out whether it is the "real" you, or a perfect copy of you in Omega's simulation, and whether the present you perceive now is actually a simulation of the hypothetical future several minutes from the "now" in the real world.

If you are in Omega's simulation (at least 50% chance, likely greater), once you open box B or open both boxes, you do not get a million dollars. You just cease to exist. Once Omega knows what it wants to know, it shuts down the simulation since it no longer serves any use. If the simulated you opens both boxes, the real you inevitably opens both boxes too, and doesn't get a million dollars, while Omega gloats. If the simulated you chooses box B, the real you also chooses box B and gets a million dollars.

If you believe that the simulated you and the real you are essentially the same person and want what's in their common self-interest, i.e. that the simulated you should choose a course of action that benefits the real version of you, just as you every day choose actions that benefit the "future" you, the course is clear. You choose box B. Since you don't know whether or not you are a simulation done by Omega with the sole intent to trick yourself, you have no other choice.
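The simulation argument reduces to a model where the simulated copy's choice and the real choice are one and the same; a toy sketch under that mirroring assumption (which is the post's premise, not established fact):

```python
# If Omega predicts by running a perfect simulation of you, your simulated
# choice and your real choice are the same choice. Under that assumption,
# the "already fixed" contents of box B are a function of your strategy.
def play(strategy):
    # Omega's prediction IS your strategy, because the simulation is perfect.
    box_b = 1_000_000 if strategy == "one-box" else 0
    box_a = 1_000
    return box_b if strategy == "one-box" else box_a + box_b

print(play("one-box"))  # 1000000
print(play("two-box"))  # 1000
```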

>> No.2102588
File: 26 KB, 300x382, SpeakOutDerrenBrown.jpg

Derren Brown pulls stuff like this every day...

Problem, Omega?

>> No.2102589

>>2102579
His knowledge isn't gonna help him once the boxes are set now, is it?

>> No.2102597

Definitely B

>> No.2102598

>>2102589
True. If you find a way to be a person who doesn't plan on opening both boxes or on changing his mind after Omega leaves, and then magically (invasive brain surgery?) turn yourself into not such a person, I'll join you in taking both boxes. Until then I'll keep my wide grin from my huge pile of money while you lose out on principles.

>> No.2102599

Anyway I think both groups would agree if you can pre-commit to B before getting scanned then that's a total no-brainer.

>> No.2102604

>>2102598
Would it make a difference to you if you were like uncovering boxes that were buried thousands of years ago? Trying to emphasize the "set in stone" part.

>> No.2102605

Well, that was a nice little thread. Yeah. Yes it was!

>> No.2102616

>>2102599
I'd assume that me simply sitting here planning to only open one box would suffice for an entity with Omega's capabilities.
Perhaps I'm wrong; still, that 100% score for Omega implies that the odds of that are slim.
Also, the cost of being wrong about opening both boxes is a thousand times larger than the cost of being wrong about opening one box. With those odds only an idiot would bet against a superintelligent predictor with a 100% success rate.
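That cost asymmetry can be made exact: with these stakes, one-boxing has the higher expected payoff whenever the predictor beats roughly 50.05% accuracy. A quick check (the break-even derivation is mine; the dollar figures are from the problem):

```python
# One-boxing beats two-boxing when p*1e6 > 1_000 + (1-p)*1e6,
# i.e. when (2p - 1) * 1_000_000 > 1_000  =>  p > 0.5005.
def one_boxing_wins(p):
    return p * 1_000_000 > 1_000 + (1 - p) * 1_000_000

break_even = (1 + 1_000 / 1_000_000) / 2
print(round(break_even, 4))    # 0.5005
print(one_boxing_wins(0.51))   # True
print(one_boxing_wins(0.50))   # False
```

So betting against a predictor with a 100/100 record requires believing it is barely better than a coin flip.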

>> No.2102617

>>2102605
looks like you & me are the only rational two-boxers :(

"It's not my fault Omega chooses to reward only people with irrational dispositions; it's already too late for me to do anything about that."

>> No.2102619

>>2102604
That's a pretty good twist.

"On an archaeological expedition in the Sudan, you come across an ancient coverstone to an underground cavern containing two chests. The coverstone tells the story of a mystic who lived in the year 2600 BC. He foretold the exact day on which the cavern would be uncovered, and even your name! The coverstone further states that...

(Then restate the problem. Ten gold bars in chest a, ten thousand in chest B.)

>> No.2102623

>>2102616
The problem only says 100 instances. I bet I could get 100% success rate just by predicting everyone takes B
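Whether a blanket "everyone takes B" rule could explain a 100/100 record depends entirely on how many people actually one-box; a back-of-envelope check (the base rates here are made up for illustration):

```python
# Probability of a perfect n-prediction streak if Omega just always
# predicts "takes only B" and a fraction q of people really do so.
def streak_chance(q, n=100):
    return q ** n

print(streak_chance(1.0))          # 1.0  (everyone one-boxes: trivially perfect)
print(streak_chance(0.9))          # ~2.66e-05
print(streak_chance(0.99) > 0.3)   # True: needs a near-unanimous pool
```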

>> No.2102629

>>2102617
It's not rational to consistently choose a losing strategy.

>> No.2102631

>>2102619
Day and name is rather clear proof >< How about just many sets of boxes?

>> No.2102642

>>2102619
>ten gold bars
that's like, over 2 million dollars

>> No.2102645

>>2102629
Nothing guarantees that it's a losing strategy. I mean you could flip a coin (or do the quantum state) & logically omega is gonna be wrong 50% of the time.

>> No.2102646

>>2102642
depends on the size.

Well, a gold coin goes for about what, a grand? It just seemed cheap to have a chest with just a gold coin or even ten in it. They could be ingots.

>> No.2102651

>>2102642
I'm gushing because I got back from Chicago's Field Museum. Exhibit on Gold wat!

>> No.2102653

>>2102619
Provided there was better proof than a stone tablet (could have been planted etc.) and two boxes and also proof that this being lives up to such bargains (which exists in the omega question) my reasoning still stands. One box.

>> No.2102661

>>2102645
cont:
In fact I'd say this problem is about how much you believe in causality. It's gonna take more than 100 box experiments to get me to give up causality.

>> No.2102665

If you aren't taking only box B the scenario went over your head.

>> No.2102667

>>2102645
It's been a losing strategy in all observed instances. And even if you do the quantum state thing and get even odds of outsmarting Omega, there's a 50% chance that you do not. You're betting a million dollars to have a chance to win another thousand. And I thought regular gamblers were bad at this...

And also, you've then changed your strategy from thinking your way through the problem to "meh, I'll just flip a coin". Not very rational.
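The coin-flip line of play can be simulated directly; a small Monte Carlo sketch assuming Omega treats detectable randomizers as two-boxers and leaves box B empty (that response rule is an assumption; the thread floats several):

```python
import random

# Assumed rule: Omega leaves box B empty whenever it predicts a randomizer.
def coin_flip_payoff(rng):
    box_b = 0                        # randomizer detected -> no million
    take_both = rng.random() < 0.5   # heads: take both, tails: only B
    return box_b + (1_000 if take_both else 0)

rng = random.Random(0)
trials = 100_000
avg = sum(coin_flip_payoff(rng) for _ in range(trials)) / trials
print(avg)  # ~500: half the time you don't even pick up the visible $1000
```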

>> No.2102670

take one box

open it, it contains 1 million dollars.

then when omega isn't looking snatch the other box.

does my million dollars then disappear?

If omega predicted I would do this and put nothing in box /b/, then I could just not take box a, since I would gladly give up $1000 to prove a superintelligent being wrong

>> No.2102671

im actually picking a superposition of box b and boxes b and a

>> No.2102673

>>2102661

Right, there's a causal chain going from how you think about this problem to what you get from Omega. You propose to think about this in a way that leads to losing a million.

>> No.2102674

The issue here isn't so much about whether or not it's rational to take the $1,000,000, because that's an obvious choice. The problem is why you don't have access to the extra $1,000. The $1,000 is in the clear box no matter what happens, yet it's only available if you sacrifice the $1,000,000 by betting against a being with near perfect information by assuming you have free will.

You made my forehead crinkle OP. I like you.

>> No.2102676

>>2102671
Still gambling with an expected return below one. Irrational

>> No.2102681

The problem with this paradox is that people think of it as "If I were selected, I'd do this." whereas it's really "You are selected. Pick one."

The only logical decision is to take both, since it's already been determined which one will be in there. If you take only one, you are assuming that your choices now will affect what has already happened.

In all seriousness, I would consider the contest rigged, take my million dollars and leave. No one can eliminate chance entirely.

>> No.2102682

take box b, open it, if it contains the million dollars then also take a. I now have 1.001 million dollars.

take box b, open it, if it contains nothing then I don't take box a. Now I have nothing but the satisfaction that I proved omega wrong and am smarter than god.

pretty simple choice really

>> No.2102684

this is probably wrong but... why not just take box B?

if there's 0 in it, then omega predicted your actions incorrectly, which it hasn't done the last 100 times.

or do we not know shit about omega's logic?

(didnt read any other replies)

>> No.2102694

>>2102684
>>2102684
>>2102684
>>2102684

>> No.2102698

I think the most logical thing is that Omega can psychically create money with his mind based on which boxes you open. It's the only way he could be right 100% of the time.

>> No.2102701

>>2102684
Because the same logic that says "Pick B" contradicts itself. That situation already says both boxes are filled, so picking both is the real max payoff.

>> No.2102706

This question might be harder if it was

$10,000 vs $20,000

that way the risk/reward are more balanced. Of course, I'd still take box /b/

>> No.2102710

>>2102682
>>2102682

What actually happens is you superstitious guys pick B and find nothing in it, so you take the 1k in transparent box A as consolation money.

>> No.2102715

It's not necessarily a true assumption that causation is strictly linear in time. Therefore saying that the boxes are already set might not be valid reasoning. My choice may in fact cause the state of box B despite box B being "already set". Therefore, I'll choose only box B.

After all, the most I can lose is $1,000. If I chose both I'd be betting against a 100-0 record, and be risking a million.

>> No.2102718

>>2102715

this is why moot should make /religion/

>> No.2102720

>>2102701
As has been stated, there's a causal link from your thought on such matters to both what's in the boxes and what boxes you open.
I can be the guy who will take both boxes and get $1000, or I can accept that sometimes seemingly irrational decision making algorithms are quite rational, and get a million.
I'm going to have to go with winning instead of whining.

>> No.2102727

>>2102720
Holy shit... I'm now picturing the boxes in some sort of Schrödinger's cat like state... where your decision actually influences their contents.

>> No.2102735

>>2102706
It'd increase the weight of evidence for Omega being a good predictor that I'd demand before one-boxing. "I'm superintelligent and am always right about what you'll do" is somewhat fishy. Still, 100/100 might do it. It's still an expected return of one on a bet against 100/100.
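How much credence a 100/100 record should buy is itself computable; Laplace's rule of succession is one standard (and debatable) way to turn the streak into a probability:

```python
from fractions import Fraction

# Laplace's rule of succession: after s successes in n trials, estimate
# the chance of success on the next trial as (s + 1) / (n + 2).
def rule_of_succession(successes, trials):
    return Fraction(successes + 1, trials + 2)

p = rule_of_succession(100, 100)
print(p)          # 101/102
print(float(p))   # ~0.9902
```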

>> No.2102748

>>2102720
>>2102720
/palmface
You have the direction of causality reversed. Your decisions cannot possibly affect the state of the box.

In the spirit of the problem I think both teams should agree on this point. We can even suppose the decision maker doesn't know the rules. Omega drops the rule book & boxes then flies away *before* you read it.

>> No.2102764

>>2102727
You honestly don't get how your algorithm for decision making decides both what omega puts in the box and what box you open?
You rigidly adhere to a method for decision making that causes Omega to give you less money. I adopt one that causes Omega to give me more money. I can't do that if I'm planning to open both boxes as soon as his back is turned.

>> No.2102776

picking both boxes isn't even an option here, you either pick only box A or only box B.

I'm planning on picking up the clear box, so I leave the simultaneously empty and money-filled opaque one behind because I know that there won't be any money in it when I leave the room. I don't care if I just proved an omniscient entity wrong; I just don't feel like carrying around a worthless empty box.

>> No.2102784

>>2102776

why not choose box b and be a millionaire?

>> No.2102787

>>2102764
Omega has already chosen how much money to give you, dude.

>> No.2102791

>>2102776

new rule: If he predicted you'd choose ONLY A, he put 1 million dollars in b just to troll you

what now?

>> No.2102793

>>2102748
>You have the direction of causality reversed. Your decisions cannot possibly affect the state of the box.

My decision making process influences what's in the box. The actual decision to open only one box follows from the same process.

>In the spirit of the problem I think both teams should agree on this point. We can even suppose the decision maker doesn't know the rules. Omega drops the rule book & boxes then flies away *before* you read it.

That doesn't matter. I'm precommitted to choosing the option of one-boxing once I've read the rules.
Also, changing the premises of a thought experiment kind of takes the point away, which is why I will disregard the fact that I then have no reason to think Omega's game is legit (i.e. he actually predicts things well and actually gives money). Lack of such reasons makes it another and considerably less interesting question.

>> No.2102798

All you fags are wrong. You can't outsmart Omega and your actions do not affect the contents of the boxes. There's a reason it's called a paradox.

I pick the million box because I know that Omega, NOT me and NOT my decision, put a million dollars in it. How do I know he did? because I took the million dollar box that's how I know.

>> No.2102799 [DELETED] 

Am I missing something?
Because the answer seems incredibly simple...
Take both boxes..

There only seems to be a paradox if you assume that Omega is always right.. But there is no reason to assume this, so bottom line is box B is ALREADY either empty or full, your decision at this point changes nothing.

Possibilities of taking only box B:
1million dollars
0 dollars..

Possibilities of taking both boxes:
1.001 million dollars
0.001 million dollars

I cannot fathom a situation in which you would want to take only box B...
Even if Omega predicts that we will use this logic then we will probably be getting $0 in box B, but at least you still get the $1000 in box A.

>> No.2102809

>>2102784
it's too late now I already left with box A
>>2102791
well I guess I should have picked neither box and left emptyhanded

>> No.2102811

>>2102798
But you are assuming that Omega is always right.

IT IS ONLY A PARADOX if you ASSUME that omega CANNOT be wrong.
If you accept that there is some possibility that omega can be wrong then it is not a paradox.

Nothing in the original question says that he cannot be wrong, therefore it is not a proper paradox.

>> No.2102820

What happens if you take only box A?

>> No.2102827

>>2102820
Same as if you took both and B was empty

>> No.2102835

>>2102827

Except it's not one of the options.

How's this thing supposed to work if you do anything other than one of the two proposed actions?

>> No.2102838

>>2102811

The original Newcomb's problem actually says that Omega is almost surely (with a probability of one for those of you not familiar with probability theory) right.

>> No.2102840

>>2102838
I see...
The OP only stated that he was right 100 times.

But ok, I stand corrected, carry on

:D

>> No.2102857

>>2102811
he can be wrong, but you can't prove him wrong unless you take neither box or only box A. You are not smart enough to prove him wrong. It's really not up to you what happens because it has already been decided. Omega did not affect your actions, but he predicted them, and you do not have the ability to act in a way that he did not predict. (Whether or not he had predicted it you would have picked the same box, so his prediction did not affect you.) It isn't even perceivable that your actions affected the contents of the boxes, because if they had done so through some abomination of causality the outcome would have been the same.

There's no way the causality problem can be solved by something this simple, but: if A+B=C, then C-B=A. It doesn't matter which two were your previous data and which one is the final sum or difference; all the numbers are still the same, so causality by chronology is unmeasurable.

>> No.2102864

>>2102838
>probability of 1

Which is clearly impossible. If this thread proves anything, it's that there's crazies who will flip a coin.

>> No.2102908

>>2102864
So he's Laplace's demon and knows what the result of your coin flip is.
Or he just flips a coin right back at you if he predicts you'll flip a coin or measure quantum states.
And then you've got a random chance of getting nothing in the opaque box and the standard 100% of $1000 in the transparent.
Congratulations, you've turned a really good chance at a million dollars into a thousand dollars plus a truly random chance of a million.
That's easily more rational than me, who will take only one box, has as good a chance of getting the million as the ultimate predictor of human action has of getting this right, and no chance at the thousand.

Explain to me how losing is rational again?

>> No.2102929

You're all fuckwits.

See, I'd take box B. And because he's awesome, he'd totally know that, and I'd get a million dollars.

You're over thinking a very simple problem. The way to win is to be as stupid and closed-minded about it as you can.

>> No.2102945

>>2102929
Or simply realise that being rational is not about rules for thinking but rather about winning the game. Then realise that this particular game is set up so the winning move is to act stupid, precommit to acting stupid in such instances for rational reasons (I'd like to win this game), and then hope Omega notices your sincere intent to not take both boxes while doing his predictions.

>> No.2103024

I would think this super-intelligence can predict the universe, therefore no free will, therefore I would have to take box B to make him think I'll take box B and win $$$. (cause he would have known if I changed my mind and took both, and no I did not use free will to make this choice, this was the only possibility)

>> No.2103033

Seeing as box A only contains a thousand dollars, and I know this, I would choose box B, regardless of my ability to see what's inside.

The only downside to this decision is that the opaque box may harbor something that could kill me; obviously I don't know that it doesn't. >_<

>> No.2103054

>>2102908
>Explain to me how loosing is rational again?

two boxers reject the "automatic loss" premise
one boxers reject causality

>> No.2103113
File: 456 KB, 485x495, 1290042868527.png

Schrödinger's cat says hi

>> No.2103135

>>2103054
>two boxers reject the "automatic loss" premise
>one boxers reject causality

Errr, I'm betting on the supposedly superhuman predictor accurately predicting my actions. Predictions do not violate causality. Even predictions that are so good they include predictions on whether or not you'll change your mind and still open both boxes.
Unless I'm to assume that either there's no possibility of winning the million (despite the example clearly stating this not to be the case) or that I'm somehow able to outsmart a superhuman intelligence (laughable; I'm human, Omega is superhuman), the only chance to get the million is to be the kind of person who opens only one box. This is causality going the right way: my decision making algorithm is there before Omega poofs in and offers the game.

>> No.2103183

I take only box A
FUCK YOU SUPERINTELLIGENT BEING

>> No.2103187

>>2103113

SCHRODINGER'S CAT IS DEAD... well maybe.

>> No.2103189

This isn't a paradox at all. This is simply tricking a type of intelligence that exists at certain deviations. It points to the arrogance of humans to not consider how a higher being would see them. An average human wouldn't be able to surmise the consequences of being highly intelligent because they simply did not have such an experience.

In contrast, an intelligent human can estimate any obvious and glaring intellectual gaps between certain populations due to their experience which includes what it means to KNOW how something works and why they believe it -- whether they are ultimately correct or not-- because it has utility.

The average individual cannot comprehend the effects of being "smarter" because it is not in their environment (read physical/emotional/biological). A smart individual may have the mental tools to surmise from communication that the being is relatively supreme in intelligence and pick the non-obvious answer.

>> No.2103194

>>2103189
this is the most meaningless and pretentious post i have seen all week

>> No.2103620

The thing I hate about determinism is that it doesn't seem to give us any more knowledge about the decisions people make; our decision could change in a matter of seconds just because of some very arbitrary, unrelated thing that happens before our choice.

>> No.2104932

>>2103135
>the only chance to get the million is to be the kind of person who opens only one box.

If you can't see how that is backwards causation, there's no hope for you

>> No.2105151

>>2104932
It's not. See, my decision making process is set up so that I'll take one box (you should have figured this out from my answer to this hypothetical). This is the case before Omega puts money in the box, and thus something that can cause there to be more money in my box without backwards causality.
If someone was to suggest Kavka's toxin puzzle to me I'd use the same decision making process to decide to drink the toxin, and then I'd do it.

>> No.2105366

The way I see it there are two answers that depend on how powerful Omega is.
If Omega was only "almost surely" correct, then you would pick both boxes. Omega has predicted what you will do regardless of your thought process and has set up both boxes with whatever the amount is in each- whatever plan to fool yourself once in the room wouldn't change what's in the box. So if Omega thought you would pick box B, then he put 1mil in box B, and if he thought you would pick box A then he put nothing in box B all before you entered the room. You're now free to pick both boxes because the money is already in the boxes so there's no reason to pick only B. By picking both A and B you did not change anything. If you make yourself a "box B only" type of person and he knew that, then once in the room you could take both because this is based on Omega being able to be wrong.

If Omega was truly a higher deity that cannot be wrong, then you would pick box B. Whatever you pick is what he predicted so by picking box B there would be the 1 mil because it was the right choice and he knew you were going to pick it. This is based on Omega not able to be wrong, so whatever you pick is the right choice.

>> No.2105379

but hard determinism would take into account the last second change of mind wouldn't it?

>> No.2105395

>>2105379
That would make Omega unable to be wrong so it would fall under the second answer. The first one gives you the ability to fool Omega and make his prediction wrong.

>> No.2105425

>>2102480

if Omega is always right, either option ends in $1000

if he could be wrong, you could end up with $0 if you pick B when he thought you would pick both. BUT you could end up with $2000 if you pick both and he thought you would pick B.

so I'd pick both on the offchance that Omega is wrong and I get $2000. If he is not wrong, then I can still be guaranteed $1000.

>> No.2105433

Take B.

If there's nothing in it I'm taking A too because Omega is a faggot for fucking me about.

>> No.2105447

>>2105425
>if Omega is always right
then by picking B i'd prove him right and get 1 million.

>> No.2105462

>>2105447

which is awesome

>> No.2105475

>>2105447
this is the best answer

he's been right on the previous 100 occasions, so he's right a very large percentage of the time at the very least. and he presumably saw that I was going to pick B, because picking B gets you $1,000,000.

>> No.2105478

>>2102513

A million dollars at stake? You don't know what's in box B. Therefore it's not really at stake.

>> No.2105486

>>2105478
I know I'm going to pick B
he knows, because he scanned my brain and saw what I had planned on doing
so he put a million dollars in box B

always pick B. this should be obvious to anybody

>> No.2105570

>>2105486

The whole point of the experiment is that you don't know what's in box B; you aren't told what his plan is. Unless he tells us beforehand that he has 1 million in the box. However, even then, you wouldn't be aware of his intentions to remove the 1 million if you chose to take both.

>> No.2105593

>>2105570

He doesn't remove it, kid. This shit is like Schrödinger's cat.

>> No.2105598

>>2105593
>kid
confirmed for 13-year-old

>> No.2105606

>>2105478
That makes no sense. B contains either a million dollars or nothing. Therefore a million dollars is at stake.

>> No.2105608

The true paradox is this thread.

By providing the meta data of the scenario, it creates a contradiction to the hypothesized scenario, thereby making any decision or statement otherwise moot/redundant/contradictory/etc.

>> No.2105609

>>2105570

then how is it a paradox if we aren't aware of the result of the choice?

>> No.2105967

Can anyone think of a way for Omega to safeguard against
>>2102682
without creating the possibility for Omega to cheat? (For example, being forced to register your choice before observing the contents of box B would stop it, but that would also allow Omega to set up a mechanism to alter the contents of box B based on the choice.)

>> No.2105982

Neither

>> No.2106000

>>2105967
Set it up so that box A self destructs when you open box B. Stick the following note on top of both boxes: "If you do choose to open both boxes, start with box A and remove the moneys before opening your complementary empty box"
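That safeguard can be modeled as a tiny state machine; a sketch of the rule as described (class and method names are mine):

```python
# Model of the proposed safeguard: opening box B destroys box A, so the
# "open B first, then grab A" exploit from earlier in the thread nets nothing.
class Boxes:
    def __init__(self, b_contents):
        self.a = 1_000
        self.b = b_contents
        self.a_destroyed = False

    def open_a(self):
        return 0 if self.a_destroyed else self.a

    def open_b(self):
        self.a_destroyed = True   # self-destruct trigger on box A
        return self.b

# Sneaky player against an Omega that (correctly) predicted two-boxing:
boxes = Boxes(b_contents=0)
loot = boxes.open_b() + boxes.open_a()
print(loot)  # 0: box B was empty and opening it vaporized box A
```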

>> No.2106006

I don't get this, how is it a problem or a paradox?

If the being has been correct 100/100 times, and I am made aware that if the being is right, and I choose only box B, then I get a million dollars. Then why wouldn't I choose box B?

Statistically speaking, there is more than a 99% chance that box B contains a million dollars. Even if I were to completely disregard statistics, if I were to choose box B and receive nothing, then doesn't that mean that I outsmarted a super-intelligent alien?

Like seriously, why would anyone not pick box B when the alien has been right 100/100 times so far?

>> No.2106028

I take only B and open it. if there isn't 1 mil in there, I go take A and leave with my thousand. problem solved.

>> No.2106034

So then, there are four possible outcomes. The most probable outcomes would be 1 & 4.

1) You choose Box B and there is nothing, so you take Box A as well.
2) You choose Box A and take Box B as well (or you can ignore it and leave it be), which has nothing.
3) You choose Box B and there is a million.
4) You take Box B, and there is nothing, so you leave Box A alone, and break the super-intelligent alien's 100-0 streak, which certifies you as smarter than a super-intelligent alien and strokes your ego eternally.

I think knowing that I outsmarted a super-intelligent alien from a different galaxy is worth more than a million dollars.

Personally

>> No.2106071

I take box A, smash it on box B, check what's inside, and then kill Omega with the shards.

>> No.2106072

Kill omega, take all the money.

>> No.2106123

Also, are we saying that Omega is always 100% right, or that he's just been 100% right SO FAR? Because if he's just statistically correct 100% of the time, it seems like that has no bearing whatsoever on the outcome, because he's already determined the outcome himself. Furthermore, why doesn't he stay to see which one you pick? How can he know for sure that you actually did what he predicted?

Here's my theory: Omega is actually a giant douchebag looking to stroke his ego, and the only reason he has a perfect record of being correct is that he just assumes he's right and moves on, being the giant douchebag he is. If anything, we can say that we've observed him behaving this way 100% of the time, so we have more of a reason to believe our own intuition than his fudged statistic about being a super-smart know-it-all.

>> No.2106131

>>2106006
>Like seriously, why would anyone not pick box B when the alien has been right 100/100 times so far?

because transparent box A with the clearly visible 1k will forever haunt your dreams. You'll always wonder what really stopped you from taking both.

>> No.2106144
File: 39 KB, 450x599, 1275210983704.jpg

Being right or wrong is not up to Omega at all; it's up to the subject. The subject chooses whether Omega is right or wrong. Assuming everyone opens box B first, there can either be a million or nothing inside.

If there is a million, the person can then grab box A, and thus Omega will be wrong.
If there is nothing, the person can ignore box A, and thus Omega will be wrong.
There are only two ways Omega can be right: the person grabs box A as a consolation prize after opening box B and realizing it's empty, or the person opens box B, finds a million, and then chooses to ignore box A, thus allowing Omega to be right.

Whether Omega is right or wrong is a decision completely up to the subject; if these tactics are applied, Omega will be right 0% of the time. What a silly game. Omega has no chance against those who wish to win.
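The tactic this post describes can be written out as a tiny sketch. Note the key assumption, which is exactly what the rest of the thread disputes: the chooser gets to see box B's contents before committing, which the standard setup forbids.

```python
# "Contrarian" tactic from this post: peek inside box B first, then do the
# opposite of whatever Omega's prediction implied. Assumes peeking is allowed.

def contrarian(box_b_full: bool) -> str:
    """Return the final choice that falsifies Omega's prediction."""
    if box_b_full:
        # B full means Omega predicted "only B" -> take both instead.
        return "both"
    # B empty means Omega predicted "both" -> take only B instead.
    return "only B"

for b_full, prediction in [(True, "only B"), (False, "both")]:
    choice = contrarian(b_full)
    print(prediction, choice, prediction == choice)  # always prints False
```

Under this assumption the prediction never matches the choice, which is the post's "Omega will be right 0% of the time" claim; remove the peek and the tactic collapses back into the original dilemma.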

>> No.2106154

>>2106144

That's what I've been trying to tell them, but they think 100 successful attempts = all logic goes out of the window

>> No.2106162
File: 20 KB, 204x203, iocaine_powder.jpg

>>2106000
But that still leaves room for cheating. If you take the money from box A first, the cheating apparatus disappears. But if you open box B first, the cheating apparatus transforms into a million dollars.

A possible safeguard against cheating from Omega's end is the approach of
>>2102547
where a trusted agent other than you knows the contents of the box before your choice. Then you can register your choice before looking inside box B, knowing full well that registering your choice cannot change the contents of box B without exposing Omega as a fraud. Registering "only B" would destroy box A; that prevents cheating from your end. But there's still the possibility of cheating from the trusted agent's end, by signaling to you what's in box B. Presumably Omega could scan the intentions of the trusted agent, but if Omega were allowed to reject agents he didn't like, that would let him forbid any agent that wasn't collaborating with HIM. So we can clearly not choose the trusted agent approach!

>> No.2106428

>>2106162

So much thought and time put into something that makes you look so stupid.