# Afternoon Inquisition 1.22

Well, I guess it’s time to dust off an old classic for today’s Afternoon Inquisition: Newcomb’s paradox.

It’s simple. It’s thought-provoking. It’s revealing. And besides, I couldn’t think of anything else to ask you all today.

So . . . . Newcomb’s Paradox:

A highly superior being presents you with two boxes, one open and one closed. In the open box there is a thousand-dollar bill. In the closed box there is either one million dollars or there is nothing. You are to choose between taking both boxes or taking the closed box only.

But there’s a catch.

The being claims that he is able to predict what any human being will decide to do. If he predicted you would take only the closed box, then he placed a million dollars in it. But if he predicted you would take both boxes, he left the closed box empty. Furthermore, he has run this experiment with 9999 people before, and has been right every time.

What do you do?

This may seem like an easy problem, but there is a lot going on here. Let’s see you wrap your brain around it. And please include an explanation of your choice.

### Sam Ogden

Sam Ogden is a writer, beach bum, and songwriter living in Houston, Texas, but he may be found scratching himself at many points across the globe. Follow him on Twitter @SamOgden

1. Me? I think I’d choose both boxes, so I could get the open box. \$1,000 in the hand is worth a million in the bush.

Well, you know what I mean.

Besides, if the highly superior being (who is that? William Shatner? I’m going to pretend it’s William Shatner) is always correctly predicting, that’s the only way I see any scratch, right?

2. Careful. Bill Shatner might change the rules on you though. That’s how he beat the Kobayashi Maru.

3. @a.real.girl:
But if he is always right then the closed box would contain \$1,000,000, so I would choose the closed box only. So either I get \$1,000,000 or I prove an omnipotent being wrong. Sort of like the JREF prize in reverse.

4. Maybe I’m just thick, but I don’t see where the paradox is. The only question I see is knowing whether the creature is telling the truth about being so accurate. If he is, then you take the closed box only, because he would have predicted you would have taken it, and it would have the million.

And who’s this Newcomb person anyway?

5. Closed box.

I split it into two options:
a) take both (1,000 or 1,001,000)
b) take closed (0 or 1,000,000)

It knows what I’m going to do, so apply the conditions of its prediction to each option:
a) take both (1,000)
b) take closed (1,000,000)

1,000,000 >> 1,000, so take the closed box and hope it doesn’t screw up its prediction.
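That two-step elimination can be written out as a tiny sketch (a toy model, not part of the original comment, assuming the predictor is never wrong, so each choice collapses to a single payoff):

```python
def payoff(choice: str, predictor_correct: bool = True) -> int:
    """Dollar payout for a choice, given whether the prediction held."""
    if choice == "both":
        # Two-boxing was foreseen -> the closed box was left empty.
        return 1_000 if predictor_correct else 1_001_000
    # One-boxing was foreseen -> the closed box holds the million.
    return 1_000_000 if predictor_correct else 0

# Under the perfect-predictor assumption the comparison is trivial:
print(payoff("both"), payoff("closed"))  # 1000 1000000
```

Dropping the `predictor_correct=True` assumption is exactly what reopens the debate: then "take both" dominates row by row.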

6. @A.Real.Girl and Sam_Ogden: Or worse, it might be Bill Shatner as “Big Giant Head” from Third Rock. Then, all bets are off. :-D

7. I already have zero. So I might as well go as bold as possible and take the closed box. There’s absolutely no risk in taking the closed box, except walking away with zero.

8. Thickness is running rampant today because I don’t get the big paradox either. What I would do is flip a coin. “Predict this, Gazoo!”

9. “The being claims that he is able to predict what any human being will decide to do. If he predicted you would take only the closed box, then he placed a million dollars in it. But if he predicted you would take both boxes, he left the closed box empty. Furthermore, he has run this experiment with 9999 people before, and has been right every time.”

Well, if he is able to predict what I will do, then I will take both boxes, because obviously, he knew I would predict both boxes.

(In reality, I’d probably choose the open box, because I’m a chicken shit that doesn’t gamble. I used to fly to Vegas every other month for nearly two years (ltr), and I gambled all of ONCE, and not with my money.)

10. Er, I read that wrong. Woo. Obviously I’d do what Hanes said, since if he can predict what I’ll choose, you always choose the best option.

11. I’m a violent, unpredictable, second-cousin of a monkey. I’d shoot the bastard, take both boxes, and then go straight to the tabloids selling photos of the body. Even if the closed box is empty, I’ll make a million dollars easy.

12. I would choose the closed box. If professor chaos is so smart he knows I would do that and I get the million. If he isn’t and he is wrong then Dr. Impossible will need to rewrite his thesis.

13. Simple, I just tell him “NO DEAL!” and then I get the option to open one more box… then I know what’s in it!

14. @davew: The Great Gazoo! I haven’t thought of him in YEARS! Thanks for the reminder!

For what it’s worth, Sam, I would have taken two boxes, so I fit into one of the two most common categories of answers.

And I’m a nice enough guy not to shoot the alien – after all, he just gave me some money and confirmed (to me, at least) that there is intelligent extraterrestrial life. I might ask him to come along to NASA with me, though. Then I’d be rich and famous, too. :-D

15. I think that what you have to remember here is that the alien? He’s crazy. He’s a damn Santa Claus going around saying to you, “Don’t worry, I know you’ll pick correctly! Please! Take my money!” So you take the closed box, and here’s why: if he’s wrong, he’s this jolly fellow joyfully giving his money out anyway (I mean, the guy’s gotta have given out a minimum of 9,999,000 dollars already), so he’s going to give you the money. He’ll probably even apologize. Take the closed box, because even if somehow his predictive abilities are wrong and you wind up with nothing, he’ll probably give you the thousand bucks anyway. He’s already given out almost ten million minimum; what’s another thousand to the guy?

16. When I start I have nothing, I have not chosen any box.

I think I have the possibility of four totals:
\$0, \$1,000, \$1,000,000, and \$1,001,000

So the alien has stated he has not missed in 9999 tries at predicting. So he’s basically saying I can’t choose in a way where he is wrong. This pisses me off a bit, because I would like to think I have the ability to choose something this arrogant alien bastard can’t predict.
So I am left with the choice: do I participate in this demonstration of my lack of free will, or do I just say fuck it and go get a coffee using my own money but with my free will intact? I choose not to participate.

17. @Elyse: Brilliant! But there was nothing in the description about the boxes being held by curiously artificial women in heels. That might have moved my decision-making center by about three feet.

18. Well if I take both boxes I’d get either 1000 or 100000, but if I took one box I’d get 0 or 101000. So I’d take both so I’d at least get something. His prediction seems irrelevant because people behave how they behave, he just happens to be right about it. So you’d also have to predict how other people would respond. If other people didn’t come to the same conclusion as me, I’d get 101000. So I’d be banking on other people being stupid, I guess. Maybe that’s not the best strategy… :)

19. I fucked up some of those numbers I think, whatever you get the gist…stop asking these after work! :) Bastards…

20. I don’t see how the number of times he’s successfully predicted the outcome has any impact on my decision.

I disregard the only potential zero payout of taking the closed box alone, and take both.

The only way I walk away with nothing is if I don’t take the open box. If I leave the closed box I definitely don’t get the million, but if I take it as well there’s a chance I might get it.

I’m not sure if I’ve understood this, as it seems like a bit of a no-brainer to me.

21. There is no paradox; we have experimentally established that (within a statistically significant sample of humans) he has 100% accuracy. Ergo, it doesn’t matter which you choose, because Shatner will have known which one you will choose anyway. The question I have (since we have established that he is ‘never’ wrong) is: do we sample subjects have any free will in the matter?

22. I’m a cheater too and read up on it on Wikipedia – there the paradox includes the fact that the being’s predictions are “almost always” right, not that he’s been right about every one before…

23. But the Predictor’s prediction is specific to you, and he is very, very accurate. So if you chose both boxes, the Predictor would have predicted this and left the closed box empty.

24. If the cash has already been placed in the boxes before I got there, no amount of hemming and hawing on my part is going to change what’s in them.

I take both. The closed one either has a million bucks or it doesn’t, and my decision at that point can’t possibly affect its presence or lack thereof.

And if I find a dead cat in the box, that’s two paradoxes with one stone!

25. Closed box for me.

If he has always been right, me choosing the closed box means he predicted it, which means I get the million. According to the scenario, choosing both would never yield anything more than \$1,000.

Oh and another thing, where would you have to go to be able to USE the 1000 dollar note? It doesn’t exist in actual current circulation. Give me the million, at least it stands the chance of being in a usable denomination free of suspicion from banks and businesses.

26. My solution was pretty much the same as Scott Aaronson‘s:

Here’s you, and here’s the Predictor’s computer. Now, you could base your decision to pick one or two boxes on anything you want. You could just dredge up some childhood memory and count the letters in the name of your first-grade teacher or something and based on that, choose whether to take one or two boxes. In order to make its prediction, therefore, the Predictor has to know absolutely everything about you. It’s not possible to state a priori what aspects of you are going to be relevant in making the decision. To me, that seems to indicate that the Predictor has to solve what one might call a “you-complete” problem. In other words, it seems the Predictor needs to run a simulation of you that’s so accurate it would essentially bring into existence another copy of you.

“God, this guy is being a dick. He’s like the villain in a bad Star Trek episode. What was that one with the superpowerful being who turned out to be just a child? Oh, yeah, “The Squire of Gothos”, with Trelane, set in 2267. Is that a prime number? . . .”

Let’s play with that assumption. Suppose that’s the case, and that now you’re pondering whether to take one box or two boxes. You say, “all right, two boxes sounds really good to me because that’s another \$1,000.” But here’s the problem: when you’re pondering this, you have no way of knowing whether you’re the “real” you, or just a simulation running in the Predictor’s computer. If you’re the simulation, and you choose both boxes, then that actually is going to affect the box contents: it will cause the Predictor not to put the million dollars in the box. And that’s why you should take just the one box.

27. @Sam Ogden: Oh so that’s the “paradox” – just that we’re screwed either way. :)

@jtradke: Yeah that’s sort of what I was thinking. My subsequent choice doesn’t affect the box, he already made his prediction. I’m assuming he made the prediction based on something (like observation of human behaviour) and not just magical deity “knowing”.

28. I’d take the closed box and get the million. That’s what the being predicted I’d do anyway.

29. @QuestionAuthority:

You’re welcome. I had a hard time choosing between Gazoo and Marvin. I finally just flipped a coin.

30. @Kimbo Jones: Any sufficiently advanced technology would be indistinguishable from magic. And besides, he knows what you’ll do regardless of the mechanism, so it’s not like you have a choice either way.

31. My response is based on the specific dollar amounts that were chosen.

US\$1,000.00 more or less wouldn’t make a whole lot of difference to me. It would pay off 1/15 of my credit card debt. Big deal!

US\$1,000,000.00 on the other hand would be a major life-improving bonus, even after taxes.

Therefore I would take a chance, and just take the closed box.

32. (Assumption: SB is honest about its representations.)

If the superior being (SB) is correct about me in its predictions because SB assumes I am rational, then SB will predict that I will choose the option that will, given SB’s only two potential actions, provide the greatest payoff – i.e., closed box only (\$1,000,000). So should I do the rational thing based on the assumption that SB predicts that I will be rational?

On the other hand, 9999 people could not all have acted rationally, and yet, SB was correct in its predictions about them. So I am skeptical that SB would assume that I am rational in making its prediction. That is, I am only as rational as the 9999 others.

Hmmm.

I will take closed box only … Do I win?

33. I switched boxes when your back was turned!! You FOOL! You just fell victim to one of the classic blunders.

It doesn’t matter what he predicted. My best odds are with picking both boxes. Can we combine this Paradox with Pascal’s Wager?

34. @Sam Ogden: The question is moot given we only act as though we have free will. In other words free will and determinism are not allowed to get in bed together and really shouldn’t even flirt or make passes at each other. I prefer the ‘Monty Hall problem’ because it makes me feel like I have a part in the decision.

http://en.wikipedia.org/wiki/Monty_Hall_problem

35. If the Predictor is always right about what people will do, then *everyone* taking both boxes gets only \$1K, and *everyone* taking only one gets \$1M. So I’d take only the single, closed box.

It’s not important to this idea that there’s dispute between the two possibilities (plus the “it’s an incoherent question” possibility). It only matters that I’d obviously come to this conclusion, and the Predictor knows that.

Of course the Predictor may not always be right, but it seems the way to bet.

36. Another thing that concerns me, personally at least, is well… I live in New York. A thousand dollars, that’s nice and all, but when all is said and done, it’s about a month’s rent. Nice, sure, but I can do without this higher intelligence giving it to me. One thousand dollars, that’s going to help me out for a month, maybe two. A million, that’ll change my life. If I am to have a bit of trust in this guy’s marvelous predictive abilities, then for me there is no choice. I take the closed box, because if I’m wrong and his predictive abilities aren’t all they’re cracked up to be, I miss my chance to get a grand. If I take both and I’m wrong about his predictive abilities, I wind up missing out on nine hundred ninety nine grand.

37. @Steve: I just noticed that tree lobsters provides a link to skepchick but skepchick doesn’t provide a link to tree lobsters. I hereby suggest a new AI: why do the skepchicks hate treelobsters?

38. Based on the wording given to me, I don’t see why I wouldn’t choose only the closed box, assuming the paradox isn’t whether or not to trust the supreme being. If he’s trustworthy, then being correct 9999 out of 9999 times before me is pretty damn good evidence that he’ll have me pegged as well. I’d have to be pretty full of myself to believe I can beat that. So the only logical choice is the only one that guarantees me a million dollars by his logic.

@Gabrielbrawley:

Are you kidding? I love the Tree Lobsters.

Steve, you should let Jen and Amanda know when you create a new one, so they can link to it in the Quickies.

It is a chicken-and-egg problem.

If I think I could be the one and only person to fool the SB (i.e., my independent free will throwing chaos into the universe), then I would pick both boxes because if SB gets it wrong, SB would have predicted that I would choose the closed box only and put a \$1,000,000 in the closed box – yielding me \$1,001,000.

On the other hand, if I don’t believe I can fool the SB (i.e., determinism), then I would maximize my two potential outcomes and pick the closed box only – yielding me \$1,000,000.

Like I said, I’m going with the latter.

41. @Gabrielbrawley: I’m sure they don’t have the TL. Probably just didn’t get around to adding the link.

@Sam Ogden: I’ve got them scheduled to publish themselves every 4 days at midnight EST but, yeah, I suppose I could send the link on the suggestions page.

Currently (half-heartedly) trying to get Phil Plait to “guest write” a few, given what he said on his Bizarro post.

42. I borrow \$100,000, take out a \$3 million credit default swap to hedge the outcome of his prediction, then pick the closed box. Either I end up with \$900,000, or, if he predicted wrong, \$2.9 million.
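The arithmetic behind this hedge checks out, as a quick sketch shows (hypothetical figures: it assumes the borrowed \$100,000 is the full cost of the swap and that the swap pays out only when the prediction misses):

```python
LOAN = 100_000           # borrowed; assumed equal to the swap premium
SWAP_PAYOUT = 3_000_000  # pays out only if the predictor is wrong
BOX = 1_000_000          # contents of the closed box when predicted right

# Predictor right: closed box holds the million, swap expires worthless.
right = BOX - LOAN
# Predictor wrong: closed box is empty, swap pays out.
wrong = SWAP_PAYOUT - LOAN

print(right, wrong)  # 900000 2900000
```

Either branch beats both unhedged outcomes, which is the whole trick: the swap converts the predictor’s one failure mode into the bigger payday.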

43. Being from a superior being, am I even able to open the closed box if I pick it?

44. @eli54: Yes, while his prediction abilities are superior his box making skills are sadly inferior, the box is made of tissue paper and wishful thinking.

45. @Gabrielbrawley “I was trying to be funny.”: Oh, I know. And I’m sure they’ll get around to adding a link sometime, when they’re not too busy. I don’t want to be a burden, you know, but a link would be nice… or maybe call now and then to let me know they’re OK… or a Perihelion card…

46. It seems to me it’s a win-win situation for you, since you can either play it safe and get \$1,000, or risk it for \$1,000,000. So my charge is that by issuing the problem, Newcomb issued no problem at all. But since it’s just a thought experiment, it causes an issue by not being real; and if it were made real, whoever funds the paradox would have a problem: they’d be out at least \$1,000.

47. Obviously, since he’s an alien, he wants to destroy humanity and take over the earth. Therefore, he’s lying and the closed box always contains a bomb that explodes when it’s opened.

So I’d take both boxes and regift the closed one to someone I don’t like.

48. Wait a minute… has the supreme being predicted the SAME action out of everyone? or has he made a unique prediction for each person? I thought it was the latter.

49. Ok, seriously, assuming that the Supreme Omnipotent being is not Howie Mandell, I have to say I’m a million dollar girl myself.

50. @notreallyalice: “It doesnâ€™t matter what he predicted. My best odds are with picking both boxes.”

Choosing both boxes only gains you the best payout when you don’t care if SB is right or wrong.

Statistically you’re better off choosing between correct predictions of \$1,000 (A+B) or \$1,000,000 (B) which is easy.

51. A superior being would make the boxes out of Transparent Aluminum ™, so, paradox? Not so much.

OR…since the being is always right, you could as easily state the “problem” backwards. If you choose just the closed box, the being will have already put \$1,000,000 in it. If you choose both boxes, he will have already have left the closed box empty. I don’t see why anyone would choose to do anything but take only the closed box. Or perhaps I should say “take,” since “choice” in this case is an illusion.

52. @blake_stacey: “God, this guy is being a dick. He’s like the villain in a bad Star Trek episode. What was that one with the superpowerful being who turned out to be just a child? Oh, yeah, “The Squire of Gothos”, with Trelane, set in 2267.”

You got it in one. Yes, he is a dick… or rather an adolescent Q. Didn’t you hear? If the Q (or especially, Picard’s Q) is involved, you’re probably screwed no matter what you do. He obeys no rules and is a trickster. :-D

53. @Reverend Kel:

Scotty is a superior being? He certainly was a … well, …large…being by the end of the Trek movies…

54. @jrpowell: “I borrow \$100,000 take out a \$3 million credit default swap to hedge the outcome of his prediction, then pick the closed box. Either I end up with \$900,000, or if he predicted wrong, \$2.9 million.”

This has to be COTW. Basically beat the Paradox. Brilliant.

Hell, it’s got to be a contender for COTY

55. 2 choices:

you choose to believe him, in which case you take the \$1,000 because you’re screwed either way

or

you think he’s lying, in which case you still take the \$1,000 because if he’s lying, he doesn’t have \$1,000,000 anyway.

56. @James Fox: “The question is moot given we only act as though we have free will. In other words free will and determinism are not allowed to get in bed together and really shouldn’t even flirt or make passes at each other. I prefer the ‘Monty Hall problem’ because it makes me feel like I have a part in the decision.”

The MHP is Fracking Sweet (what? Galactica season 4 started back on TV this week, MrsS and I had a catch up session tonight) because it’s simple maths and anyone can get it.

The first time I heard it, I didn’t get it, someone explained it to me, I still didn’t get it, but because it’s so simple you can think about it. I thought about it for about two weeks, turning it over in my head until I did get it. And there’s no better feeling in the world than knowing you understand something.

The only more satisfying solution is when you solve the “Doors to Heaven and Hell Problem”, which took me the better part of 3 months of pondering. I was literally punching the air when I got it, because when you have the answer you realise there’s no way it could be the wrong one. A very good problem indeed.

57. @DanielMcL: He’s been right 9999 times in a row, with no misses. I’d bet everything I own on *that* horse. This means that every single person who chose just the closed box is now a millionaire. What’s the better assumption here? That he’ll be right again, or that he’s “due to miss”? If you take both boxes on the theory that he has to be wrong soon, you’re betting your million bucks against a grand with only one chance in 10,000 of winning, and winning means increasing your prize money by one-tenth of one percent. Woo-hoo, party.
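The betting argument above can be made concrete with an expected-value sketch (purely illustrative: it assumes the 9999-for-9999 record translates into a 0.9999 accuracy going forward, which the problem itself never states):

```python
def expected_value(choice: str, p: float = 0.9999) -> float:
    """Expected payout in dollars, given predictor accuracy p
    (an assumed figure read off the 9999-for-9999 track record)."""
    if choice == "closed":
        # The million lands only if one-boxing was correctly foreseen.
        return p * 1_000_000
    # A guaranteed grand, plus a slim shot at a mistakenly filled box.
    return 1_000 + (1 - p) * 1_000_000

print(round(expected_value("closed")))  # 999900
print(round(expected_value("both")))    # 1100
</antml>```

At any accuracy above roughly 50.05%, one-boxing comes out ahead on expectation, so the conclusion is not sensitive to the exact figure assumed.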

58. Since the thousand dollar bill was last printed in 1945 and is no longer in circulation, what kind of currency does this Highly Superior Being use for the \$1 million? A million dollar bill? Good luck trying to cash that.

I would take just the closed box and never open it. I would show the unopened box to my skepchick friends, tell them the story and then see if this effect is real or not.

http://www.ehbonline.org/article/S1090-5138(08)00117-7/abstract

59. @DanielMcL: Ah, understood. I must have missed your post on the original paradox. However, I’m working from only the information given at the start of this thread, hence *my* reply. Thinking about it, though, I don’t believe the difference between “certainly” and “almost certainly” would cause me to adjust my conclusion. Which, hey! is the same as yours! So we’re *both* millionaires (in theory).

60. The “paradox” here, like that of the single-shot Prisoner’s Dilemma, comes solely from the fact that you’re locked in a “problem box” with an absolutist assumption.

Never mind caveats about being only “almost” always right — the point here is that under the highly artificial terms of the problem box, your choice is known beforehand to the “box master”.

But… given you’re already assuming arbitrary powers for the box master, that’s functionally equivalent to its teleporting the bills out of the closed box, the moment you reach for the open one!

So, if, and only if you accept the terms of the problem box (as opposed to simply disbelieving them), choosing the closed box is the only logical response. Essentially, the problem reduces to whether you’re willing to accept those (patently unrealistic) terms. And I, for one, would want to see some “extraordinary evidence” before I did accept them….

61. This is a really interesting problem. I think the most entertaining aspect of this, however, is that regardless of what you choose, that choice was always* the best choice you could have made.

The predictor knows that people who choose both would not have chosen only B, so \$1,000 was the highest they could possibly receive.

At the same time, the predictor also knows who is going to choose only B, meaning that they would never choose both, meaning the 1,000,000 they received is the highest amount they would ever get.

* – This is of course ignoring the slim chance that the predictor is wrong about you.

At the same time, it makes me wonder what this says about people who choose their answer on the expectation that the predictor could be wrong. Are they lotto players? Are they inherently distrusting?

What about people who choose A and B because it’s the only option that guarantees money?

62. @David Harmon: “So, if, and only if you accept the terms of the problem box (as opposed to simply disbelieving them), choosing the closed box is the only logical response. Essentially, the problem reduces to whether you’re willing to accept those (patently unrealistic) terms. And I, for one, would want to see some “extraordinary evidence” before I did accept them….”

We’ve already been given that the offer is being presented to us by a “highly superior being.” Within the frame of logic defined by the original premise, I’d likely be willing to accept that each part of the offer is true. Now the question is: am I right or wrong? The difference is either a not-great amount of money or a life-changing amount of money. Worth the risk, to me at least.

63. I say that there is no omniscient being and he can’t be right ALL the time. So, since he’s been right 9999 times thus far, the odds are in my favor that he’s about to display his 0.01% fail rate on MY choice. So both boxes, alien arguments from authority be damned…

64. I think in this instance I’ll fall back on House MD 101: everyone lies. The odds that someone is capable of the feat he claims are slim, and thus I feel I can disregard it. And even were it true, better a grand and a box than just a box.

65. @Sam Ogden: Hey Sam!! Get your toady ass on Facebook. The girls say it’s cool. (Ok, Elyse said it’s cool and she’s a girl…, and cool.)

66. If Sam doesn’t join Facebook, he gets \$0.

If Sam joins Facebook, he may or may not get \$1,000,000.

What should Sam do?

67. I first came across this problem when Eliezer Yudkowsky talked about it at Overcoming Bias. He made a convincing argument that you should take the closed box only.

I agree with his assessment.

68. Don’t listen to them, Sam. Facebook will suck the last of your free time down an ever-narrowing funnel of status updates and application requests. Its sheer mass distorts the fun-leisure continuum, leaving you helpless and unable to break its grasp as you are pulled inevitably over the time waste event horizon.

Also, make sure you friend me if you join.

69. My decision-making process has to include all the information I’m provided.

“A highly superior being” – how do I know he’s highly superior? Did he tell me? Has it been in the news? Or does it say he’s highly superior in this book he wrote, you should read it, it will change your life?

“The being claims that he is able to predict what any human being will decide to do.” – again, how do we know he’s not flat-out lying? Where are the records? The statistics? The peer-reviewed research?

I think picking both boxes is my best course for the following reasons:

1. A thousand bucks I know. It won’t solve all my problems, but it will help a bit. A million bucks isn’t real money. I’ve never met a single person who has a million bucks. Real money (\$1,000) beats fake money (\$1,000,000) in my ledger.

2. A claim that he is “able” to predict my behavior doesn’t mean he actually has. Maybe he guessed. Maybe his simulation was flawed. By taking both boxes, I’m guaranteed my real money while still retaining a potential for the fake money to exist as well.

3. Regardless of the existence of the fake money, I can sell the boxes on eBay. “Box granted by the Highly Superior Being! A real collector’s item – only 10-20,000 in the world!”

Now, if opening the second box knocked a fat guy off a bridge onto a guy deciding whether a 70/30 cash split was a fair enough deal, I might make a different decision.

70. It’s clear what to do: you take both boxes. The being may or may not have predicted you would take both, but by the time you’re presented with the boxes it has already been decided whether or not box B contains the million. So it makes sense to take both: box B’s contents are already decided, and taking both means you either have 1,000 or 1,001,000.

71. Take the closed box. If there is a million dollars in it, then you have a million dollars. If there isn’t, you get the satisfaction of telling the ‘superior’ being that they were wrong.

72. I think the Superior Being goes backwards in time at least as long as its game is played. So actually, everybody playing the game puts money in the boxes, and it has made a lot of money.

73. I thought it was a paradox at first, but on reading jtradke’s comment it becomes clear. If the money’s already been placed in the boxes before you get there, then there is no paradox. The money is there or it isn’t. Unless the alien is using teleporters or time travel, there is nothing he or you can do to change whether or not there is money in the box. Take both.

74. Actually, that’s assuming we’re living in this world of rationality and reason. The ability to predict perfectly takes us out of this universe and into a universe that doesn’t obey our laws and may not even have the same ideas of physics, etc. In a world that is identical to ours except for this alien’s magic powers, you choose the closed box; in our universe you choose both boxes. In a universe that has its own completely different laws of physics, time, causality, etc., the solution could be anything. My vote is to ride the invisible pink unicorn to the sherbet kingdom and form an alliance with the rainbow ponies to bring an end to war and famine. Then we drink.

75. The way I see it, if we’re having to accept the assertion that this being is indeed superior, and that while not psychic he has abilities to understand and make informed predictions with so high a degree of accuracy that they amount to roughly the same thing (I believe that’s the underlying premise, being hypothetical and all), then the million smackers is already almost beyond reach: his accuracy is 9999 in 10000, so you have a 1 in 10000 chance of getting the million. So I say take both boxes and guarantee a \$1,000 win, rather than gambling the \$1,000 at odds of 9999-1 to win the mil (which is only 1000x the stake anyway).

In short, I’d consider this a gift of £1,000 (I get to choose it comes in sterling, right?!) and set aside the notion of the mil altogether.

The only remaining question is whether the closed box has a dead cat in it…

76. Gabrielbrawley: You are right. I did make a choice, but it was a choice outside the original paradox. Now if our alien friend says, “You will choose not to participate in choosing boxes,” well, then I would be impressed.
This paradox also bases its idea on the fact that people want money, and that there really is nothing to lose. If you are completely wrong you come out even.
He makes it easy to be drawn into choosing a box. What if every time the predictor has to make his guess he says, “You will choose either a single box or both”? He could be correct every time until someone chooses not to take a box.

Well, it was fun to think about.
Peace

77. Does this have something to do with derivatives and the stock market? I bet this alien giving up money is the reason for the economic crisis; he already devalued money by giving up a huge amount of it. So I would take the closed one and spend the money very quickly, before the supreme being does this on 1,000 more people and makes the million dollars only enough to buy a gum or something. :-)

78. I could go on ad infinitum about trying to figure out what he wants and then do the opposite, but I’m not Vizzini the Sicilian from “The Princess Bride”.
OK, seeing as how I do not think 9,999 humans would make the same choice, the Big Giant Head has something going on. I let my cat choose. If he can read me like a book, I need to remove myself from the choice. Optionally I flip a coin, but any time I can get my cat to do something, I take it. I either win big or lose nothing, so what the heck.

79. The solution to the problem is unique to each person presented with it, depending on one’s financial situation and one’s “faith” in the Predictor’s accuracy. More weight will likely go to the chooser’s financial situation — here’s why:

If one believes the stated “fact” that the Predictor’s demonstrated accuracy is very high, one must choose the closed box whether or not one is in dire need of \$1,000, because

a) If you don’t need the bucks and Predictor IS highly accurate, choosing the closed box gives you the BEST chance for a big payoff.

b) If you DO need the bucks, you can’t afford to take a chance you might be one of the Predictor’s few errors.

If one DOES NOT believe in the Predictor’s claimed accuracy, and does not need the bucks, you should STILL choose the closed box because it affords the highest payoff. As someone has already pointed out, \$1,000 is a nice weekend getaway, but it’s not a “life-changing” sum like \$1 million.

On the other hand, if you need the bucks and you think the Predictor is a quack, there’s no question, take what you can see.

For me, my financial situation is such that if I don’t get a \$1,000 I’m not gonna lose sleep over it. I’ll take the chance the Predictor has pegged me for the greedy bastard I am…

–Boomer

80. It comes down to whether or not you believe the Supreme Being when he tells you that he’s never made a mistake yet. If he’s honest/actually clairvoyant, 9999 out of 9999 is as good a streak as you could hope for, and you’d be an idiot not to take the closed box for the certain million.

If you think he’s a liar, or maybe fudging the numbers, then it doesn’t matter which you pick, because the contents of both boxes are already predetermined, and you should pick both.

There’s also the third, less-talked-about option, where you uppercut him in the junk, steal his identity and empty the supreme bank account.

81. Why would this Newcomb guy, or William Shatner’s head, tell me that he placed \$1 million in the closed box and then ask me to choose between both boxes and the closed box? If I know he is going to put the \$1 million in the closed box, I am always going to choose the closed box; with that information there seems to be no paradox. Not knowing what he is going to do may be a paradox, but even then I would still choose the closed box, because I would have a 50/50 chance of getting \$1 million. I don’t quite see the paradox here. I am going to install an anti-logical-paradox shield in my brain so my head won’t explode when I am asked questions like this.

82. @neverclear5: Ultimately I think the perfect prediction aspects are irrelevant. The problem is that by being the type of person that would make the choice for both boxes, you lose out on the million dollars. If you were the type of person who didn’t, you would get it. This is the paradox. Assuming the being is accurate, your seemingly rational choice will actually get you a worse outcome than someone who chooses the closed box.

I think it becomes more clear if you first forget the superior being and perfect prediction aspects.
Suppose instead that I am playing this game with you, and have with others in the past, scoring a 90% hit rate. Let’s say I’m (unknown to you) achieving this by reading threads on Newcomb’s paradox on internet blogs and noting what answers people give. (Let’s assume 10% change their mind in the real game.)

The received returns are still best for those who chose the closed box only, i.e.:

Both boxes : \$1,000 + 10% * \$1,000,000 = \$101,000
Closed only: 90% * \$1,000,000 = \$900,000

(We’ll assume that you’re solely concerned with maximising expected outcome, and unconcerned with sure money vs gamble etc.)
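That expected-value comparison can be checked with a few lines of Python (a sketch; the 90% hit rate is this commenter’s hypothetical figure, not part of the original puzzle):

```python
# Expected payoff under a predictor with a hypothetical 90% hit rate.
HIT_RATE = 0.9    # chance the predictor calls your choice correctly
MISS_RATE = 0.1   # chance the predictor gets you wrong
OPEN_BOX = 1_000
CLOSED_BOX = 1_000_000

# Take both boxes: you always keep the open box; the closed box is full
# only when the predictor wrongly pegged you as a closed-box chooser.
ev_both = OPEN_BOX + MISS_RATE * CLOSED_BOX

# Take the closed box only: it is full whenever the predictor got you right.
ev_closed = HIT_RATE * CLOSED_BOX

print(f"Both boxes:  ${ev_both:,.0f}")   # Both boxes:  $101,000
print(f"Closed only: ${ev_closed:,.0f}") # Closed only: $900,000
```

Under these numbers the closed-box-only choice comes out well ahead, matching the figures above.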

Making this decision criterion explicit makes the causality more clear. It’s not Choice -> Money in Box (requiring time travel), it’s: (Blog_Post -> Choice) AND (Blog_Post -> Money in Box). So you’re right – since my decision criterion was your blog post, you’re already screwed and ought to take both boxes. The “superior being” just complicates the decision criterion. Instead of just a blog post, it’s performed detailed evaluations of everything about you to reach a 99.99999% certainty.

The paradox comes from the fact that if you were the type of person who would take the closed box, you would also probably be the type of person who would give that answer in this blog post, and would thus come out ahead, despite the seemingly rational (given the static situation) decision to take both boxes. Causation isn’t happening backwards, but correlation is, which comes to the same effect.

83. The dude has done this 9999 times. Even if every one of those people only took the \$1,000, it means that by the time he got to you he had handed out nearly \$10 million. He is loaded. Kick the shit out of him, stuff him in the trunk, and drive over really bumpy roads till he tells you where the rest of the money is.

84. As MyNameIsTim points out, the real test here is psychological — namely whether you can accept the conditions of the challenge!

85. Looks like this thread is still rocking, but since a new AI is about to go up, I’ll just offer these words and leave you all to it.

Puzzle master Martin Gardner wrote about Newcomb’s Paradox in Scientific American, and of all the puzzles and problems he presented, this topic was the one that generated the most feedback from his readers.

Gardner reports that those who think about this paradox seem to divide into two “warring camps” (closed box only choosers and both box choosers), and that those in one camp tend to think that those in the other camp are just being silly.

I personally am in the “choose both boxes” camp, but I don’t believe that those who choose only the closed box are being silly. Some people say that they would take the closed box because they are willing to risk the thousand in the open box to get a million. After all, the thousand dollars wasn’t their money to begin with, so if the closed box were empty, they’d be no worse off. And if the predictor made a correct prediction, and there’s no indication that the predictor is fallible, they’d be a million dollars richer.

And I guess that’s a pretty good argument.

From my standpoint (choosing both boxes), the idea comes down to the notion of a prediction. The predictor does not wait for me to make a choice, and then decide what to do with the closed box. Rather, the predictor makes his decision first, and then I choose.

If the predictor put a million dollars in the closed box, then my selection of both boxes won’t make that million dollars disappear. And if the predictor left the closed box empty, then my selection of the closed box won’t put the million dollars into it.

At the moment I make my choice, the contents of the boxes are fixed. I may as well take whatever is in both boxes.

This problem touches on some larger concepts, and that’s really where my choice is determined.

Isaac Asimov said, “I would, without hesitation, take both boxes”, because he felt it was important to have at least a belief in free will.

Newcomb’s Paradox pits determinism against free will (I knew you guys would pick up on that). If a predictor such as the one in this problem can accurately predict a choice that you believe was made of your own free will, then arguably you don’t really have free will at all. Because if the predictor is sufficiently accurate (for this argument, 100% accurate), your choices effectively become irrelevant, the outcome has been pre-determined, and your free will nullified.

It’s interesting to note, however, that the free will issue falls outside the parameters of the puzzle. But it is brought to light in the accuracy of the predictor, who of course is a major player within the puzzle.

The accuracy of the predictor is the toehold solvers use to step beyond the ‘parameters’ of the problem and examine free will. To me this is the great allure of puzzles of this type.

We all know that by staying strictly within the parameters of the puzzle, there is no way to ever get the extra thousand dollars. But of course, this money is not real. If this were a real-life scenario with real money, and a person needed to feed his or her family, help his or her parents ease into retirement, or buy some new clothes or a big, fancy boat, there is probably not one among us who would not choose only the closed box. It would be foolish to do otherwise. However, contemplating the problem triggers so many other ideas that, though they may fall outside the parameters of the puzzle, they are well worth examining and discussing. Free will is one of them.

If there were such a predictor, if every action in life was predetermined, there would be no free will. And I think Asimov was so quick and staunch in his answer because he recognized the bitter taste of such a notion, even though it was only presented in a hypothetical scenario.

As for me, I happen to agree with Asimov.

It seems to me that this problem is an exercise that reveals much about the solvers more than it demands a concrete solution. We can deduce that the both-box choosers are either greedy, or that they refuse to be ‘beaten’ by the being and hold strongly to the concept of free will. They will always choose both boxes. No one is going to force them to admit they can’t ‘win’ the extra thousand dollars. As I said before, I happen to be in this camp. Not because I don’t see the logic behind always choosing only the closed box, but because I don’t want to sacrifice my free will, even in a hypothetical scenario. I’m so proud and hardheaded that I’ll gladly pass up a million fake dollars to prove it.

The other side consists of those who can more easily relinquish free will, at least in this situation. They recognize the futility in hanging onto it, given the stipulations of the problem. The being is never wrong, they say. I can do nothing to get the extra thousand dollars, and I refuse to look foolish by trying to trick an untrickable being. I’m giving in to the bastard and taking him for a million.

Remember, whichever way you choose, there is no wrong answer; just a thought-provoking problem.

86. @Sam Ogden: What are your feelings with a less effective predictor? It doesn’t seem like a violation of free will for someone predicting based on less reliable data (noting a correlation with answers given in this thread for example) – if this violated free will then free will is clearly violated every day. Would you still take both boxes when faced with someone who guessed 90% right (and you didn’t know how they were deciding)?

In that situation, I’d go for the closed box only. Without knowing the criteria being used to judge me, the only way I can maximise my chances is to act as much like a person who chooses the closed box as possible. And the only way to do that is to *be* the type of person who chooses the closed box when faced with Newcomb’s paradox. I don’t think this violates free will (assuming it exists). I could still decide to change my mind at the last minute; it’s just that doing so makes me less likely to be predicted as a closed box selector. Hence this constrains my free will only as much as a decision to act rationally in such situations does.

As the accuracy of the predictor increases, the only thing that changes is the likelihood of this strategy paying off, so if the closed box only is the winning choice when faced with a good guesser who’s got a 90% accurate record, it still should be for a 100% right guesser.

87. I’d ask what he predicted, and then follow through with it. It’s the best way to maximize my cash: if he predicts both and I take both, I make \$1,000; if he predicts both and I take only the closed box, I make zero; if he predicts the closed box and I take it, I get a million; if he predicts the closed box and I take both, I get \$1,001,000. So varying from what he predicted only gains me \$1,000 at most, but could potentially cost me \$1 million.
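The four cases in that comment can be tabulated directly. This is a sketch of the commenter’s variant, in which you hear the prediction before choosing (not the standard setup):

```python
def payoff(prediction: str, choice: str) -> int:
    """Dollars received, given what the being predicted and what you take.

    The closed box holds $1,000,000 only if the being predicted
    "closed"; taking both boxes always adds the visible $1,000.
    """
    closed_box = 1_000_000 if prediction == "closed" else 0
    open_box = 1_000 if choice == "both" else 0
    return closed_box + open_box

# Print all four (prediction, choice) combinations.
for pred in ("both", "closed"):
    for take in ("both", "closed"):
        print(f"predicted {pred:>6}, take {take:>6}: ${payoff(pred, take):>9,}")
```

The table confirms the comment’s arithmetic: deviating from the stated prediction gains at most the extra \$1,000.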

88. Worst possible outcome: no money
Best possible outcome: gain \$1,001,000

Going for the best possible outcome and avoiding the worst possible outcome factor out to the same choice – take both boxes.

89. From what I can tell, the solution to this problem comes down to trust and rationality:

1. If you believe the being is telling the truth and you are rational, he is at least 99.99% accurate in his predictions, meaning that statistically, you stand to gain \$999,900 by choosing only the closed box.

2. If you are trusting but irrationally believe that you can beat the odds, you take both.

3. If you are irrational and untrusting, you take neither, because you suspect the being is the devil.

4. If you are rational but untrusting, you take only the \$1000, because the other box may contain a bomb.

90. What paradox? The “superior being” is obviously lying, it’s a con, the \$1000 bill is a fraud and I should run as fast as I can to find an authority to deal with him.

JBS

91. Isaac Asimov said, “I would, without hesitation, take both boxes”, because he felt it was important to have at least a belief in free will.

But Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.” If this being has been capable of predicting the subject’s choice 9999 times in a row, then it’s reasonable to assume that he will do so with me, free will be damned. The 9999 subjects before me all had free will, and look where it got them.

For all I know, the being can manipulate spacetime and is actually placing the appropriate amount of money in the closed box retroactively, based on my decision. If that’s the case, all the free will in the universe won’t get you a million bucks if you choose both boxes.
