At first sight it looks as if you are being offered an easy chance of enriching yourself. The temptation is to take one box to get the million. For, if you take one box, won't the Predictor have anticipated your choice and put a million in that box?
The expectation from the alternative two-box choice is a mere £10,000.
But there is a powerful argument against this policy. The Predictor has already made his prediction and determined the contents of the box. Whatever you do now will not change that: you cannot change the past.
If you take both boxes you will get £10,000 more than if you take just the opaque box, and this is so whether it is empty or contains a million. If the opaque box is empty, one-boxers get nothing and two-boxers get £10,000. If it has money in it, one-boxers get £1,000,000 and two-boxers get £1,010,000. Either way two-boxers get more. The two-box choice is said to dominate the one-box choice.
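To make the dominance arithmetic explicit, here is a minimal sketch of my own (not part of the original argument), using only the payoffs stated above: £10,000 in the transparent box, and either £0 or £1,000,000 already fixed in the opaque box.

```python
# Illustrative sketch of the dominance argument: whatever the opaque box
# already holds, two-boxing pays exactly £10,000 more than one-boxing.

TRANSPARENT = 10_000

def payoff(choice, opaque_contents):
    """Total payout for a given choice, with the opaque box's contents fixed."""
    if choice == "one-box":
        return opaque_contents
    return opaque_contents + TRANSPARENT  # two-boxing adds the transparent box

for opaque in (0, 1_000_000):
    one = payoff("one-box", opaque)
    two = payoff("two-box", opaque)
    print(f"opaque box holds £{opaque:>9,}: one-box £{one:,}, two-box £{two:,}")
    assert two == one + TRANSPARENT  # dominance holds in either case
```

The point of the sketch is simply that the comparison never depends on which state the opaque box is in, which is what the dominance principle exploits.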
The predominant but by no means unanimous view among academics is that you should take both boxes, following the dominance principle. Suppose you do this; then it is not true that you would have been a millionaire if you had taken only the opaque box, for you would have acquired nothing. But the one-boxer will retort that if you had taken just one box you would be richer, since the Predictor would have predicted your choice and filled the box accordingly.
Suppose I say, "Lucky you didn't light a match in that room we were just in, because you would have caused an explosion: the room was full of gas."
You reply, "No, it's not lucky: if I had lit a match it wouldn't have been full of gas, because I'm a careful person."
I say, "But it is what I say, not what you say, that is relevant to deciding whether it is safe to light a match in a gas filled room, I've only just told you about the gas." If we rule out backwards causation, future effecting the past, then similarly we must regard the contents of the opaque box as already fixed. The only relevant, counterfactual sentence to the one or two box choice is the two- boxers " if I had taken only one box I should be £10,000 worse off."
But what if you knew the Predictor was infallible? You would know there were only two possible outcomes: you would either get the million in the opaque box or £10,000 from choosing both boxes. The possibilities of getting £1,010,000 and of getting nothing drop out, because these would be realized only if you falsified the prediction.
If you choose the opaque box, there is no point in regretting that you didn't choose both and get £10,000 more, because in that case the Predictor wouldn't have been infallible, contrary to what you know.
But if it is rational here to take just one box, why does it cease to be rational when it is merely highly probable that the Predictor is right?
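To see what the probability does to the one-boxer's reasoning, here is another illustrative sketch of my own (again an assumption, not part of the original text): it compares expected payoffs when the Predictor is right with probability p, conditioning on the prediction matching the choice, as a one-boxer would.

```python
# Expected payoffs if the Predictor is right with probability p,
# computed by conditioning on the prediction matching the choice.

MILLION, TRANSPARENT = 1_000_000, 10_000

def expected_one_box(p):
    # Predictor right (prob p): opaque box filled; wrong: opaque box empty.
    return p * MILLION + (1 - p) * 0

def expected_two_box(p):
    # Predictor right (prob p): opaque box empty; wrong: it holds the million.
    return p * TRANSPARENT + (1 - p) * (MILLION + TRANSPARENT)

for p in (1.0, 0.9, 0.6, 0.505, 0.5):
    print(f"p = {p:5.3f}: one-box £{expected_one_box(p):>12,.0f}, "
          f"two-box £{expected_two_box(p):>12,.0f}")

# On this way of reckoning, one-boxing has the higher expectation whenever
# p exceeds 0.505; the dominance argument, by contrast, ignores p entirely,
# since it treats the contents of the opaque box as already fixed.
```

The calculation does not settle the dispute; it only shows why the one-boxer thinks high probability is nearly as good as infallibility, while the two-boxer regards the conditioning itself as illegitimate.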
It is not to the point to object that we could never know that the Predictor was completely infallible. All we need to get this argument going is the claim that, if the Predictor were known to be infallible, two-boxing would not be rational.
Suppose that rationality nevertheless dictates the two-box option, and suppose that the transparent box contains just one euro. Then although I believe it is rational to take both boxes, I know there are distinguished one-boxers, which suggests I might have got it wrong. I can easily afford to lose a single euro on the off chance that I have got it wrong. That may seem a reasonable thing to do, but if I haven't got it wrong after all, does that make it any more rational?
Of course, as a two-boxer I could wish I had thought it more rational to be a one-boxer, in which case the Predictor would probably have put money in the opaque box. But I can't bring that about by changing my mind and taking just one box, because the contents of the box are already determined. I could, however, let one-boxers convince me, so that when I next faced the Predictor he would have anticipated my one-box choice and put money in the opaque box. I would then have a reason for one-boxing which I once thought bad but now think good; so I would have an irrational belief.
But is it irrational to get myself into that position? No: it can be quite rational to cause yourself to act irrationally. Suppose a burglar is threatening to torture me unless I open my safe, and I take a drug which makes me temporarily irrational. He starts to torture me and, even while complaining about how much it hurts, I encourage him to go on. He realises his threats can no longer influence me and that his best recourse is to flee. In the circumstances it was perfectly rational to make myself irrational, though rationality would still seem to dictate two-boxing.