Suppose you get to choose between two envelopes, and all you know is that one contains $x and the other $2x. Once you choose and open the envelope, you are given the option to take whatever is in the other envelope instead.
Since it's a 50-50 proposition (whether the other envelope has more money in it or less), but you have more to gain than to lose (if the first envelope contained $2000, you'd either lose $1000 or gain $2000 by switching), you should definitely switch, right? On the other hand, this logic applies no matter how much money was in the first envelope, so you'd ALWAYS switch; so why not go straight to the second envelope? But "second envelope" means nothing if you have yet to choose, so it really makes no difference what you do! Can someone please explain this paradox?
2007-03-12 03:04:04 · 7 answers · asked by blighmaster 3 in Science & Mathematics ➔ Mathematics
The best way I can explain this is with a Bayesian approach, wherein you have a prior distribution on how much money the unseen envelope holds. Let X be the money in the unseen envelope and Y the money in the seen envelope.
Then we are interested in
P(X = Y/2 | X = Y/2 or X = 2Y)
= P(X = Y/2) / ( P(X = Y/2) + P(X = 2Y) )
This can equal 0.5 for every Y only if X has a constant uniform distribution over all positive real values. Such a distribution, however, integrates to infinity and therefore cannot be normalized (so it does not exist as a probability distribution!). Any well-behaved probability density over a continuous (real-valued) random variable must vanish as X → infinity, which is not the case here. Thus for any well-behaved prior, beyond some point,
P(X = Y/2 | Y) = p > 0.5
Therefore the expected return from switching is
= Y( p/2 + 2(1 − p) )
= Y( 2 − 3p/2 )
which is less than Y whenever p > 2/3, so for p > 2/3 there is no reward in switching envelopes.
Concretely, this means the following. If you find Y to be very small, say $1, you would probably think it more likely that X is $2 than $0.50, and make the switch. However, if Y = $1M you would expect it is more likely that X = $0.5M (possibly with p > 2/3) than $2M, and possibly not make the switch. The exact prior distribution is not important. I might consider it not worth the switch when Y = $1K, while you might not consider it worth the switch until Y = $1B. The important thing is that for any well-behaved, normalizable prior distribution, there will be some value of Y beyond which p is always > 2/3. And beyond this value, you won't make the switch.
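The argument above can be made concrete with a small numerical sketch. The Exponential(1) prior on the smaller amount below is my own illustrative assumption (the answer's point is that any normalizable prior behaves this way); the posterior p = P(X = Y/2 | Y) and the expected return Y(2 − 3p/2) are computed directly:

```python
import math

def posterior_p(y, lam=1.0):
    """p = P(the other envelope holds y/2 | we saw y), assuming the
    smaller amount x is drawn from an Exponential(lam) prior and the
    envelope pair is (x, 2x).  The prior is an illustrative assumption."""
    f = lambda t: lam * math.exp(-lam * t)
    # Case 1: we hold the smaller amount x = y; joint density 0.5 * f(y)
    held_smaller = 0.5 * f(y)
    # Case 2: we hold the larger amount 2x, so x = y/2;
    # joint density 0.5 * f(y/2) * 1/2 (the 1/2 is the Jacobian of y = 2x)
    held_larger = 0.5 * f(y / 2) * 0.5
    return held_larger / (held_smaller + held_larger)

def expected_return_of_switching(y, lam=1.0):
    """Y * (p/2 + 2*(1 - p)) = Y * (2 - 3p/2), as in the answer."""
    p = posterior_p(y, lam)
    return y * (p / 2 + 2 * (1 - p))

for y in (0.5, 2.0, 6.0):
    p = posterior_p(y)
    print(f"Y={y}: p={p:.3f}, switching pays: {expected_return_of_switching(y) > y}")
```

For small Y the posterior p stays below 2/3 and switching has positive expected value; for this particular prior p crosses 2/3 at Y = 2·ln 4 ≈ 2.77, and beyond that switching costs you, matching the answer's conclusion.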
2007-03-12 03:29:35 · answer #1 · answered by Vijay Krishnan 1 · 0⤊ 0⤋
Suppose you do this 100 times.
The envelopes contain $1000 or $2000 (you don't know this).
On average, the first envelope you open has $1500.
If it has $1000 and you switch, you get $2000.
If it has $2000 and you switch, you get $1000.
On average, the second envelope has $1500.
So, yes, it makes no difference what you do.
If the first envelope contains $2000, the true odds for switching are:
Get $1000: 100%
Get $4000: 0%
Average: $1000.
But ... you don't know the true odds. You may think they are 50-50. That's where the paradox comes in -- you are being fooled into thinking that they are 50-50.
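This scenario is easy to simulate. A minimal sketch, assuming the pair is fixed at $1000/$2000 as above and the first pick is random; always-switching and never-switching average out the same:

```python
import random

def average_take(switch, trials=100_000, amounts=(1000, 2000)):
    """Average winnings over many rounds with a fixed ($1000, $2000) pair.
    switch=True means we always take the other envelope."""
    total = 0
    for _ in range(trials):
        pair = list(amounts)
        random.shuffle(pair)           # random first pick
        first, other = pair
        total += other if switch else first
    return total / trials

print(average_take(switch=True))    # both hover around 1500
print(average_take(switch=False))
```

Both strategies converge to the $1500 average, which is exactly the "it makes no difference" claim.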
2007-03-12 03:20:22 · answer #2 · answered by morningfoxnorth 6 · 0⤊ 0⤋
Let us use $x for the low amount. Suppose you first picked the low-amount envelope; then the value of switching is +$x (note that only the incremental value matters in the decision to switch). On the other hand, if you first picked the high-amount envelope, the value of switching is −$x. Since there is a 0.5 probability of having picked either the low- or the high-amount envelope, the expected value of switching is 0.5·(+$x) + 0.5·(−$x) = 0. Therefore there is no extra value in switching and you should stick with your original choice.
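This zero-expected-gain bookkeeping can be checked by simulation. A minimal sketch; the uniform draw for the low amount is an arbitrary assumption, since the cancellation holds for any distribution:

```python
import random

def mean_switch_gain(trials=200_000):
    """Average incremental value of switching: +x when we held the
    low envelope (switching gains x), -x when we held the high one."""
    gain = 0.0
    for _ in range(trials):
        x = random.uniform(1, 100)           # the low amount (arbitrary prior)
        held_low = random.random() < 0.5     # which envelope we picked first
        gain += x if held_low else -x
    return gain / trials

print(mean_switch_gain())   # hovers around 0
```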
2007-03-12 03:35:08 · answer #3 · answered by fdelley 2 · 0⤊ 0⤋
You are exactly right: without knowing the values in the two envelopes, you will never know whether you picked the lower or higher of the two just by looking at the amount in the first envelope.
There is a way to try to reason about whether you got the larger or smaller value. If the value is, say, $200, do you think it's more reasonable that the values were $100/$200, or $200/$400? People usually tend to stick to nice round numbers, and $100 is a very nice round number. I myself would be inclined to stay with my first pick if I got a $200 envelope.
But it is always a calculated risk, one that seems to offer a larger possible gain than possible loss.
2007-03-12 03:13:36 · answer #4 · answered by Tom B 4 · 0⤊ 0⤋
I think that the problem is your idea that you always have more to gain than lose. If you were to play this game over and over, the average amount you would make if you switched every time would be the same as the average amount you would make if you didn't switch.
Essentially the idea is that if the game isn't over you don't really have the money, right?
2007-03-12 03:19:22 · answer #5 · answered by s_h_mc 4 · 0⤊ 0⤋
First envelope $x; expected money when you switch is 0.5·x + 0.5·(2x) = 1.5x, so yes, it is always better to switch.
Play the game a couple of times and you will see that indeed you will gain more money in the end.
life is weird
2007-03-12 03:23:36 · answer #6 · answered by gjmb1960 7 · 0⤊ 1⤋
This one is open to debate. More here:
http://en.wikipedia.org/wiki/Two_envelope_problem
2007-03-12 03:15:27 · answer #7 · answered by Anonymous · 0⤊ 0⤋