Monty Hall Problem [solution discussion]
This is the thread for posting your solutions and arguing about them.
blistering guitar solo
Re: Monty Hall Problem [solution discussion]
Switching doors is the best strategy. You only have a 1/3 chance of your first choice being right, and with one wrong door eliminated the odds of the remaining one being the prize become 2/3. Therefore, switching doubles your chances of winning.
Let the arguments begin.
I burn the cheese. It does not burn me.
Nope. Your chances are increased to 2/3, not 1/2. Consider: the odds of your initial choice being correct are 1/3. This remains true even after Monty opens one of the wrong doors, because you didn't have that information at the time when you made your choice. The total probability of the prize being behind one of the doors must be unity, and your first choice still has odds of 1/3; therefore, the remaining door is correct with probability 1 - 1/3 = 2/3.
To illustrate, consider a more extreme example in which there are 1000 doors. You pick one and then Monty opens 998 of the wrong ones. Clearly, there is more than an even chance that the last remaining unopened door is actually the right one! You only had a 0.001 chance of picking the right one to begin with, and all the others were shown to be wrong.
I burn the cheese. It does not burn me.
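This is also easy to check empirically. Here is a quick Monte Carlo sketch (the function name and setup are mine, not anything from the show): it plays the game many times with a seeded RNG and reports the observed win rate for each strategy.

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Return the fraction of wins over `trials` simulated games."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)          # door hiding the car
        pick = rng.randrange(3)         # contestant's first choice
        # Monty opens a door that is neither the pick nor the car.
        opened = rng.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

With 100,000 trials the stay strategy lands near 1/3 and the switch strategy near 2/3, matching the argument above.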
 AntiScurvyLg
 Posts: 14
 Joined: Wed Sep 06, 2006 7:56 pm UTC
 Contact:
The odds change, but you will always be equally likely to win no matter what door you chose. Since you can't win immediately by picking the correct door first when there were 3 doors, it will ALWAYS come down to two doors left. When there are only two doors left you still have an equal chance of selecting the car as you do of deselecting it.
The 1000 door example won't work, because in that case you are leaving only the same number of doors open but the percentage is completely different.
Damn you, you made me think instead of just guessing.
X=Car N=Nothing R=Revealed door S=Selected Door C=Choice
Sticking with first choice
XNN XNN XNN XNN NXN NXN NXN NXN NNX NNX NNX NNX
SRC SCR CSR CRS SCR RSC CSR RCS SRC RSC RCS CRS
WIN WIN LOS LOS LOS WIN WIN LOS LOS LOS WIN WIN
I tried and don't feel like creating a picture; the alignment there is close enough for anyone who actually wants to look at it that closely. 6 win and 6 loss possibilities.
Changing doors: just switch the wins with the losses. If he were allowed to reveal the door with the car, you would lose straight away, so it wouldn't matter if you stayed with the original door or chose the alternate, because they would both be empty of course.
No, I am not illustrating the 1000 door example. Just as, if there were only two doors and he had to reveal one that you didn't pick AND that didn't have the car behind it, only then would you have a guaranteed 100%, but the game would be ruined and unplayable anyway.
The biggest effect that this would have is an increase in the amount of time an individual contestant is on stage, thus reducing the number of prizes that would be given out, and it also makes for some nice dramatic tension when accompanied by the right narration/music.
I used the word thus more than once so I have to be right.
Nonconformists unite for paradox.

 Posts: 286
 Joined: Tue Aug 22, 2006 10:35 pm UTC
 Contact:
Marrow wrote:X=Car N=Nothing R=Revealed door S=Selected Door C=Choice
Sticking with first choice
XNN XNN XNN XNN NXN NXN NXN NXN NNX NNX NNX NNX
SRC SCR CSR CRS SCR RSC CSR RCS SRC RSC RCS CRS
WIN WIN LOS LOS LOS WIN WIN LOS LOS LOS WIN WIN
I tried and don't feel like creating a picture; the alignment there is close enough for anyone who actually wants to look at it that closely. 6 win and 6 loss possibilities.
A common mistake. You have enumerated the possible outcomes for sticking with your first choice, but have also assumed that each outcome is equally likely. Not so! Why?
There is no problem with the odds that the situation is XNN rather than NXN or NNX: each has a 1/3 chance of being the actual arrangement, and your diagram implies as much. So let's concentrate on the other aspects, assuming the situation is XNN (generalize to NXN and NNX when we're done).
XNN XNN XNN XNN
SRC SCR CSR CRS
WIN WIN LOS LOS
Notice in the
XNN
S??
WIN
scenario, Monty has a choice. He can either choose SRC or SCR and it makes no difference whatsoever which he chooses. But for each time
XNN
S??
WIN
occurs, he can choose *only one* of those two options, not both. Your diagram assumes that
XNN
S??
WIN
occurs twice as frequently as the other two, so it's no wonder you win more often than in reality. To be correct, you should rework your diagram to read
XNN XNN XNN NXN NXN NXN NNX NNX NNX
S?? CSR CRS SCR ?S? RCS SRC RSC ??S
WIN LOS LOS LOS WIN LOS LOS LOS WIN
or, if you like,
Code: Select all
XNN XNN XNN XNN NXN NXN NXN NXN NNX NNX NNX NNX
SRC SCR CSR CRS SCR RSC CSR RCS SRC RSC RCS CRS
WIN WIN LOS LOS LOS WIN WIN LOS LOS LOS WIN WIN
1/18 1/18 1/9 1/9 1/9 1/18 1/18 1/9 1/9 1/9 1/18 1/18
to give the appropriate odds of each scenario occurring.
GENERATION 1i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.
Marrow wrote:The odds change, but you will always be equally likely to win no matter what door you chose.
Incorrect. I hope you can see how that statement contradicts itself.
Marrow wrote:X=Car N=Nothing R=Revealed door S=Selected Door C=Choice
Sticking with first choice
XNN XNN XNN XNN NXN NXN NXN NXN NNX NNX NNX NNX
SRC SCR CSR CRS SCR RSC CSR RCS SRC RSC RCS CRS
WIN WIN LOS LOS LOS WIN WIN LOS LOS LOS WIN WIN
You're forgetting about symmetry. Some of those cases are identical and represented more times than they should be. Let's illustrate the situation more clearly with a decision tree:
Code: Select all
first choice (prob.)    revealed (prob.)    total prob.   other door   stay   switch
====================    ================    ===========   ==========   ====   ======
empty #1 (1/3)     >    empty #2 (1)        1/3           car          lose   WIN
empty #2 (1/3)     >    empty #1 (1)        1/3           car          lose   WIN
                  />    empty #1 (1/2)      1/6           empty #2     WIN    lose
car (1/3)        +
                  \>    empty #2 (1/2)      1/6           empty #1     WIN    lose
Examine the total probability for each row. You will see that the WIN rows have a combined probability of 2/3 when the player switches doors, only 1/3 when he stays.
I burn the cheese. It does not burn me.
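The same tree can also be enumerated exactly rather than drawn. A small sketch (the function name is mine), assuming as above that the contestant picks door 0 and Monty chooses uniformly at random when both unpicked doors are empty:

```python
from fractions import Fraction

def win_probability(switch):
    """Exact win probability, enumerating the decision tree with fractions."""
    total = Fraction(0)
    for car in range(3):                 # car placement: 1/3 each
        p_car = Fraction(1, 3)
        # Monty may open any unpicked, empty door.
        options = [d for d in range(3) if d != 0 and d != car]
        for opened in options:
            p = p_car / len(options)     # 1/6 per branch when car == 0, else 1/3
            if switch:
                # Move to the one door that is neither picked nor opened.
                final = next(d for d in range(3) if d not in (0, opened))
            else:
                final = 0
            total += p * (final == car)
    return total
```

`win_probability(False)` comes out to exactly 1/3 and `win_probability(True)` to exactly 2/3, the same totals as the tree's WIN rows.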
My way of thinking...
We went over this in my first probability class way back when, and the teacher had this rather complicated diagram that explained it pretty well... Sadly, I really like bonfires so I don't have my notes.
Anyway, the easiest way to think about it, I've found, is like this:
As it opens, you are invited to play a game in which you have a 1 in 3 chance of winning. You pick one of the three doors, and there is a bit more than a 33% chance that it is the one with the car.
Then, Monty Hall opens a door and reveals a goat.
You are now given an option: you may either play a new game, in which you have a 50% chance of winning, or you may stick with the previous game, in which you had a 33% chance of winning. Looking at it like this, the right answer is clear: you choose to not switch, because goats make excellent raptor distractions.
Complicated diagrams seem unnecessary.
Odds of guessing correctly on first guess: 1 in 3.
If you guess incorrectly and switch: you are guaranteed to win.
If you guess correctly and switch: you are guaranteed to lose.
So, if you always switch:
1/3 of the time: you guess right, switch, and lose.
2/3 of the time: you guess wrong, switch, and win.
blistering guitar solo
I meant the odds change from the first guess to the second guess, as in if you only had one guess total and there was no change for the second round. I just wrote it improperly.
I didn't include any results where Monty would open the door with the car behind it. That eliminates 1/3 of the possible outcomes, which is why there were only 4 choices for XNN, NXN and NNX. If you chose the first door and it was behind the second door, he can't reveal the second door, only the third.
You CAN'T win on your first guess, and Monty CAN'T reveal the door that the car is behind.
XNN XNN NXN NXN NNX NNX
RSC RCS SRC CRS CSR SCR
All of those would have him reveal the car, ending the game due to an error. Even though those can't be revealed, they still enter into the odds of winning or losing, because he isn't making a choice in that situation; he is forced to choose the particular door.
Though I suppose the fact that he can't make that choice does indeed mean that he is twice as likely to choose the one with the car behind it when you do guess incorrectly. Therefore eliminating the "error" choices is actually what improves the odds of winning by playing to lose with your initial choice.
One of those sorts of things where you have to look at what isn't there instead of what is. I approve of the answer of switching for the win. Not that it matters, because it's a fact.
Nonconformists unite for paradox.
The logic for this becomes somewhat more apparent if you increase the number of doors.
Say we have 10 doors, behind 1 is the car, the other 9 have nothing.
You pick one, and the host reveals 8 of the empty doors, leaving the one you picked and one other.
You can see that the problem is basically the same, but it's much clearer now that you should switch. Initially you only had a 1/10 chance of hitting the car, so you're much more likely to have picked an empty door.
Assuming the most likely outcome, that you picked an empty door, then the other remaining door must be the car, so you should switch!
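Here is a simulation sketch of that generalized game (the function name is mine). Since Monty opens every wrong, unpicked door but one, a switcher wins exactly when the first pick was wrong, which the code exploits directly:

```python
import random

def n_door_game(n, switch, trials=100_000, seed=1):
    """Win rate with n doors; Monty opens n-2 empty, unpicked doors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(n)
        pick = rng.randrange(n)
        if switch:
            # Only one unpicked door stays closed: it is the car whenever
            # the first pick was wrong, and some empty door otherwise.
            pick = car if pick != car else -1   # -1 marks a guaranteed loss
        wins += (pick == car)
    return wins / trials
```

With n = 10 the stay strategy wins about 10% of the time and the switch strategy about 90%, i.e. 1/n versus (n-1)/n; n = 3 reproduces the original 1/3 versus 2/3.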
Re: My way of thinking...
Tacroy wrote:You are now given an option: you may either play a new game, in which you have a 50% chance of winning, or you may stick with the previous game, in which you had a 33% chance of winning.
No. In the new game (i.e. if you switch) you have a 67% chance of winning, because you have more information about the situation than when you started.
I burn the cheese. It does not burn me.

Apparently the Monty Hall problem, or "Principle of Restricted Choice", is used by bridge (and maybe other card games?) players all the time. The canonical example follows:
You have AT764 trumps and your partner has K982. You lead A, your left hand opponent plays 3, you play your partner's 2, and your right hand opponent plays Q. Next you lead 4 and LHO plays 5.
The question: should you play your partner's K to guarantee winning, or should you play your partner's 9 to win cheaply (finesse) if RHO does not have the J but lose if he does?
Your first guess might be "well, even odds it's with LHO or RHO, so I might as well just play K to guarantee the win". But this isn't quite right. Suppose RHO had both QJ to start. Then his first play was probably picking one of them randomly. We know he had the Q. We say he had the J with probability 1/2. But then he played the Q, which if he had just Q would have probability 1, but if he had QJ would have probability 1/2. So the updated probability he had QJ is only (1/2)/(1+1/2)=1/3. You actually have a 2/3 chance of winning cheaply if you play the 9.
It's actually slightly different than 2/3 due to hand sizes and such, but you get the point. Now I have to go through card games I actually play to see if this applies anywhere.
GENERATION 1i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.
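The restricted-choice arithmetic above is easy to sanity-check with a small simulation of the post's toy model (the equal 1/2 priors and the random choice between touching honours are the post's simplifying assumptions; as noted, real bridge odds differ slightly):

```python
import random

def restricted_choice(trials=100_000, seed=2):
    """Estimate P(RHO also holds the J | RHO played the Q) under the toy
    model: RHO holds QJ with prior 1/2, or just the Q with prior 1/2,
    and with QJ he plays either honour at random."""
    rng = random.Random(seed)
    played_q = both = 0
    for _ in range(trials):
        rho_has_both = rng.random() < 0.5
        # With QJ he plays the Q half the time; with the bare Q, always.
        if not rho_has_both or rng.random() < 0.5:
            played_q += 1
            both += rho_has_both
    return both / played_q
```

The estimate lands near 1/3, matching the (1/2)/(1+1/2) update in the post, so the finesse wins roughly 2/3 of the time.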

 Posts: 41
 Joined: Mon Aug 14, 2006 7:10 pm UTC
 Contact:
Re: My way of thinking...
Tacroy wrote:You are now given an option: you may either play a new game, in which you have a 50% chance of winning, or you may stick with the previous game, in which you had a 33% chance of winning.
That only adds up to 83%, though. The car has to be behind one of them, so since there's a 33% chance if you stick, there will be a 67% chance if you switch.
(Saying it's 50/50 also adds up correctly to 100%, but then you get the odd conclusion that your initial random choice of 1 out of 3 doors has a 50% chance of being right.)
No sale, honcho!

 Posts: 4
 Joined: Wed Sep 13, 2006 6:36 pm UTC
 Location: Penn State University
Re: My way of thinking...
Factitious wrote:(Saying it's 50/50 also adds up correctly to 100%, but then you get the odd conclusion that your initial random choice of 1 out of 3 doors has a 50% chance of being right.)
Aha! That's the problem! That's how to say it! Thank you!
Your initial choice actually *is* 1 in 2, because one of the wrong ones is going to be revealed, regardless of the original selection! That's the whole issue. I've tried to explain that to people so many times, and always wound up talking myself in circles, but that's it. That's the clear and concise way of explaining it: There is no actual advantage to switching because you have a 50% chance with each door from the getgo (which remains throughout), even though you have three choices.
Re: My way of thinking...
friartucksduck wrote:Your initial choice actually *is* 1 in 2, because one of the wrong ones is going to be revealed, regardless of the original selection! That's the whole issue. I've tried to explain that to people so many times, and always wound up talking myself in circles, but that's it. That's the clear and concise way of explaining it: There is no actual advantage to switching because you have a 50% chance with each door from the getgo (which remains throughout), even though you have three choices.
It may be concise, but it's incorrect. You have no way of knowing which of the three will be opened, so the fact gives you no additional information. Your initial odds are 1/3. Switching after a wrong door is opened doubles those chances.
Ephphatha wrote:The odds change each round though.
Round one it's 33.3% to 66.6%, Round two it's 50/50
No.
I burn the cheese. It does not burn me.

Re: My way of thinking...
friartucksduck wrote:Aha! That's the problem! That's how to say it! Thank you!
Your initial choice actually *is* 1 in 2, because one of the wrong ones is going to be revealed, regardless of the original selection! That's the whole issue. I've tried to explain that to people so many times, and always wound up talking myself in circles, but that's it. That's the clear and concise way of explaining it: There is no actual advantage to switching because you have a 50% chance with each door from the getgo (which remains throughout), even though you have three choices.
That's what I love about this thing. It really shows how good humans are at rationalizing incorrect beliefs.
Here's yet another attempt to explain why the probability of winning if you switch is 2/3. Every step follows either from the problem statement or from previous steps:
Suppose you have decided to pick door 1 at first (you could have picked any door, let's just say 1). Let D1 represent the car being behind door 1, D2 behind 2, and D3 behind 3.
P(D1)=1/3, clearly, and just as clearly P(D2)=P(D3)=1/3. And P(D2_or_D3)=2/3.
Which door does Monty open? M1 is the first, M2 the second, M3 the third.
P(M1)=0 because you picked door 1. He picks one of the two that you didn't.
P(M2|D2)=P(M3|D3)=0 since he doesn't open a door with a car behind it.
P(M2|D3)=1 since it's the only possibility left. Similarly P(M3|D2)=1.
P(M2|D1)=P(M3|D1)=1/2 since he has two choices of empty doors. Assume he chooses randomly between them.
Okay, then, with what probability do you win if you switch? Either you switch to door 2 and win with P(D2|M3) or you switch to door 3 and win with P(D3|M2).
P(D2|M3)=P(M3|D2)*P(D2)/P(M3) = 1*(1/3)/[P(M3|D1)*P(D1)+P(M3|D2)*P(D2)+P(M3|D3)*P(D3)] = 1*(1/3)/[(1/2)*(1/3)+1*(1/3)+0*(1/3)] = (1/3)/(1/2) = 2/3.
P(D3|M2) is similar, and is also 2/3.
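For anyone who wants to check that arithmetic, here is the same Bayes computation in exact fractions (the variable names are mine):

```python
from fractions import Fraction as F

# Priors: the car is equally likely behind each door; you picked door 1.
p_d = {1: F(1, 3), 2: F(1, 3), 3: F(1, 3)}
# Likelihoods P(M3 | Dk): the probability Monty opens door 3.
p_m3_given = {1: F(1, 2),   # car behind your pick: he opens 2 or 3 at random
              2: F(1, 1),   # car behind door 2: door 3 is his only option
              3: F(0, 1)}   # he never reveals the car
# Total probability that Monty opens door 3.
p_m3 = sum(p_m3_given[k] * p_d[k] for k in (1, 2, 3))
# Bayes' theorem: probability the car is behind door 2 given he opened 3.
p_d2_given_m3 = p_m3_given[2] * p_d[2] / p_m3
```

This gives p_m3 = 1/2 and p_d2_given_m3 = 2/3 exactly, as derived above.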
GENERATION 1i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.

Re: My way of thinking...
GreedyAlgorithm wrote:That's what I love about this thing. It really shows how good humans are at rationalizing incorrect beliefs.
Here's yet another attempt to explain why the probability of winning if you switch is 2/3. Every step follows either from the problem statement or from previous steps:
Suppose you have decided to pick door 1 at first (you could have picked any door, let's just say 1). Let D1 represent the car being behind door 1, D2 behind 2, and D3 behind 3.
P(D1)=1/3, clearly, and just as clearly P(D2)=P(D3)=1/3. And P(D2_or_D3)=2/3.
Which door does Monty open? M1 is the first, M2 the second, M3 the third.
P(M1)=0 because you picked door 1. He picks one of the two that you didn't.
P(M2|D2)=P(M3|D3)=0 since he doesn't open a door with a car behind it.
P(M2|D3)=1 since it's the only possibility left. Similarly P(M3|D2)=1.
P(M2|D1)=P(M3|D1)=1/2 since he has two choices of empty doors. Assume he chooses randomly between them.
Okay, then, with what probability do you win if you switch? Either you switch to door 2 and win with P(D2|M3) or you switch to door 3 and win with P(D3|M2).
P(D2|M3)=P(M3|D2)*P(D2)/P(M3) = 1*(1/3)/[P(M3|D1)*P(D1)+P(M3|D2)*P(D2)+P(M3|D3)*P(D3)] = 1*(1/3)/[(1/2)*(1/3)+1*(1/3)+0*(1/3)] = (1/3)/(1/2) = 2/3.
P(D3|M2) is similar, and is also 2/3.
Well explained, and totally correct, but there is one flaw that you present, one which I failed to address before. It's not a flaw in your math, but a minor flaw in your method of evaluation. The question asked whether one person is more likely, less likely, or equally likely to win by switching. What you calculated is the ratio of winning people who switched to total people who switched (2:3), and the ratio of winning people who did not switch to total people who did not switch (1:2). It would appear at face value to any mathematician that these figures answer the question, but I'm afraid that in this case, common sense wins the day. Observe:
The scenario with the 2/3 probability in favor of switching comes up when you consider that there are three second round scenarios:
*If you chose wrong door A, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win.
*If you chose wrong door B, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win.
This means that overall, more people who choose to switch will win the car in the second round because it's twice as likely that they chose the wrong door the first time. No problem.
However, there are only two possible scenarios in your personal second round:
*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win.
This means that there is a 50% probability of winning the car in any individual final round no matter whether you choose to switch.
To summarize, a survey of 50 people who switched and 50 who did not switch should show results of about 33 winners who switched and 25 winners who did not switch, but this has no bearing whatsoever on any single instance, and you are, if ever in this scenario, facing a 50% probability on both sides despite the misleading poll results. Your math was correct, but your logic was flawed.
edit: See xkcd's posts for a similar explanation of the same issue in the Sleeping Beauty Puzzle. http://forums.xkcd.com/viewtopic.php?p=1671&highlight=#1671
second edit: See also the first two posts in The Gambling Tyrant. http://forums.xkcd.com/viewtopic.php?t=81
Last edited by friartucksduck on Thu Sep 14, 2006 1:34 am UTC, edited 3 times in total.
The first round you have a 33% chance of guessing the car, since the car is never removed from the second round. The odds of winning or losing are always going to be x/3.
A 1:2 ratio represents 1/3 and 2/3; of course the total of choices made is 3: one happens once, the other happens twice.
I don't know why I am saying these things, I have already proven it to myself and I am not responding to anyone in particular.
Thanks for bringing up ratios; I bet that's why a lot of people think it's 1/2, besides just saying it has to be since there is one car and two doors.
Nonconformists unite for paradox.
Re: My way of thinking...
friartucksduck wrote:Well explained, and totally correct, but there is one flaw that you present, one which I failed to address before. It's not a flaw in your math, but a minor flaw in your method of evaluation. The question asked whether one person is more likely, less likely, or equally likely to win by switching. What you calculated is the ratio of winning people who switched to total people who switched (2:3), and the ratio of winning people who did not switch to total people who did not switch (1:2).
2/3 + 1/2 != 1. The latter ratio is 1:3, not 1:2.
friartucksduck wrote:However, there are only two possible scenarios in your personal second round:
*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win.
This means that there is a 50% probability of winning the car in any individual final round no matter whether you choose to switch.
Not so. The first of those two scenarios is twice as likely to be the case as the other. Switching is the optimal strategy.
I burn the cheese. It does not burn me.

Marrow wrote:The first round you have a 33% chance of guessing the car, since the car is never removed from the second round. The odds of winning or losing are always going to be x/3.
A 1:2 ratio represents 1/3 and 2/3; of course the total of choices made is 3: one happens once, the other happens twice.
I don't know why I am saying these things, I have already proven it to myself and I am not responding to anyone in particular.
Thanks for bringing up ratios; I bet that's why a lot of people think it's 1/2, besides just saying it has to be since there is one car and two doors.
Well put. Nonetheless...
The odds of winning or losing on the first round alone are, and always will be 1/3. The odds of winning or losing on the second round, with exactly two choices, are 1/2.
A ratio can be used, as you implied, to compare one sample of a population to another, but it can also be used to compare a single sample to the population. (e.g. 1:2 ratio of winners to losers represents 1/3 winners; however, a 1:2 ratio of winners to total contestants represents 1/2 winners.)
Percentages and fractions are ratios, and though it may confuse some, the two can be used interchangeably such that 1/2 = 50% = a 1:2 ratio of part to whole.
Chariot wrote:Another way of looking at it is that if you have a computer randomly selecting stay or switch for you, the odds in that situation will in fact be 50%. Since the first door only had a 33% chance of being correct, and the two choices must average to 50%, switching must be 67% (rounded).
Exactly right. This shows (with a reasonable margin of error) that in 2/3 of all cases, the switch will yield a victory. Nonetheless, in any individual case, a switch has a 50% chance of victory.

Re: My way of thinking...
friartucksduck wrote:However, there are only two possible scenarios in your personal second round:
*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win.
This means that there is a 50% probability of winning the car in any individual final round no matter whether you choose to switch.
To summarize, a survey of 50 people who switched and 50 who did not switch should show results of about 33 winners who switched and 25 winners who did not switch, but this has no bearing whatsoever on any single instance, and you are, if ever in this scenario, facing a 50% probability on both sides despite the misleading poll results. Your math was correct, but your logic was flawed.
*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 2/3.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 1/3.
Don't think that just because you have two possibilities that they are equally likely. Consider the following example: I roll a d6. You get to choose either 1-2 or 3-6. If the number rolled is within the range you chose, you win! Here it is blindingly obvious that two scenarios is not the same as 50/50.
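That die example takes only a few lines to simulate (the function name is mine): bet on the 1-2 range or the 3-6 range and count wins.

```python
import random

def range_bet(low, trials=60_000, seed=3):
    """Win rate betting that a fair d6 lands in 1-2 (low) or 3-6 (high)."""
    rng = random.Random(seed)
    target = {1, 2} if low else {3, 4, 5, 6}
    return sum(rng.randint(1, 6) in target for _ in range(trials)) / trials
```

The low bet wins about a third of the time and the high bet about two thirds: two scenarios, nowhere near 50/50, just like sticking versus switching.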
friartucksduck wrote:edit: See xkcd's posts for a similar explanation of the same issue in the Sleeping Beauty Puzzle. http://forums.xkcd.com/viewtopic.php?p=2440&highlight=#2440
xkcd is exactly correct. But let's see what he actually says...
xkcd wrote:The tricky thing here is that probability is defined as "if this is run X times in X parallel universes, in what fraction will it come out with this result?" Since we don't actually have parallel universes, we often use the shortcut of "in X repeated trials." But when the event is tested in multiple trials that are not completely independent, (number of trials with this result) / (total trials) is NO LONGER a measure of the result's probability.
xkcd says that "in X repeated trials" is how we determine probability unless those trials are not independent. Are you saying that somehow if you played Monty Hall (with the same randomization assumptions) several times the results of the final round would change based on the results of the initial rounds? I hope not, because then you're definitely dealing with a different problem than all of us!
friartucksduck wrote:second edit: See also the first two posts in The Gambling Tyrant. http://forums.xkcd.com/viewtopic.php?t=81
Again, probabilities different from ratios of repeated trials only happen because the repeated trials are not independent here. But in the Monty Hall scenario, they are independent.
GENERATION 1i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.
friartucksduck wrote:Percentages and fractions are ratios, and though it may confuse some, the two can be used interchangeably such that 1/2 = 50% = a 1:2 ratio of part to whole.
I hope you meant 1/2=50%=0.5=1:1 because 1/2 isn't = to 1:2
Also, as a logic problem, there would be no problem at all if round one didn't affect the solution. The problem isn't which door the car is behind; that's the only thing that is 50% at this point. Saying that means the first round would have absolutely no bearing on your odds.
Nonconformists unite for paradox.

 Posts: 4
 Joined: Wed Sep 13, 2006 6:36 pm UTC
 Location: Penn State University
Marrow wrote:I hope you meant 1/2=50%=0.5=1:1 because 1/2 isn't = to 1:2
Again, a ratio can be used to describe either part-to-part, in which two samples of a population are compared to each other, or part-to-whole, in which one sample is compared to the entire population, including itself. 1:2 as a part-to-part is = to 1/3. 1:2 as a part-to-whole is = to 1/2. I specified part to whole.
Marrow wrote:Also, as a logic problem, there would be no problem at all if round one didn't affect the solution. The problem isn't which door the car is behind; that's the only thing that is 50% at this point. Saying that means the first round would have absolutely no bearing on your odds.
Which is exactly what I'm saying. The question was,
posiduck wrote:Does switching alter your chances of winning, and if so, which is better, and why?
and the answer is that it does not. The "logic puzzle" comes from seeing through the counterintuitive probability calculations listed in great detail above to see that the first round really does have no effect on the second whatsoever.
GreedyAlgorithm wrote:*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 2/3.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 1/3.
That's right, and I made note of it earlier in the post you quoted. But the correct choice for any given trial is unaffected by the probability of actually being in one scenario or the other, so it's still not actually better to choose to switch (as per the original question quoted above). That is, those who switch will win more often than those who don't over repeated trials, but since there is only one instance of this choice for any one of them, the choice doesn't affect any particular one's likelihood of winning.
Two coins are heads-up, one tails-up. A head is removed and you have to pick a coin at random. You have a fifty percent chance of picking heads, whether or not you had your heart set on picking one of those two before the third was removed. It's exactly the same in the Monty Hall Problem: your first choice did not actually affect the result any more than your desire to pick a particular coin.
GreedyAlgorithm wrote:Don't think that just because you have two possibilities that they are equally likely. Consider the following example: I roll a d6. You get to choose either 1-2 or 3-6. If the number rolled is within the range you chose, you win! Here it is blindingly obvious that two scenarios is not the same as 50/50.
That's right, but it's not a suitable comparison. In the scenario I postulated, the two scenarios mean a 50/50 shot because there are two possible instances sharing a single variable, each with two possible results, each the inverse of the other.
GreedyAlgorithm wrote:xkcd says that "in X repeated trials" is how we determine probability unless those trials are not independent. Are you saying that somehow if you played Monty Hall (with the same randomization assumptions) several times the results of the final round would change based on the results of the initial rounds? I hope not, because then you're definitely dealing with a different problem than all of us!
Bad form though it is, for the sake of formatting, I am going to respond to this in three parts:
1. xkcd actually only says that "in X repeated trials" fails if the trials are dependent, not that it is always to be used otherwise. Moreover, the trials he refers to are parallel only to each contestant, not to each guess.
2. Terribly sorry. That was the wrong link. I will edit it in the original. I meant the post higher on the page, http://forums.xkcd.com/viewtopic.php?p=1671&highlight=#1671, which reflects exactly the same scenario, though worded differently, and points out the same two possible solutions debated on this topic and demonstrates that given the phrasing of that original question, 2/3 is the better answer. Here, it is 1/2 because of the original phrasing.
3. You say here that the initial round trials are completely independent from the final round trials. This means that the only round affecting the final result is the second round, in which two options are available, each of which has an equal chance, in a single scenario, of being correct, thus demonstrating my point.
GreedyAlgorithm wrote:Again, probabilities different from ratios of repeated trials only happen because the repeated trials are not independent here. But in the Monty Hall scenario, they are independent.
See 1. above.

 Posts: 41
 Joined: Mon Aug 14, 2006 7:10 pm UTC
 Contact:
Re: My way of thinking...
friartucksduck wrote:Factitious wrote:(Saying it's 50/50 also adds up correctly to 100%, but then you get the odd conclusion that your initial random choice of 1 out of 3 doors has a 50% chance of being right.)
Aha! That's the problem! That's how to say it! Thank you!
Your initial choice actually *is* 1 in 2, because one of the wrong ones is going to be revealed, regardless of the original selection! That's the whole issue. I've tried to explain that to people so many times, and always wound up talking myself in circles, but that's it. That's the clear and concise way of explaining it: there is no actual advantage to switching because you have a 50% chance with each door from the get-go (which remains throughout), even though you have three choices.
What you're saying here is that if there are three doors, one of which has a prize, and you pick one at random, there is a 50% chance of picking the one with the prize. Is that what you meant to say?
No sale, honcho!
friartucksduck wrote:Which is exactly what I'm saying. The question was,
posiduck wrote:Does switching alter your chances of winning, and if so, which is better, and why?
and the answer is that it does not. The "logic puzzle" comes from seeing through the counterintuitive probability calculations listed in great detail above to see that the first round really does have no effect on the second whatsoever.
Incorrect. The solution to this problem has been worked out in great detail, by a great many people using several techniques. The right answer is that switching increases one's odds from 1/3 to 2/3. Your analysis is flawed.
friartucksduck wrote:GreedyAlgorithm wrote:*If you chose either of the wrong doors, you will be presented in the end with a right door and a wrong door, and if you switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 2/3.
*If you chose the right door, you will be presented in the end with a right door, and if you don't switch, you'll win. The probability that you are trying to decide whether to switch and that you are in this scenario is 1/3.
That's right, and I made note of it earlier in the post you quoted. But the correct choice for any given trial is unaffected by the probability of actually being in one scenario or the other, so it's still not actually better to choose to switch (as per the original question quoted above). That is, those who switch will win more often than those who don't over repeated trials, but since there is only one instance of this choice for any one of them, the choice doesn't affect any particular one's likelihood of winning.
Can you see how logically inconsistent this statement is? You're saying that individual trials are 50/50, but repeated trials are not. How could that be, when their outcomes are independent?
friartucksduck wrote:Two coins are heads-up, one tails-up. A head is removed and you have to pick a coin at random. You have a fifty percent chance of picking heads, whether or not you had your heart set on picking one of those two before the third was removed. It's exactly the same in the Monty Hall Problem: your first choice did not actually affect the result any more than your desire to pick a particular coin.
No, it's very different. In the Monty Hall problem, you pick the coin before a head is removed. If you picked a head, it forces Monty to take the other one out of circulation instead. That changes the playing field considerably, because you know that in such a case the other remaining coin must be a tail. If you pick after a coin is removed, you don't have that information and must go with a pure 50/50 chance.
I burn the cheese. It does not burn me.
posiduck wrote:Odds of guessing correctly on first guess: 1 in 3.
If you guess incorrectly and switch: you are guaranteed to win.
If you guess correctly and switch: you are guaranteed to lose.
So, if you always switch:
1/3 of the time: you guess right, switch, and lose.
2/3 of the time: you guess wrong, switch, and win.
Can someone who disagrees with the 2/3 solution tell me which part of my reasoning they find faulty?
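posiduck's case analysis above can be checked by brute force. Here is a short Python sketch (my own, not from the thread) that enumerates all nine equally likely (car location, first pick) pairs:

```python
from itertools import product

doors = [0, 1, 2]
switch_wins = stay_wins = total = 0

for car, first_pick in product(doors, doors):
    # Monty opens a door that is neither the player's pick nor the car.
    # (When the pick IS the car he has two choices, but either way the
    # switcher lands on a goat, so one representative case suffices.)
    monty = next(d for d in doors if d != first_pick and d != car)
    switched = next(d for d in doors if d != first_pick and d != monty)
    total += 1
    stay_wins += (first_pick == car)
    switch_wins += (switched == car)

print(switch_wins, total)  # 6 9 -> switching wins in 6 of 9 cases (2/3)
print(stay_wins, total)    # 3 9 -> staying wins in 3 of 9 cases (1/3)
```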
blistering guitar solo

 Posts: 286
 Joined: Tue Aug 22, 2006 10:35 pm UTC
 Contact:
friartucksduck wrote:That's right, and I made note of it earlier in the post you quoted. But the correct choice for any given trial is unaffected by the probability of actually being in one scenario or the other, so it's still not actually better to choose to switch (as per the original question quoted above). That is, those who switch will win more often than those who don't over repeated trials, but since there is only one instance of this choice for any one of them, the choice doesn't affect any particular one's likelihood of winning.
Yes. Yes it does. That is pretty much the definition of probability. For frequentists, that is the definition of probability. For Bayesians, it approximates the results of probability in the limit. For N independent trials of an experiment, the limit of the ratio (number of times outcome A occurs) / N is exactly the probability of A on any one trial.
friartucksduck wrote:Two coins are heads-up, one tails-up. A head is removed and you have to pick a coin at random. You have a fifty percent chance of picking heads, whether or not you had your heart set on picking one of those two before the third was removed. It's exactly the same in the Monty Hall Problem: your first choice did not actually affect the result any more than your desire to pick a particular coin.
Not the same at all. The same as the Monty Hall problem is as follows: two coins are heads-up, one tails-up. One is picked at random; it cannot be removed. A head is removed. Now if you had to pick a coin at random, the odds would be 50/50 of getting the tail. But you don't have to pick one at random. Why would you choose to just forget which one was chosen to not be allowed to be removed? It's information that you have, so use it! Since you don't have to pick one at random, which do you choose? 2/3 of the time the other one that could have been removed but wasn't will be the tail. I don't see why you choose to forget pertinent information.
friartucksduck wrote:1. xkcd actually only says that "in X repeated trials" fails if the trials are dependent, not that it is always to be used otherwise. Moreover, the trials he refers to are parallel only to each contestant, not to each guess.
He said that because, as I said at the beginning of this post, "for N independent trials of an experiment, the limit of the ratio (number of times outcome A occurs) / N is exactly the probability of A on any one trial."
2. These are not the same scenario. If they were, xkcd says the answer is 2/3. Since they're not, that doesn't mean the answer is 1/2.
friartucksduck wrote:3. You say here that the initial round trials are completely independent from the final round trials. This means that the only round affecting the final result is the second round, in which two options are available, each of which has an equal chance, in a single scenario, of being correct, thus demonstrating my point.
Nope. I say here that repeating the entire experiment again is independent of the first experiments. Obviously the beginning and end of the experiment are not independent. One more time: if you repeat the entire Monty Hall experiment 30000 times and switch each time, and end up winning about 20000 times, then your probability of winning in a single instance of the Monty Hall experiment is with extremely high confidence very close to 2/3.
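The 30000-trial experiment described above is easy to run yourself. A minimal Python sketch (mine, assuming the standard rules: Monty always opens a goat door the player didn't pick):

```python
import random

random.seed(42)
trials = 30_000
wins = 0
for _ in range(trials):
    car = random.randrange(3)
    pick = random.randrange(3)
    # Monty opens a goat door the player didn't pick
    monty = random.choice([d for d in range(3) if d != pick and d != car])
    # Always switch to the single remaining unopened door
    final = next(d for d in range(3) if d != pick and d != monty)
    wins += (final == car)

print(wins, "wins out of", trials)  # roughly 20000, i.e. about 2/3
```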
GENERATION 1i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.

 Posts: 41
 Joined: Mon Aug 14, 2006 7:10 pm UTC
 Contact:
posiduck wrote:Odds of guessing correctly on first guess: 1 in 3.
If you guess incorrectly and switch: you are guaranteed to win.
If you guess correctly and switch: you are guaranteed to lose.
So, if you always switch:
1/3 of the time: you guess right, switch, and lose.
2/3 of the time: you guess wrong, switch, and win.
Can someone who disagrees with the 2/3 solution tell me which part of my reasoning they find faulty?
friartucksduck seems to be disagreeing with the first line (Odds of picking a particular one out of a set of three are 1/3), but I'm holding out hope that this is all just a misunderstanding.
No sale, honcho!
One more lash to a nearly expired equine.
I don't think that anyone here has yet really given the crux of the problem. (I'm playing catch-up after a month away from the forum, so forgive me if I missed it.) Here is the NUGGET OF UNDERSTANDISM:
Monty Hall knows where the car is. Knowing is the probability killer. When he picks a door to reveal, the probability of him picking a booby prize is ONE. (There will always be at least one door to choose that does not have the car.) The probability of him picking the car is ZERO. However, that knowledge is not provided to the player during the first choice.
If you pick door A, the probability of A being a winning pick is: A = 33%. The probability of having chosen incorrectly is /A (not A) = 67%.
We'll say Monty picks B. Again, the key is that Monty, unlike the player, knows he will not pick the car. Thus, the probability of B = 0.
The trick is that the knowledge of what door Monty would reveal is unavailable during the initial choice. A is still 33%; /A is still 67%. Adding our newfound knowledge: the probability of /A AND B (the AND operator is equivalent to multiplication) = 67% * 0 = 0.
Since /A = B OR C (OR is equivalent to addition), C = /A - B = 67% - 0 = 67%.
Hmmm... now that I've typed it all out, I'm not sure that was as clear as I had hoped. Again, the keys are that the player's first choice is independent of Monty, and that Monty has a zero probability of choosing the car. (There is also the fact that Monty's choice depends on the player's first choice, but that just muddies things up before eventually cancelling out.)
If you still don't believe it, find one of the simulation scripts on the net and keep track of your own stats. If you think it's cheating, write your own!
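The arithmetic above can also be checked mechanically with exact fractions. A tiny Python sketch (mine; the variable names follow the /A notation used in the post):

```python
from fractions import Fraction

p_A = Fraction(1, 3)      # your door A holds the car
p_not_A = 1 - p_A         # the car is behind B or C (= 2/3)
p_B = Fraction(0)         # Monty opened B, and he never opens the car door
p_C = p_not_A - p_B       # all of /A's probability lands on door C

print(p_C)  # 2/3
```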
a slightly different look...
Let's say Monty presented you with 3 small bags, one of which contains the key to the car. He tells you to pick a bag. You pick one. Monty then takes the other 2 bags and puts them together into a 4th bag. Now he asks you to pick between the one you initially chose and the bag containing the other 2. If you came into the game at this point, and only saw the 2 bags, then your odds of picking the correct one would be 50-50; however, having come in at the beginning, you have additional information: you know that bag 4 has a 67% chance and your bag only has 33%.
Someone mentioned starting with higher numbers, but others dismissed this. I think this is an excellent way to push up the numbers to make it much more obvious. Let's take my example above, but have a wall with 100 bags, one of which has a key. You pick a bag, the remaining bags are all combined together, and you are given the option of keeping your selection or taking the big bag. There is still only one key, and 2 selections at this point, but it would be crazy to say you have a 50-50 chance of having the correct bag with your initial choice.
The 50-50 argument is only valid if you ignore information you learned in the first round of the game. The odds of your initial choice don't change; the odds of the non-chosen bags are added together. In the original game, each bag has a 1/3 chance, so the 2 added together end up with a 2/3 chance of winning; your choice stays at 1/3.
One final look at these examples: the host removes empty bags from the big bag until there is only one bag left inside. Have the odds suddenly dropped to 50/50? Of course not.
Oh, on a side note, I think posiduck stated the simplest solution to this I have ever seen: short, clean, easy to understand.
 Binary.Tobis
 Posts: 34
 Joined: Mon Sep 25, 2006 4:35 pm UTC
 Location: Anchorage, Alaska
 phlip
 Restorer of Worlds
 Posts: 7572
 Joined: Sat Sep 23, 2006 3:56 am UTC
 Location: Australia
 Contact:
Two tactics I often use when people are confused by this problem:
Just knocking down the paradox, without giving a solution:
Probability says that if the result you're looking for is one result in a sample space of size n where the samples are all equivalent, the probability is 1/n: the chance of getting heads is 1/2, the chance of rolling a 6 is 1/6, etc. Heads and tails on a coin are equivalent mathematically; they're just labels for the sides of the coin.
However, the doors in the Monty Hall game are not equivalent. Say you choose door 1 and Monty opens door 2, leaving you the choice of doors 1 or 3. They are not equivalent: Monty could not have opened door 1, but he could have opened door 3 and chose not to (either by choice, or because it hides the car). This makes the doors distinct mathematically, and just saying "there's 2 choices so it's 50/50" doesn't apply. You have to look deeper.
Another way to think of it is to introduce some kind of separation between the door you chose and the doors you didn't. Which, unless you have magical doors on wheels, means dropping the doors and using some other analogy.
Like a deck of cards. Say I have three cards, one with a picture of a car and two with pictures of goats. You choose a card from this deck (but don't look at it), and I keep the other two. The chance that the car is in your hand is 1/3, and the chance it's in mine is 2/3. This is simple.
Now, you know at least one of my cards has a goat on it  so I turn it over on my side of the table. This doesn't tell you anything about which side of the table the car's on, it's still a 1/3 chance it's on yours and a 2/3 chance it's on mine. But because I've shown you one of my cards, the other one has a 2/3 chance of being the car.
If that's not enough, increase the number of cards, say to a full 52-card deck, trying to find, say, the Ace of Spades. You pull one random card from the deck, and I take the remaining 51 cards. If I have the Ace of Spades (a 51/52 chance), I pull it out and show you the other 50 cards. Otherwise, I pick one at random and show you the other 50. Since I had a 51/52 chance of having the Ace of Spades to begin with, I have a 51/52 chance of holding the Ace of Spades and showing you the rest of the deck, and a 1/52 chance of holding some random card. If you were asked "do you want to keep your card or swap for mine", you'd of course choose to swap. Since showing you the cards doesn't give you any more information about my side of the table as a whole, the question "do you want to swap?" is equivalent to "do you want to give me your card, and I'll give you the Ace if I have it?". Since I have a 51/52 chance of having it, swapping will give you a 51/52 chance of winning.
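The card version generalizes to any deck size. Here is a short Python sketch (my own, assuming the dealer always keeps the ace if he holds it and reveals the rest; since the reveal carries no information about which side of the table the ace is on, swapping wins exactly when the first pick missed):

```python
import random

def swap_win_rate(n_cards, trials=20_000, seed=1):
    """Estimate how often swapping wins in the n-card variant."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        ace = rng.randrange(n_cards)    # where the Ace of Spades really is
        pick = rng.randrange(n_cards)   # your blind first pick
        # The dealer keeps the ace if he holds it, so swapping wins
        # exactly when your first pick missed the ace.
        wins += (pick != ace)
    return wins / trials

print(swap_win_rate(3))   # close to 2/3
print(swap_win_rate(52))  # close to 51/52
```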
Code: Select all
enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵╰(ಠ_ಠ ⚠) {exit((int)⚠);}