Talk:Monty Hall problem/Archive 1
This is an archive of past discussions about the Monty Hall problem. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
The problem with this article is that it makes it seem like switching is not really the right answer. This is because of language like "standard" and "classical". It makes it seem like there is some controversy over whether switching really is right. And the "Assumptions" section is confusing and irrelevant. If anything, it gives the reader false hope that the odds could be 1/2. This is a math problem; there is no POV/NPOV involved. Some people are being really childish here.
- (It's not generally a good idea to stick your new comment at the top of the discussion page, pre-empting the months of discussion that have taken place before. It's not even tactically a good idea, as most Wikipedia editors will assume that new stuff is added on the end, and may not notice your new comment, especially if you don't sign it with a date-stamp; that is, ~~~~)
- Now, as to the substance of your comment: The reason for the wording about the "classical" answer and so on is that it is not by any means a pure mathematical problem. As explained with great care in the text, the problem as normally stated involves a whole raft of assumptions, without which the answer would be different. If it were stated as a pure mathematical problem, with the conditions outlined properly, then it would indeed be simply a math problem. But that would not be the "standard" or "classical" problem.
- A person who does not know the practices of the TV show—in fact, the practices as modified for this problem—will not necessarily get the "right" answer no matter how good his math is. That's the reason for the Assumptions section. If you find that section not to be clearly written, by all means edit it.
- (Perhaps you would not so easily conclude that someone is being childish if you read the discussion here before prefixing it with your comment.) Dandrake 08:27, Jul 15, 2004 (UTC)
Should you switch?
For the original choice, there are these possibilities:
1 2 3
C G G
G C G
G G C
So whatever door you choose, your chances are 1/3.
For the second choice, call the door you originally chose door 1, and the door Monty opens door 3. Now the possibilities are:
1 2 3
C G G
G C G
(We know that GGC is not a possibility because Monty showed us a goat behind door 3.)
So if you choose either 1 or 2, you have a 1/2 chance of winning. Which means it doesn't improve your chances to switch.
Now, what's wrong with this argument?
- The two cases CGG and GCG are not equally likely. In fact, GCG is twice as likely as CGG. This is obscured by your relabeling the doors after the fact so that the door Monty opens always gets number 3. You may want to write down all 9 possible equally likely scenarios: the car can be behind one of three doors, the contestant can choose one of three doors, makes 3x3=9. Then analyze how the sticker does (he wins in 3 of the 9 cases) and how the switcher does (he wins in 6 of the 9 cases).
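For anyone who wants to check that count mechanically, here is a minimal Java sketch (the class name is just illustrative, not anything from the article) that enumerates the nine equally likely (car position, first pick) combinations and tallies the sticker's and the switcher's wins:

public class NineCases {
    public static void main(String[] args) {
        int stickerWins = 0, switcherWins = 0;
        // enumerate the 3 x 3 = 9 equally likely (car, pick) combinations
        for (int car = 1; car <= 3; car++) {
            for (int pick = 1; pick <= 3; pick++) {
                if (pick == car) {
                    stickerWins++;   // sticking keeps the car
                } else {
                    switcherWins++;  // Monty must open the other goat door, so switching lands on the car
                }
            }
        }
        System.out.println("Sticker wins " + stickerWins + " of 9 cases");   // prints 3
        System.out.println("Switcher wins " + switcherWins + " of 9 cases"); // prints 6
    }
}

It is only a restatement of the count above: the sticker wins in the 3 cases where the first pick was the car, the switcher in the other 6.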
Okay, so let's not renumber. Call the door you originally pick number 1, and indicate the door Monty opens with parentheses. Then we have the following possibilities:
1   2   3
C  (G)  G
C   G  (G)
G   C  (G)
G  (G)  C
So, if you stay with 1, your chance of winning is 1/2. If you switch to 2 or 3 (whichever remains unopened), your chances of winning are also 1/2. So it doesn't improve your chances to switch.
Or does it?
- Again, the four cases you list are not equally likely. The first two are equally likely, but they are each half as likely as the third and fourth case. If the door you originally pick is number 1, then the equally likely scenarios are CGG, GCG and GGC. Each of them has probability 1/3. The first case splits into the two equally likely cases C(G)G and CG(G), each with probability 1/6. The two cases GC(G) and G(G)C both keep probability 1/3.
Don't forget that the two goats are distinct. There are six permutations for item placement:
1   2   3
C   g1  g2
C   g2  g1
g1  C   g2
g1  g2  C
g2  C   g1
g2  g1  C
There are three door choices the player can make, yielding 18 possible outcomes:
Player picks door 1:
1   2   3
C   g1  g2
C   g2  g1
g1  C   g2
g1  g2  C
g2  C   g1
g2  g1  C
Player picks door 2:
1   2   3
C   g1  g2
C   g2  g1
g1  C   g2
g1  g2  C
g2  C   g1
g2  g1  C
Player picks door 3:
1   2   3
C   g1  g2
C   g2  g1
g1  C   g2
g1  g2  C
g2  C   g1
g2  g1  C
At this point, the player has "lost" 2/3 of the time. If the player changed the door selection now, the choices would be between two doors, like this:
C   g1
C   g2
g1  g2
g1  C
g2  g1
g2  C
Choosing either column 1 or column 2 still gives a 2/3 chance of losing, so switching doors won't accomplish anything.
Monty now eliminates an un-chosen goat (if there are two to choose from, it doesn't matter which one he chooses, so I'll always choose the rightmost). Note that in 4 out of the 6 cases, Monty is "forced" to give the player a car if he switches, and this is the key -- Monty's choice is not random. The resulting column for a switch is:
C
C
g1
C
g2
C
Here's an expanded chart, where each block is headed by the player's first choice and the eliminated goat is replaced with a hyphen. I've put a "#" where a switch wins, and a "-" where a switch loses.
Player's first choice: door 1
1   2   3   win/lose
C   g1  -   -
C   g2  -   -
g1  C   -   #
g1  -   C   #
g2  C   -   #
g2  -   C   #
Player's first choice: door 2
1   2   3   win/lose
C   g1  -   #
C   g2  -   #
g1  C   -   -
-   g2  C   #
g2  C   -   -
-   g1  C   #
Player's first choice: door 3
1   2   3   win/lose
C   -   g2  #
C   -   g1  #
-   C   g2  #
g1  -   C   -
-   C   g1  #
g2  -   C   -
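Here is a rough Java sketch (names are illustrative) of the same 18-case count: six arrangements of the car and the two distinct goats, three possible first picks, and Monty opening the rightmost un-chosen goat door, just as in the chart above. It should report a switch winning in 12 of the 18 cases, matching the twelve "#" rows:

public class EighteenCases {
    public static void main(String[] args) {
        // the six arrangements of C, g1, g2 behind doors 1-3
        String[][] arrangements = {
            {"C", "g1", "g2"}, {"C", "g2", "g1"},
            {"g1", "C", "g2"}, {"g1", "g2", "C"},
            {"g2", "C", "g1"}, {"g2", "g1", "C"}
        };
        int switchWins = 0, total = 0;
        for (String[] doors : arrangements) {
            for (int pick = 0; pick < 3; pick++) {
                // Monty opens the rightmost un-chosen door that hides a goat
                int opened = 0;
                for (int d = 2; d >= 0; d--) {
                    if (d != pick && !doors[d].equals("C")) { opened = d; break; }
                }
                int switchTo = 3 - pick - opened; // the remaining closed door
                if (doors[switchTo].equals("C")) switchWins++;
                total++;
            }
        }
        System.out.println("Switching wins in " + switchWins + " of " + total + " cases"); // 12 of 18
    }
}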
But why count the two goats separately? What if there weren't any goats, just doors left empty - that wouldn't change the probabilities, would it?
Suppose Monty doesn't open the door, but after your initial choice he still gives you a chance to "switch", meaning you can either stay with your original choice, or if you switch you'll win if the car is behind either of the other two doors. Obviously you want to switch, because then you have two doors, with two chances to win, as opposed to only one door, one chance to win, if you don't switch. But what if Monty opens one door, with a goat, before you decide? That means one of the other doors isn't really a chance to win. Doesn't that change things? Doesn't it mean that it's more likely that the car is behind each of the other doors than originally thought? Doesn't it make it more likely that your original choice was a winner? Aren't the odds now 1/2 rather than 1/3?
If Monty opens a door, then yes, the chances for that door may change.
It depends on what Monty does when the car is behind the door you first chose: whether he opens each of the two empty doors with equal chance.
Before Monty opens a door and you decide to switch, yes, your chances over all trials are 2/3, but get this:
Suppose the car is placed and you choose door 1. If Monty decides to open door 2 with a certain frequency over door 3 when the car is behind door 1, and makes this determination before he asks for your decision to switch and before he opens a door, then your chances given that door 2 is opened are set before any door is open. Your chances given door 2 are set, your chances given door 3 are set, and the average is set (2/3).
Now suppose two people play the game: one chooses door 1, one chooses door 2, and door 3 is opened. Aren't the chances for each player to switch equal? No, because one door was chosen that Monty could not open, so the situation is not symmetric. One player will gain, the one who goes from the protected door to the unprotected door, and the other will lose chances, the one going from the unprotected to the protected door.
Well, suppose Monty opens both of the other doors, and they're both goats. Still think you should switch? Doesn't showing both goats behind the other doors mean that your original choice is a sure winner? That the odds for not switching are now 1.0, not 0.33? So why wouldn't showing the goat behind one of the other doors also change the odds?
- The odds can't change because they are locked in when you first choose. When you choose a door with a goat (2/3 odds), Monty must show you the other goat, so the car will be behind the door you didn't pick. So 2/3 of the time, switching will get you the car.
- Here's another way to put it: you win if you pick a goat and then switch. The odds of picking a goat are 2/3, so if you always switch you win 2/3 of the time.
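If a reader would rather see that shortcut run than argue about it, here is a small Monte Carlo sketch in Java (the class name and the trial count are arbitrary); it relies on exactly the point above, that a switcher wins precisely when the first pick was a goat:

import java.util.Random;

public class SwitcherSimulation {
    public static void main(String[] args) {
        Random rng = new Random();
        int trials = 100000;
        int switcherWins = 0;
        for (int i = 0; i < trials; i++) {
            int car = rng.nextInt(3);   // door hiding the car, 0-2
            int pick = rng.nextInt(3);  // player's first pick, 0-2
            // if the first pick is a goat, Monty has to show the other goat,
            // so switching lands on the car
            if (pick != car) switcherWins++;
        }
        System.out.println("Switcher won " + (100.0 * switcherWins / trials) + "% of trials"); // about 67%
    }
}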
Here's another question. Suppose while you're deciding whether to switch, another contestant is brought in and Monty lets them choose one or the other of the two closed doors. Aren't their odds of winning 50-50 for either choice?
- No. If they choose my original door, they win with 1/3 probability; if they choose the other one, they win with 2/3 probability.
I know you think this, but why? There are just two doors, and the car is behind one or the other of them. Why doesn't that make the odds 50-50 for the new contestant?
- Because the situation is not symmetric. The new contestant knows something, and that information helps.
WRONG!! The second contestant knows nothing but the chances are still asymmetric anyway. Why?
Look, one box is protected no matter who is watching. When two doors remain, there is a protected door and an unprotected door. Monty could never open the protected door, but could have opened the unprotected one, so the chances are unequal, NO matter who walks in or out. It doesn't matter what the second contestant knows or doesn't know. The fact that the second contestant "knows nothing" merely means that he will probably pick either door with equal probability, and half the time get the 33% door and half the time the 66% door, for an average winning percentage of 50%.
Consider this: Monty has three boxes, and in one of them is a hundred dollar bill. You don't know where. But then Monty says: OK, I'll help you, point out two boxes and I'll gladly combine their contents into a new box. You point out two, and he does as he said. So now there are two boxes left, the new one and the untouched one, and one of them has the bill. But the odds are not 50-50: you know that the bill is twice as likely to be in the new box as in the untouched one.
Suppose they don't know which of the two doors I chose. Then aren't their chances 50-50?
- Sure.
BTW, the reason I'm raising these questions is to try to suggest why the Monty Hall problem is a "problem". It wouldn't be a problem if the answer were obvious...
- Consider the extreme case: there are 1000 doors, one has a car. I pick a door, then Monty opens 998 loser doors. Now you come in. Do you pick my door or the other one?
Yes, that is the argument that most convinces me that there is an advantage to switching. But how can we be sure that this result applies to the 3 door case? Certainly the benefit of switching is very much higher in the 1000 door case than in the 3 door case. It obviously decreases as the number of doors decreases. Maybe the advantage vanishes altogether in the 3 door case?
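One way to answer that is to make the rules explicit: assume there are n doors, one car, and Monty always opens n - 2 goat doors from the ones you did not pick (the three-door rules, just scaled up). The familiar "switch wins exactly when the first pick was a goat" argument then gives

P(\text{stick wins}) = \frac{1}{n}, \qquad P(\text{switch wins}) = \frac{n-1}{n}.

For n = 1000 that is 1/1000 against 999/1000; for n = 3 it is still 1/3 against 2/3. The advantage shrinks as the number of doors drops, but it never reaches even odds.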
Suppose they choose the door you originally chose. Aren't their chances of winning the same as your chances if you choose not to switch?
- Yes.
I think this is the most surprised I've ever been by the output of a program I wrote myself. I was sure it was going to come up 50/50. (Empirical Proof)
- Me too, with this PHP script here - very surprised indeed. Now my brain hurts. --Dan Huby 08:57, 15 Jul 2004 (UTC)
- If you're still not convinced after seeing the results of your program, I have a poker game on Thursday nights you're welcome to join... --LDC
Here's a similar problem in the game of Texas hold 'em that many people don't believe at first: I offer you your choice among three two-card hands: (1) a pair of fours, (2) an ace and a king of different suits, (3) a ten and a jack of the same suit. After you choose, I choose one of the remaining hands. Then we deal five cards face up, and the player who can make the best five-card poker hand with any combination of his two cards plus the five on the board wins. Which cards do you choose, and what is the expected outcome? It turns out that whoever chooses first loses. If you choose the A-K, I choose the 4-4 and have a slight advantage. If you choose the 4-4, I choose the 10-J and have a slight advantage. If you choose the 10-J, I choose the A-K, and have an advantage!
This is an interesting and helpful article but it should be completely rewritten; it doesn't sound like an encyclopedia article to me. The link to the Perl program is interesting, too, but it's original research. What place does that have in an encyclopedia?
I take the rather abstract view of an encyclopedia as something which explains everything. Thus I find the article actually quite accessible and helpful, written in a friendly direct way. The perl code is perhaps less useful to someone wanting to understand the problem in an article-reading sort of way, but I think of it as sort of a multimedia addition - a working Monty Hall Problem machine to tinker with. -J
I moved the following explanation from the article:
In short, the reason for the above result is that Monty will ALWAYS show the other goat. Look at the following situation with C standing for Car, G for Goat, X for the goat that Monty picked, and with the player always picking the first door:
Initial   Monty   Result
CGG       CGX     Sticker wins and switcher loses.
GCG       GCX     Sticker loses and switcher wins.
GGC       GXC     Sticker loses and switcher wins.
As can be seen above, the chance for the switcher to win is twice that of the sticker.
I believe this is not a compelling argument; it will only convince people who already believe the result or don't follow closely. If I were to criticize the argumentation, I would point out that the case
CGG CXG Sticker wins and switcher loses
is missing. So this makes two cases in favor of the sticker and two cases in favor of the switcher: 50-50. To counter this objection, one would have to argue lengthily that the four listed cases are not equally likely, that in fact the first and the fourth case both have probability 1/6 while the second and the third both have probability 1/3. But why? And so on.
The crucial and convincing argument is the one given in the article: a switcher wins the car if and only if his original choice was a goat. Only if the reader understands that point will they truly be convinced. AxelBoldt, Wednesday, July 3, 2002
What's the connection with Three card monte? The other seems to be a confidence trick rather than a probability trick? DJ Clayworth 17:08, 27 Aug 2003 (UTC)
Hoax
The article is wrong. Switching will do nothing to change the odds. The past has no effect whatsoever; all one needs to know, is the current state of affairs. That means that there is a 50/50 chance of getting a goat. Furthermore, doors don't know if they are picked; It is only in the human mind that you lay "ownership" to one or the other. So "switching" is in the mind of the human only. This article is a hoax. ChessPlayer 17:18, 8 Mar 2004 (UTC)
- The article is absolutely correct as written, and your comment demonstrates why the problem is so famous. It's not a hoax, it's just counterintuitive. Monty introduces information when he opens a door, because he always opens a door that has a goat. I would explain further, but this is already rehashed here on the talk page and in the article itself. Isomorphic 17:32, 8 Mar 2004 (UTC)
This is the simplest way to put it I think: you have a 1/3 chance of picking the car, so you have a 1/3 chance of winning by sticking with your choice.
As Isomorphic says, the article is correct. I saw this problem twenty years ago, and it took me a very long time to be sure that the solution given here is the correct one. You're in good company if you don't believe the answer at first, but it is right.
There are plenty of different ways of approaching this problem, and I recommend just searching the various places where solutions are given. Another way is to imagine a version with 1000 doors, in which Monty shows you 998 goats after you choose.
On the subject of 'things don't know they are picked', no they don't. But probabilities change with the amount of information you have. For example: I throw two coins and, without showing them to you, tell you that one of them is heads and one of them is tails (assume that you can rely on me telling the truth; assume I don't change the coins' positions at all). What is the probability of the left hand one being heads? Yes, 50%.
Now I show you the right hand one, and it is tails. Now what is the probability of the left hand one being heads?
The right hand coin doesn't know it's been shown, and the left hand one doesn't know anything about the right hand one. Nonetheless the probability changes.
DJ Clayworth 17:35, 8 Mar 2004 (UTC)
- The situation is, there are two doors; behind one is a goat. Therefore the odds are fifty-fifty, and choosing either door doesn't matter. ChessPlayer 20:14, 8 Mar 2004 (UTC)
- In the coin example I gave above there are also two possibilities; Things are not always as simple as they seem. DJ Clayworth 14:22, 9 Mar 2004 (UTC)
Here are some web pages that might help:
- http://math.ucsd.edu/~crypto/Monty/monty.html
- http://astro.uchicago.edu/rranch/vkashyap/Misc/mh.html
- http://www.math.uah.edu/statold/games/games6.html
- http://www.math.hmc.edu/funfacts/ffiles/20002.6.shtml
- http://mathforum.org/mathtools/tool.html?co=ps&new_id=832
- http://www.statslab.cam.ac.uk/~steve/Teaching/Monty/
- http://www.cmh.edu/stats/faq/faq13.asp
Don't forget the 'assumptions' section of the article. If Monty is deliberately trying to mislead you, or doesn't always open a door, then the solution is different. DJ Clayworth 15:30, 9 Mar 2004 (UTC)
- Like everyone, I had trouble wrapping my head around this. So I broke out a deck of cards and played it myself. I used a red King as the car and two black deuces as the goats. I played a few "hands"; each time I picked a random card and turned it over, so unlike the game show I could see whether my initial pick was correct immediately, but that didn't matter since I was always switching. Then I'd find a deuce and treat it as the door that Monty opens, and then I'd switch to the remaining card and see whether I won or lost. Rather quickly I noticed something: The moment I randomly picked the card and flipped it over, I could tell instantly whether switching would cause me to win or lose, because when I picked the King (the car), I knew that switching would cause me to lose (because I'd have to switch to a deuce; there's only one King). And conversely, whenever I picked a deuce, I knew that switching would cause me to win (because the only card to switch to will be the King after the other deuce is removed by Monty). That's when it hit me: 1. Whenever I pick a deuce, switching will cause me to win. 2. Everyone knows that two out of three times I'm going to pick a deuce. Therefore: Two out of three times I'm going to win by switching! ... Interesting side note: In the late sixties my parents were on Monty Hall's game show in Vegas, and they won the car... although my mom knew which curtain to pick because she could just barely see the wheels of the car below the curtain... needless to say she did not switch. =o) - Eisnel 08:28, 15 Jul 2004 (UTC)
Featured status? Patch, patch
Now that this is nominated for Featured status, it's going to get more demands for improvement. Mav has objected to the intro, and I agree that it's insufficient. OTOH I don't like the oft-cited general principle that the intro needs to be a mini encyclopedia article in itself. For a puzzle article, this would mean putting the solution into the intro, and I think that's so much of a spoiler as to be a disservice to the reader, who needs a chance to think about the problem before seeing the solution. To give a problem fully and clearly (necessary before giving the solution, I claim) may be too much for an introductory paragraph. I'll try to submit a new intro if no one beats me to it. Dandrake 23:15, Mar 10, 2004 (UTC)
- I find this article much easier to understand now that the order of the sections has been shuffled. Could someone add the date of the problem occurring in Parade magazine to the quote? Is it 1990 like it said lower down (someone above on the talk page said they'd known of this problem 20 years ago though)?
- Looking at the rec.puzzles archives, 1990 looks about right for the Parade article, but the puzzle and the name 'Monty Hall' for it are much older. It looks like the Parade article may be the source of the goats, though. Matthew Woodcraft
- Indeed, it's descended from a problem that Martin Gardner—surprise!— wrote up in the 50s. I'll dig up the references in a day or two. They're in the Journal of Recreational Math paper, well worth seeing if you can find it. Dandrake 05:15, Mar 14, 2004 (UTC)
- The one thing I really had problems understanding was the first entry in the "aids to understanding" section (ironic, eh?). I had to reread whole sections before my brain could accept anything other than a 50/50 chance for the two doors left in that example.
- Here's what's in the article at the moment:
- It may be easier for the reader to appreciate the result by considering a hundred doors instead of just three, with one prize behind only one of the doors. After the player picks a door, Monty opens 98 doors with goats behind them. Clearly, there's now a very high chance (precisely 99/100) that the prize is in the other door Monty did not open.
- Here's my rephrasing.... any good?
- It may be easier for the reader to appreciate the result by considering a hundred doors instead of just three. In this case there are 99 doors with goats behind them and 1 door with a prize. The contestant picks a door; 99 out of 100 times the contestant will pick a door with a goat. Monty then opens 98 of the other doors revealing 98 goats and offers the contestant the chance to switch to the other unopened door. On 99 out of 100 occasions the door the contestant can switch to will contain the prize, as 99 out of 100 times the contestant first picked a door with a goat. At this point a rational contestant should always switch.
- I quite agree with your reaction, that the first clarification didn't clarify it. I like your version a lot better. (Please, a semicolon rather than a comma in the third sentence.) Dandrake 05:19, Mar 14, 2004 (UTC)
Please look over the changes. I think it's improved, but I may be prejudiced. Anyway, do keep the Bayes derivation in some form; I know for sure that there's one person who once found it the most convincing argument. I haven't put in Fabiform's fix, though I will if no one else does anything; once it's in, I think this qualifies for Featured status. Dandrake 18:19, Mar 14, 2004 (UTC)
Text from another version, posted under the title Monte Hall Problem. Charles Matthews 20:57, 10 Jun 2004 (UTC)
The “Monte Hall Problem” (Applied Probability)
On the old game show “Let’s Make a Deal”, the final segment involved two contestants who had won the most money that day. There were 3 closed doors with prizes hidden behind each door. One of the doors had the grand prize. The two contestants each picked a different door. Monte then went to one of the contestants and showed them what was behind the door they chose. However, Monte wanted to keep the level of suspense high until the final moment of the show, so this first contestant never won the grand prize (Monte knew which of the 3 doors had the grand prize so he could always pick a contestant who would not win). So Monte showed this first contestant their (lesser) prize and then he went to the second contestant. There were now 2 remaining doors and 1 contestant who had already picked a door.
The question is: What are the odds that this second contestant will win the grand prize? Most people will insist vehemently that the odds are 1/2 (1 choice out of 2 doors). However, that is not the case. If the person got to choose the door after the first contestant lost, the odds would be 1/2 but they had already picked the door. The odds are actually 2/3. Here is the rationale.
Obviously before the results are known, the odds of a given contestant winning are 1/3 (1 choice out of 3 doors). But think of the 2 contestants together as a “group”. Obviously the odds of the “group” winning are 2/3 (2 choices out of 3 doors). In other words, the odds of 1 of the 2 contestants winning the grand prize are 2/3.
When Monte first eliminates one of the contestants (which he can always do because there is always at least one loser), he hasn’t changed the odds for the “group”. So the remaining contestant now carries the odds of winning for the “group” which was 2/3. Thinking in terms of “conditional probability”, the odds of the group winning before the results are known are the same as the odds of the group winning given that one of the contestants is a loser (which we already knew before any results were known). So Monte has not provided NEW information about the group as a whole -- he has only confirmed the previously known fact that one of the two is a loser.
This may also be a difficult result to embrace. This is partly due to the fact that obviously the odds have changed for the second contestant. They previously had odds of 1/3 and now they have odds of 2/3. But while Monte provided no new information about the “group”, he did provide information about which person in the group was a definite loser.
This problem can perhaps be more easily comprehended when thinking about larger numbers of doors and contestants. If there were 100 doors and 99 contestants, Monte could always pick 98 losers and show them their doors. There would remain 2 doors and 1 contestant. But it might seem more obvious that the odds for that remaining contestant must be greater than 1/2 (the odds would be 99/100).
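For what it is worth, the conditional-probability step in the quoted text can be checked with Bayes' theorem, under the extra assumption that Monte picks uniformly between the two contestants when both of them have losing doors. Write D for the remaining contestant's door and R for the event that Monte reveals the other contestant first; the three terms in the denominator correspond to the car being behind D, behind the revealed contestant's door, and behind the un-picked door:

P(\text{car behind } D \mid R) = \frac{1 \cdot \tfrac{1}{3}}{1 \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3} + \tfrac{1}{2} \cdot \tfrac{1}{3}} = \frac{1/3}{1/2} = \frac{2}{3}.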
This article seems to be making an unstated assumption that the contestant wants the car and not a goat. Personally I'd go for the goat every time. Goatherd 21:09, 10 Jun 2004 (UTC)
So, according to this article, if someone chooses door #1, and Monty opens door #3 and it contains a goat, then the car is twice as likely to be behind door #2 as #1. However, if the contestant chooses #2 to start with, and Monty opens #3, then #1 is twice as likely to contain a car. Which is right? - Evil saltine 07:22, 15 Jul 2004 (UTC)
I think this article goes around the houses too much. The problem is extremely simple to understand if one looks at it this way: The initial probability that you will choose a goat is 2/3. If you initially choose a goat and switch you will win every time. If you choose a car and switch you will lose every time. Therefore switching increases the odds that you will win from 1/3 to 2/3. This version of the solution is not given until very late in the article. Mintguy (T) 07:53, 15 Jul 2004 (UTC)
- I agree with Mintguy that this is the easiest way to understand the problem. However, I think it's OK to present aids like this later in the article, because it makes for a more rewarding article if you spend some time on the rather confusing dilemma before giving the reader the nugget that will make everything clear. I love this problem so much just because it took me a while to finally come to grips with it. But I think that the version of this explanation given in the article isn't very clear, it could use some work. (BTW, I hope you don't mind that I corrected a typo in your comment) - Eisnel 08:37, 15 Jul 2004 (UTC)
I have really not got my head round this. If the host knows what is behind the doors and always picks a goat, then surely, whether you have picked a car or a goat, you are left with a 50/50 chance of picking the right one by sticking or switching?
1. You have a 2/3 chance of picking a goat and a 1/3 chance of picking a car.
2. Monty has a 100% chance of picking a goat, as he knows what is behind the doors.
3. You are finally left with a choice of two doors, one with a goat and one with a car, hence 50/50.
- I finally understood it from this description.. er.. when you start out, two doors have a goat and one has the car. If you pick either of the two with a goat, and you switch, you win. If you pick the one with the car, and then you switch, you lose. There's two goats and one car, thus double the chance of picking a goat and double the chance of winning rather than losing if you switch. Yes, there are two choices, but there's not equal probability because you're more likely to have picked the goat in the first place.
I understand this, but this ignores all the possible sequences of events, which are as follows, in order of picking (contestant, host, contestant):
- 1. Goat1 Goat2 Car (win) = 1/3 x 1/1 x 1/2 = 1/6
- 2. Goat2 Goat1 Car (win)
- 3. Goat1 Goat2 Stick (lose)
- 4. Goat2 Goat1 Stick (lose)
Total: 4/6. If the car is picked first, the host may pick either goat:
- 5. Car Goat1 Goat2 (lose) = 1/3 x 1/2 x 1/2 = 1/12
- 6. Car Goat2 Goat1 (lose)
- 7. Car Goat1 Stick (win)
- 8. Car Goat2 Stick (win)
Total: 2/6. AH-HA! Got it, thank you!
- Because the goats are indistinguishable, you have a 2/3 chance of getting one of scenario 1, 2, 3, or 4 on the first pick. You have a 1/3 chance of getting one of scenario 5, 6, 7, 8 on the first pick. Then Monte shows you a goat behind another door. Because Monte always shows you a goat, the probabilities of the game are determined only by the first pick — and the probability of success on your first pick doesn't change. If you picked it right the first time (1/3 probability), you can stick and win. If you picked it wrong the first time (2/3 probability), it is to your advantage to switch to the other door which originally had a 1/3 chance of winning. -- ke4roh 13:26, Jul 15, 2004 (UTC)
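The same totals can be read off as a conditional probability. This is only a sketch of the arithmetic in the list above, where the contestant is imagined flipping a fair coin between switching and sticking:

P(\text{win} \mid \text{switch}) = \frac{P(\text{win and switch})}{P(\text{switch})} = \frac{1/6 + 1/6}{1/2} = \frac{2}{3}, \qquad P(\text{win} \mid \text{stick}) = \frac{1/12 + 1/12}{1/2} = \frac{1}{3}.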
Is it not true that if one of the incorrect doors is eliminated as part of the procedure, there are effectively only two doors (one right and one wrong)? The third, for the point of probability, might as well not exist because it is opened before the final decision is made. The probability of any door (including the chosen one) is therefore changed, because the choice is not final when it is made. It is ludicrous to assume that simply because a door had been chosen before, that notwithstanding the third's opening, the probability is still one in three - especially since it was one in two to begin with (see up). Falcon 16:49, 15 Jul 2004 (UTC)
- You've almost answered yourself. The point is that the third door DID exist when you made your choice and thus the probability that you chose a wrong door is 2/3. Therefore, with only two doors left there is still a 2/3 chance that the one you picked has a goat, so you are therefore better to switch to the other door where there is only a 1/3 chance of it having a goat. The answer to Exile (below) takes you through this in more detail.
This still looks like a hoax to me.
There are 2 possibilities.
1. You have picked a goat. Chance - 50%
2. You have picked a car. Chance - 50%.
Obviously, Monty has improved your chances of getting the car from 33% to 50% by opening a door and showing that one of the two doors you have NOT chosen contains a goat. BUT WHETHER YOU CHOOSE TO "STICK" OR "SWITCH" MAKES NO DIFFERENCE TO THE OUTCOME!
Exile 21:07, 15 Jul 2004 (UTC)
- I can see where you're coming from. That's how I thought for a long time. The thing to remember is that while the number of possibilities has been reduced to two, the number of possibilities you originally picked from remains at three.
- Try this: at the beginning, the chance of you picking the car is 1/3 and therefore the chance of you picking a goat is 2/3. When Monty reduces it to two doors, nothing changes and the chance that you picked a goat is still 2/3. However, there are now only two doors remaining and, given that there's a 2/3 chance that the door you picked has a goat behind it, you're better off switching to the other door that only has a 1/3 chance of having a goat behind it and therefore a 2/3 chance that it has the car behind it.
- Does that help?
- Well, you clearly see the paradox, now all that's left is to understand the mathematics. The fact that Monte shows you a wrong door (which we knew he would do), does not change the original 1/3 probability that you picked the correct door, and it tells you nothing more about what's behind the door you originally picked. So, you still have a 1/3 chance of having picked the correct door the first time around. But now that you've seen a goat, and you know that you had a p=1/3 chance of getting it right on the first pick, you also know you have a 1-p=2/3 chance of getting it right if you switch doors. -- ke4roh 03:00, Jul 16, 2004 (UTC)
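For readers who like to see that done with Bayes' theorem, here is a compact version, assuming that Monty opens either goat door with equal chance when your first pick happens to be the car. Say you pick door 1 and Monty opens door 3:

P(\text{car behind 1} \mid \text{Monty opens 3}) = \frac{\tfrac{1}{2} \cdot \tfrac{1}{3}}{\tfrac{1}{2} \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3}} = \frac{1}{3}, \qquad P(\text{car behind 2} \mid \text{Monty opens 3}) = \frac{2}{3}.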
I'm no mathematician and I don't know much probability theory. Having read the article I understand why the 'standard' solution is correct, according to probability theory, but I am still not convinced it is the correct answer to the problem. The probability theoretical solution is based on re-running the problem many times and finding a strategy that maximises the number of wins. That's an assumption built into any probability theory approach. But it is different to the Monty Hall problem. The Monty Hall problem asks whether you should switch doors in this one particular event. The contestant doesn't get to try this a hundred times; he gets to choose exactly once. I don't think probability theory can say that for this particular event that occurs exactly once, switching doors increases the chance of a win in that specific instance.
In the real world, there's probably not a large enough number of shows to demonstrate that the switch strategy maximises the number of wins in the proportion predicted by probability theory. In that case the 'standard' solution is no use.
Even if there was a large enough number of shows, at each run a different contestant is faced with the choice to switch. If they all switch they maximise the number of winning contestants. I'm not sure this affects the individual contestant's situation, i.e. it does not increase their chance of winning in their particular show.
- If the probability of getting a certain result is N in each of several instances, then the probability of getting that result is also N in any single instance (note the tautology). What you're saying amounts to "Just because the odds of getting a 3 when rolling a die is 1/6 each time I roll, doesn't mean that the odds of me getting a 3 is 1/6 at any one time." Simoes 15:40, 24 Jul 2004 (UTC)
two envelope problem
I was talking to a friend about the Monty Hall problem today and he told me about a similar problem, and I don't think there is a Wikipedia article on it and I'm not sure how to present the solution either, but anyway here is the problem:
You are on a gameshow and the host holds out two envelopes for you to choose from, A and B. So you choose an envelope (A) and it's got $2000 in it. The presenter then says that one of the envelopes has twice as much money in it as the other one and offers you the chance to switch. So you think about it this way... "If I switch I will go home with either $4000 or $1000, by not switching I will go home with $2000. There is a 50/50 chance that I will double my money by switching. A normal 50/50 bet results in me either doubling my money or losing it all, whereas here I will only lose half. Therefore this is a better than evens bet, so I will make the swap." You are just about to swap envelopes when you think about the problem some more - "Surely this can't be right... ". Mintguy (T) 16:13, 15 Jul 2004 (UTC)
- Interesting, but not really similar. In the Monty Hall problem, there's just one possible positive payoff, so the only concern is maximizing your chance of getting it. This problem is more complicated. It is true that if you switch, the expected value of the new envelope is (1000*0.5 + 4000*0.5) = 2,500, so if all you care about is the average amount of money you'll take home, you should switch. However, most people in the real world are risk averse, meaning that they may prefer the sure 2,000. Isomorphic 02:58, 16 Jul 2004 (UTC)
Thank you!
I'd never managed to get my head round this before (I knew it was correct, but I couldn't work it out myself), but after reading this article I finally understand it, so thank you to all involved (the aids to understanding were particularly helpful)!
Java program to compute chances of winning
My friend and I were arguing about whether the Monty Hall problem solution was true. He didn't believe me, so he made this program in Java to calculate the chances of winning the car if you switch or stay. It was coded rapidly so it might not be the most efficient software. Even though it was written fast, it works perfectly well. I didn't find how to put the whole source in a box without removing the line breaks (sorry); if you know how, leave me a message. If someone can compile this into an applet and host it somewhere, we would really appreciate it. --Jcmaco | Talk 03:39, Jul 18, 2004 (UTC)
import java.io.*;

public class Paradox {
    public static void main(String[] args) {
        int choisie, ouverte, gagnante, nbFois = 1000;
        double reste = 0, change = 0;
        double temp;
        for (int i = nbFois; i > 0; i--) {
            // place the car behind a random door (1-3)
            temp = Math.random() * 3;
            gagnante = (int) Math.floor(temp) + 1;
            // the player picks a random door (1-3)
            temp = Math.random() * 3;
            choisie = (int) Math.floor(temp) + 1;
            // Monty opens a door that is neither the winning door nor the chosen one
            do {
                temp = Math.random() * 3;
                ouverte = (int) Math.floor(temp) + 1;
            } while (ouverte == gagnante || ouverte == choisie);
            System.out.println(gagnante + "\t" + choisie + "\t" + ouverte);
            if (gagnante == choisie)
                reste++;   // staying would win this trial
            else
                change++;  // switching would win this trial
        } // end for
        System.out.println("reste: " + reste + "\tchange: " + change);
        double resteP = reste / nbFois * 100;
        double changeP = change / nbFois * 100;
        System.out.println("reste %: " + resteP + "\tchange %: " + changeP);
    } // end main
} // end class
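If anyone wants to try it without turning it into an applet: assuming a JDK is installed, saving the code as Paradox.java and then running "javac Paradox.java" followed by "java Paradox" from the command line should print the 1000 individual trials and then the totals, with "reste" (stay) coming out near 33% and "change" (switch) near 67%.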
Removed Assumptions
I decided to be a little bold and remove the assumptions section. If you read the problem as stated, then you don't need to make any of the assumptions listed. E.g., it doesn't matter whether or not you assume "Monty always opens a door," or that there is "always a goat behind the door Monty opens," because the problem clearly states that Monty opens a door and reveals a goat. That is all you need to know in order to determine the correct answer. Other points in this section also showed an incorrect understanding of the problem. --Simoes 15:53, 24 Jul 2004 (UTC)
- No, you entirely misunderstand. He is known to open the door this time, as you say, but calculating the probability requires knowledge, or assumption, about what happens in general. What, for instance, was the probability of picking an Ace out of a deck of cards, if you know that I took one, and it turned out to be an Ace? Maybe you think it's 1 in 13, but in fact I was using a pinochle deck. The specific probability calculation requires general knowledge of the conditions. For more detail, showing how the assumption is necessary for this problem, read the assumptions section, which I am restoring to its rightful place. However, if it is unclear, let us by all means fix it. Also, you have not addressed others, such as the obvious matter of the value of a goat. Dandrake 22:31, Jul 24, 2004 (UTC)