
[BTW: Pardon my dust. I swapped hosts from GoDaddy to Site5. Our long national nightmare is over.]

Riley Lark:

Probability only works on large scales, and then it really only approaches working. The kids all get the basic stuff, like coin flips having 50/50 odds; they all don’t get the more complicated stuff like standard deviation. They lack the tools for problems with continuous distribution, and as far as I can tell that leaves us with carnival games and card tricks. Which only work on average. By being my interested, lively self during class I managed to interest half of them for most of the time (what are the odds that you’re interested in this question, Johnny?). But it was a hard unit for me.

And then he kills the probability lesson anyway. It’s like watching Danny Ocean explain all the reasons why the safe positively cannot be cracked before shrugging his shoulders and cracking the safe anyway. And, make no mistake, Riley is breaking into the Fort Knox of probability problems here with Monty Hall, a problem that causes cranial hemorrhaging even among professional mathematicians.

I’m grateful I have just enough classroom expertise to appreciate what a thing of beauty it is when Riley draws fifty marks on the board behind him, a subtle classroom action that’s the equivalent in precision and style to Magic Johnson flicking a no-look bounce pass between three defenders for the assist.

36 Responses to “The Trouble Teaching Probability”

  1. on 28 May 2010 at 12:23 pm Matt

    Hi Dan,

    Love the website; I found you through your TED talk. I’m a mathematics instructor beginning my career at the post-secondary level, and I intend to dedicate my career to improving math education.

    I think some of your ideas are wonderful, in particular bringing context to math. I don’t believe students learn or retain information unless they sense a broader relevance. I volunteer my time in high schools giving a talk entitled ‘When Will We Ever USE This Stuff?!’ where, over the course of an hour, we tackle fun, interactive math problems and I try to convince students that fun, exciting careers exist for those with mathematics backgrounds.

    Although at the post-secondary level I find issues with textbooks the least of my concerns, I hope that devices like the iPad may allow an aggregation of problems such as your container filling with water, and may give students the ability to interact with the video to do, as you put it, ‘the leg work’.

    Look forward to reading further,
    -Matt

  2. on 28 May 2010 at 12:24 pm Tim Hunt

    Mention of the Monty Hall problem prompts me to ask if you have seen this one:

    “I have two children. One is a boy born on a Tuesday. What is the probability I have two boys?”

    Regrettably, the place I saw that was in connection with the New Scientist’s coverage of Martin Gardner’s death http://www.newscientist.com/article/dn18950-magic-numbers-a-meeting-of-mathemagical-tricksters.html. A sad loss, but a nice problem.

  3. on 28 May 2010 at 12:25 pm corn walker

    I’ve never had this question about the Monty Hall problem explained to me properly. Why is the second choice opportunity NOT considered an independent event? In the first choice, you have no knowledge about any of the doors and choose a door at random. The probability that the door you chose has a car is 1/3.

    However, between the first and second choices you’re given knowledge about one of the doors you didn’t choose. Because that door has a goat, that door is removed from consideration. Revealing one of the unchosen doors doesn’t confer the combined probability on the remaining unchosen door for the second choice; at this point we must restate the problem as “there are two doors, behind one of them is a goat and behind the other a car. Choose one.”

    Just because they’re two of the three doors you considered in the first choice doesn’t mean the second choosing event is dependent upon the first. Consider the null event where you didn’t choose any doors, and Monty Hall shows you a door with a goat. Does that not leave you in exactly the same place, choosing between two remaining doors only one of which contains a car?

    When I first saw the proof that switching doors was better than staying with the first the math seemed sound to me. On reflection I think that proof has been tripped up on a semantic blunder in the statement of the problem, thereby creating the seeming “paradox.” I’d be most pleased if someone can show me where I am errant in my logic here.

  4. on 28 May 2010 at 12:25 pm Riley

    “Does that not leave you in exactly the same place, choosing between two remaining doors only one of which contains a car?”

    But you probably chose the wrong door the first time, right? That’s where the extra information comes from. 2/3 of the time you’ve already chosen the wrong door. That’s why switching is better. There is no “transferring” of probability. It’s just not true that the two remaining doors have an equal chance of hiding the car, and that’s the error in your logic.

    Thanks for the props, Dan. I’m glad you can appreciate the cinema of whirling around and drawing fifty marks (I counted aloud quickly ;)). Hey Dan’s readers, welcome to my blog! Please subscribe!

  5. on 28 May 2010 at 12:25 pm Todd

    Tom, just sent this to Dan before I read these comments. We made a video about that very problem:

    http://www.youtube.com/watch?v=2DN4uPvPEgU

    Enjoy,
    Todd

  6. on 28 May 2010 at 12:26 pm Todd

    Oops – when I say Tom I, of course, mean Tim.

    T.

  7. on 28 May 2010 at 12:26 pm corn walker

    We knew without him revealing the goat that 2/3 of the time I initially chose the wrong door. But who cares what I chose the first time? The statement, as I have seen it posed, suggests that if you switch doors when given the chance, you will “win” 2/3 of the time.

    “It’s just not true that the two remaining doors have an equal chance of hiding the car, and that’s the error in your logic.”

    The error in my logic is what exactly? Here’s the initial setup: The probability that the car is behind door 1 is 33%. The probability that the car is behind door 2 is 33%. The probability that the car is behind door 3 is 33%. I choose door 1. Door 2 is revealed. Is the probability that the car is behind door 3 now 66% for my second choice?

    Consider an explanation of the setup from a different perspective. Monty is always going to reveal a door without the car. When door 1 is revealed there are the following two possibilities:

    1) Door 2 has the car
    2) Door 3 has the car

    When door 2 is revealed there are the following two possibilities:

    3) Door 1 has the car
    4) Door 3 has the car

    When door 3 is revealed there are the following two possibilities:

    5) Door 1 has the car
    6) Door 2 has the car

    Regardless of what door is revealed, after it is revealed the probability of the car being behind either of the remaining two doors is equal. It simply doesn’t matter which door I initially chose because at the point that I am making my second choice this is the scenario that confronts me. So while in my initial choice I had a 2/3 probability of not choosing the car, after Monty reveals a door with a goat in my new choice I have a 1/2 probability of not choosing the car. Once revealed it’s like the third door never even existed.

    Let’s say there were 10 doors, and I chose door 1. The probability that I chose the wrong door is 90%. Monty then reveals doors 2 through 9 have goats. Is the probability that door 10 has the car 90% and I should switch? How about 100 doors, and Monty reveals doors 2 through 99 have goats. While the probability that I initially chose the correct door is 1%, after the 98 doors are revealed do I have a 99% chance of winning if I switch?

    I’m saying the addition of knowledge we didn’t have in the first choice changes the game for the second choice. If it didn’t, then it wouldn’t affect the probabilities of the car being behind each door. If we had that knowledge in the first choice we would state the game as “the car is behind door 1 or 3; door 2 hides a goat.” We wouldn’t then say door 1 has a 33% probability of having the car, would we?

  8. on 28 May 2010 at 12:26 pm corn walker

    Tim, isn’t the problem with the Tuesday boy question one of an undefined population and undefined probabilities? Again, I think English is tripping us up here in our lack of precision.

    I have two children, what is the probability that one of them is a boy? If the population is just my household, the probability is 1. If the population is children of my male siblings, the probability is 1. If the population is the entire world…

    Let’s assume for this problem we are equally likely to have a girl or a boy. The probability that my first child is a boy is .5 while the probability that my second child is a boy is also .5. Because these are independent events (i.e the sex of the first child does not affect the sex of the second child) the probability of having two boys is .25. This is represented by the following pairings:

    1 2
    B B
    B G
    G B
    G G

    If you know that one of my children is a boy, that eliminates the possibility of the fourth pairing. But, and here’s the crucial inference, is the question still referring to the entire population or, by saying I already have a boy, are we now restricting the population to only include those who have boys?

    If the population changes, the question is now “of those families that have two children, at least one of which is a boy, what is the probability that there are two boys?”

    Note that the original question is ambiguous as to the change in population under consideration. Likewise including Tuesday either changes the population or it doesn’t, but the question as originally posed doesn’t suggest one way or the other.
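    Both readings of the puzzle are easy to check by brute force. Here is a minimal Python sketch (my own, not from the thread), assuming uniform, independent sexes and birth weekdays and the restricted-population reading, i.e. “of all two-child families with at least one Tuesday-born boy, what fraction have two boys?”:

```python
from itertools import product

# One child = (sex, weekday); assume sexes and weekdays are uniform
# and independent, so all 14 x 14 = 196 families are equally likely.
children = list(product("BG", range(7)))
families = list(product(children, children))

TUESDAY = 2  # an arbitrary label for Tuesday

# Restricted-population reading: keep only families with at least
# one Tuesday-born boy, then count those with two boys.
qualifying = [f for f in families if ("B", TUESDAY) in f]
two_boys = [f for f in qualifying if f[0][0] == "B" and f[1][0] == "B"]

print(len(two_boys), "/", len(qualifying))  # 13 / 27
```

    Under that reading the answer is 13/27 rather than the 1/3 you get without the weekday clause, which is exactly why the problem hinges on how the population is defined.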

  9. on 28 May 2010 at 12:27 pm Sam Critchlow

    You are in good company arguing (strenuously) against the non-intuitive answer to the Monty Hall problem. Famously, PhDs in math, scientists, and other brainiacs wrote in to argue against the solution published in Parade magazine. The simplest way I can explain it is this:

    If you have initially chosen the car, switching will ALWAYS LOSE.
    If you have initially chosen a goat, switching will ALWAYS WIN.
    Because you will initially choose a goat 2/3 of the time, choosing to switch will make you win 2/3 of the time.

    If you have initially chosen the car, staying will ALWAYS WIN.
    If you have initially chosen the goat, staying will ALWAYS WIN.
    Because you will initially choose the car 1/3 of the time, choosing to stay will make you win 1/3 of the time.

    http://en.wikipedia.org/wiki/Monty_Hall_problem has some good trees and charts to explain this graphically in a way that should help you to find your error in reasoning.

  10. on 28 May 2010 at 12:27 pm Sam Critchlow

    EDIT:

    “If you have initially chosen the car, staying will ALWAYS WIN.
    If you have initially chosen the goat, staying will ALWAYS LOSE.
    Because you will initially choose the car 1/3 of the time, choosing to stay will make you win 1/3 of the time.”

  11. on 28 May 2010 at 12:28 pm Todd

    As for the two kids/boys problem, I agree that how we phrase probability is the crucial thing. An “and” instead of an “or” changes everything. We have to know that the family has two and only two children if we’re going to be clear, I think.

    Whenever I’m teaching probability, I’m always reminded of some comic I once heard talking about the skill/trick of catching a bullet in their teeth. He was trying to figure out how you learn how to do it: do they toss you a few and then put one in a gun while saying “this one’s coming a bit faster…?” As Riley mentioned in his original post, the absolute basics like a coin flip are intuitive to kids. Go just a little farther into the subject matter and the slope of abstraction takes off almost vertically. I think that’s why people (myself included) struggle with it so much.

    As for the Monty Hall, I love reading the ideas here. I actually had an interesting experience with this: I was once back home (in LA) where I was introduced to the writer of the movie “21.” (Never saw it, but he was a nice guy.) When he found out that I was a math teacher, he excitedly told me that he had just written the scene with the Monty Hall problem. When he explained his understanding…well, it wasn’t exactly rock solid. We talked about it for a while, and one thing that helped him was to focus on the number of ways a certain strategy (like switching) can win vs. the number of ways the other strategy can win. It was very cool.

  12. on 28 May 2010 at 12:28 pm corn walker

    I don’t doubt the math, I doubt there are three doors and that the initial probabilities are what they are stated to be.

    If Monty Hall is known to always take one of the goats out of contention, then we don’t have a 33% chance of winning, we always have a 50% chance of winning because it’s the second choice that counts.

    My contention is that language and a simplistic computation of the initial probabilities is causing us to think erroneously about the problem.

    Let’s say I initially choose door 1. If all doors are then revealed it is true that I had a 1/3 probability of having chosen the wrong door. But that is not what happens. Instead one of the doors is taken away and I’m asked to choose again. My contention is that the first choice is somewhat irrelevant to the problem.

    Let’s say instead of doors Monty has a fair three-sided coin with either one head and two tails or two heads and one tail – you don’t know. He flips the coin and asks you to call it. He then tells you he’s going to flip a fair two-sided coin. You can either stay with the previous call or you can change your call.

    What I am stating is that this is what’s happening. When Monty reveals one of the doors you didn’t choose, you’re invited to play a new game with better odds. Except you have no choice: because he changes the game, you’re already playing the game with better odds.

    I can come up with examples like this all day. You have a bag containing thirty m&m candies, ten red, ten blue, ten green. You’re asked to choose a color (let’s say red). Whatever color you choose, Monty is going to remove all of the m&m candies of one of the other colors (he chooses blue) and ask you to choose a color from the two colors remaining. Now you can stay with your original color (red) or you can switch colors (green). What is the probability you will pick your chosen color out of the bag?

    You are NOT choosing to stay with your original probability P of having won or, in switching choices, increasing your odds to 1-P. I contend you are asked to play a new game with a different set of probabilities.

    How is the second choice NOT a new game with different probabilities than the first?

  13. on 28 May 2010 at 2:02 pm Scott

    Corn walker: The second choice is not a new game because Monty makes his decision based upon yours. Sam’s and Riley’s explanations cover this.

    Monty doesn’t take away just *any* door. He takes away a door with a Goat. This is the crux of the problem. It is this action that links the two decisions.

    When I first encountered this problem, I didn’t believe it either. So I wrote a Java program to simulate it. You may want to try simulating it yourself, either by programming or by carrying out experiments in RL (have someone be Monty).
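    Scott’s Java program isn’t shown in the thread, but a minimal Python sketch of the same experiment (all names here are my own) takes advantage of one fact: with three doors, Monty always opens a goat door you didn’t pick, so switching wins exactly when your initial pick was wrong.

```python
import random

def play(switch):
    """One round of the 3-door game; True means the contestant wins the car."""
    car = random.randrange(3)
    pick = random.randrange(3)
    # Monty always opens a goat door the contestant didn't pick, so the
    # one remaining door hides the car exactly when the pick was wrong.
    return pick != car if switch else pick == car

trials = 100_000
switch_wins = sum(play(True) for _ in range(trials)) / trials
stay_wins = sum(play(False) for _ in range(trials)) / trials
print(f"switch: {switch_wins:.3f}, stay: {stay_wins:.3f}")  # ≈ 0.667 vs 0.333
```

    Run it a few times: always-switch hovers around 2/3, always-stay around 1/3.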

  14. on 28 May 2010 at 2:14 pm Frane

    I am with you Corn Walker.
    If the assumption is that Monty knows where the car is, and that he will always open a door containing a Goat, then our first choice is irrelevant.
    No matter what door we choose Monty will open another containing a goat. This is relevant because we can dismiss this first step since it will always end up in the same situation: two doors, one with a car, one with a goat and having to make a decision about which door to choose.

    “Is it to your advantage to switch your choice?” Faced with two doors and one option you have a 50% chance.

    If Monty did not know where the car is, then the first choice would matter, but since we assume he knows, the first choice is irrelevant.

  15. on 28 May 2010 at 2:52 pm Wilhelm

    When confronted with this problem (Monty Hall) in high school I did what most engineers would do: I simulated. It’s not that difficult to program the rules of the game show so you can play against your computer (or, in my case, my calculator). Then I let the calculator play against itself, always choosing at random initially: 1000 times staying, 1000 times switching. The results were clearly in favour of switching.

    I’m not sure about the pedagogical value of this, though. In my case, it helped me convince myself to leave my intuitive comprehension and take in the more mathematical. But it could also give the student a feeling that math “comes out of the magic box”.

  16. on 28 May 2010 at 4:13 pm Doug

    Check out this paper, which discusses the psychology behind the Monty Hall problem:

    http://fox-lab.org/papers/Fox&Levav%282004%29.pdf

    They conclude:

    “we suggest that introductory courses in probability might begin by acknowledging this intuitive predilection. For instance, such courses might give some attention to the questions of what constitutes an appropriate partition in light of the conditioning information and the experiment that generates it. Indeed, Nisbett, Krantz, Jepson, and Kunda (1983) observed that greater clarity of the sample space and sampling process facilitates the use of strategies based on formal statistical principles (see also Nisbett, Fong, Lehman, & Cheng, 1987). After students are proficient in this intuitive approach to conditional probability, they may be more receptive to a more flexible, computational approach. Indeed, Sedlmeier and Gigerenzer (2001) reported more success teaching Bayesian reasoning to students when problems were presented in simple frequency format rather than probability format. We surmise that simple frequency formats facilitate appropriate use of the partition–edit–count strategy.”

  17. on 28 May 2010 at 6:28 pm @thescamdog

    @corn walker

    “How about 100 doors, and Monty reveals doors 2 through 99 have goats. While the probability that I initially chose the correct door is 1%, after the 98 doors are revealed do I have a 99% chance of winning if I switch?”

    Actually, yes.

    “I’d be most pleased if someone can show me where I am errant in my logic here.”

    I’d love to. Let’s set up a simulation. You come over to my place with a number of $100 bills. We’ll take 100 envelopes. In 99 of them, you put scrap paper. In one of them, you put a $100 bill. I’ll pick an envelope at random. Then you eliminate 98 of them that have scrap paper in them leaving my original choice and one other. Give me the choice of sticking to my original choice or switching, and I’ll switch. I’d be happy to repeat this simulation with you as many times as you like.

    That’s why this is such a great experiment to try in class. Intuitively, kids (and some PhDs) think you have a 50% chance of winning if you switch. Set up a simulation, and once they see the experimental probability is much different, then you can use tree diagrams or other methods to get at the theoretical probability.
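    The envelope experiment above is easy to dry-run in code before anyone risks real $100 bills. A Python sketch (names are my own), assuming the host always eliminates 98 losing envelopes:

```python
import random

def envelope_game(switch, n=100):
    """One round with n envelopes: 1 bill, n-1 scraps, n-2 scraps revealed."""
    bill = random.randrange(n)
    pick = random.randrange(n)
    # After 98 scrap envelopes are eliminated, the single remaining
    # unchosen envelope holds the bill unless the original pick does.
    return pick != bill if switch else pick == bill

trials = 100_000
switch_rate = sum(envelope_game(True) for _ in range(trials)) / trials
stay_rate = sum(envelope_game(False) for _ in range(trials)) / trials
print(f"switch: {switch_rate:.3f}, stay: {stay_rate:.3f}")  # ≈ 0.990 vs 0.010
```

    With 100 envelopes the asymmetry is impossible to miss: switching wins about 99% of the time.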

  18. on 28 May 2010 at 6:39 pm John Scammell

    @Todd, I love the video.

  19. on 28 May 2010 at 7:15 pm Zeno

    The question is : should you switch?
    The answer is: you should switch.

    Why should you switch? Because if you switch you’ll win 2/3 of the time. If you don’t switch, you’ll win only 1/3 of the time.

    In other words, the probability of you winning if you switch is 2/3, whereas the probability of you winning if you don’t switch is only 1/3.

    It’s not true that you “always have a 50% chance of winning because it’s the second choice that counts”. Actually, the second choice is not a choice at all (if you’re trying to maximize your chance of winning) because you should always switch.

    The second choice is not a new game. Rather, it is a choice between playing by the original game rules or changing to new game rules that give you a better chance of winning.

    Because Monty opens all the other doors save one (which therefore must have the car if the door you picked doesn’t), switching has the effect of reversing the result of your original pick. If you’ve lost before switching, then you’ll win after switching, and vice versa. So switching is equivalent to playing the game with the original rule for winning reversed.

    The original game rule (if you don’t switch) is: you win the car if you picked the door with the car. The new game rule (if you switch) is: you win the car if you didn’t pick the door with the car. Your chance of having picked a door without the car (2/3) is better than your chance of having picked the door with the car (1/3). So the new game rule (door without car wins car) gives you a better chance of winning the car than the original game rule (door with car wins car). That’s why you should always switch.

  20. on 28 May 2010 at 10:57 pm Doug

    The explanation I like is that given the 3 doors, you’re going to get a goat most of the time. Therefore the other 2 doors are most likely to contain the car. And since Monty is forced to display one of those doors with the goat, that means the remaining door is more likely to have the car than your door. And so you should switch doors.

    Conversely, if Monty were trying to maximize your chances of getting a goat, he would never do what he is doing. Thinking of Monty being FORCED to give up this information may help in rationalizing this.

  21. on 29 May 2010 at 4:40 am corn walker

    Wait, I get it now. If you switch you will NOT win 2/3 of the time, you’ll win 1/3 of the time. If you don’t switch you will win 1/6 of the time.

    I was getting tripped up on the textual description of “winning,” that switching will cause you to win 2/3 of the time. In fact, the probability of winning is actually 1/2 as we intuitively expect it to be.

    Once I stopped, breathed, and more importantly started doing the calculations it became clear that there were two probabilities. Thanks all for helping me see this.

  22. on 29 May 2010 at 6:52 am Frane

    I see it too now… I had to try with cards. For me the way to understand it is that your first choice creates two sets one containing one door (1/3) and the other one contains two door with a probability of 2/3. Monty reveals one of the doors of the second set and that doesn’t change the probabilities of that second set, so it is still 1/3 against 2/3…

  23. on 29 May 2010 at 6:55 am Tim

    It seems like Dan’s post 2796, What Can You Do With This: ELA Edition http://blog.mrmeyer.com/?p=2796, foreshadows the problems discussed in these comments, especially the ones Devlin discussed in April, http://www.maa.org/devlin/devlin_04_10.html . The more the office knows, the more Michael is guilty, just as the more you know, the better chance there are boys in the family. To me, the key to these problems is that, although you know something about _a_ boy or _a_ door, you don’t know which boy or which door you have information about.

  24. on 29 May 2010 at 7:54 am Riley

    “Wait, I get it now. If you switch you will NOT win 2/3 of the time, you’ll win 1/3 of the time. If you don’t switch you will win 1/6 of the time.”

    I want to be clear that if you always switch, then on average you will get the car 2/3 of the time and a goat 1/3 of the time. The probability of winning is not 1/2 in any way. The reasons have all been stated here as clearly as I know how, and the facts of these answers are supported by simulation. This can be the end of that conversation – we’ve covered all the arguments and counter-arguments I can think of!

    Another interesting point which is raised in the comments at my blog (link at the top of Dan’s post), which you are welcome to spam with probability arguments all you like, is about the fairness of the comparison of the 3-door game with the 50-door game. Why do we assume the host will open 48 doors instead of 1 door, leaving 48 from which to choose? It’s a great question that did not come up in the class period I wrote about – but I wish it did!

  25. on 29 May 2010 at 8:36 am Thomas

    Even though it sounds like many people have finally “gotten it” regarding the three doors, I wanted to simplify the explanation a bit more for anyone who is still confused.

    The reason the odds don’t change to 50/50 after one of the doors is eliminated is because the car and the remaining goat DON’T get reshuffled after your first pick; everything remains in the same place. This small but important bit of information is what trips most people up. They want to insist that they have even odds because they (perhaps subconsciously) see it as a BRAND NEW game with everything scrambled again. But it’s not.

    That’s why the knowledge that you were only 1/3 likely to make the correct choice the first time is useful; it tells you that the car is 2/3 likely to be elsewhere. The car therefore REMAINS likely to be elsewhere because nothing gets shuffled between your first choice and your second choice.

    In other words, think of the game as two GROUPS: the door you choose, and the doors you don’t choose. The GROUP you choose is only 1/3 likely to have the car. The GROUP you don’t choose is 2/3 likely to have the car. No matter what kind of adjustments you make within the other GROUP (the one you didn’t choose), it remains a closed set and is therefore still 2/3 likely to have the car in it when you’re finished. Therefore, you should always choose to switch to that GROUP. It’s just that now, that GROUP has been reduced to one door, too.

    Hope that helps clear things up!

  26. on 29 May 2010 at 9:18 am Zeno

    Suppose Monty didn’t give you the option of switching. You choose a door and you get what you get, end of story. In that case, it seems clear that you’d win the car 1/3 of the time.

    Now suppose Monty gives you the option of switching, but you adopt a strategy of NEVER switching. In that case, Monty could have saved his breath. The outcome will always be the same as if he’d never asked. So the results are the same as the previous case. You’d win the car 1/3 of the time.

    Now suppose Monty gives you the option of switching, and you adopt a strategy of ALWAYS switching. In that case, the outcome will always be the opposite of what it would be if you never switched. So the results are the opposite of the previous case. You’d lose the car 1/3 of the time and you’d win the car 2/3 of the time.

    So if you always switch you’d win the car 2/3 of the time, while if you never switch you’d win the car only 1/3 of the time. That’s why you should always switch.

    What if you were to decide whether to switch or not by tossing a coin? In that case, the chance you’d switch is 1/2, so half of the wins would become losses, and half of the losses would become wins. As a result you’d win the car 1/2 of the time.

    Perhaps this last case indicates the source of the confusion. If you choose whether to switch or not at random, your odds of winning the car are 1/2. So it might seem that whether you choose to switch or not doesn’t matter. But you don’t have to choose whether to switch or not at random, and as we have seen, you can improve your odds of winning the car by always choosing to switch.

    Why does this seem counter-intuitive? Perhaps it is due to learning to think about probability through coin tossing.

    If you toss a coin, and choose whether to call heads or tails at random, you’ll win 1/2 of the time. If instead you decide to always call heads, you’ll also win 1/2 of the time.

    One might reason by analogy in the Monty Hall game that since the odds of winning the car if you choose whether to switch at random is 1/2 (correct), then the odds of winning the car if you always switch must also be 1/2 (incorrect).

    Why does the analogy fail? Well, why does always calling heads in the coin toss win 1/2 the time? It’s not just because heads is one of two possibilities (heads or tails), but also depends on the fact that the probability of the coin landing heads up is equal to the probability of its landing tails up. In other words, it’s assumed that the coin is fair. But suppose the coin were weighted so that it lands heads up 2/3 of the time. Calling heads or tails at random would still win 1/2 of the time. But always calling heads would win 2/3 of the time.

    In the Monty Hall game, the probability that the car is in the door you chose (1/3) is not the same as the probability that the car is in the other door (2/3). That’s why you can improve your chance of winning if you always choose the other door.

    But (assuming the game is fair) all the doors should have an equal probability of having the car, right? So why isn’t the probability that the car is in the door you chose the same as the probability that the car is in the other door? The answer is that Monty doesn’t choose the other door at random. He never opens the door with the car. So the other door (the one he doesn’t open) will always have the car, except when the car is in the door you chose. Thus 1/3 of the time the car is in the door you chose, and the rest (2/3) of the time the car is in the other door. That’s why you should always switch.
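    Zeno’s three cases (never switch, always switch, flip a coin) can be put side by side in a short simulation. A Python sketch, not from the thread; `play_strategy` is my own name:

```python
import random

def play_strategy(strategy):
    """strategy: 'switch', 'stay', or 'coin' (a fair coin flip decides)."""
    car, pick = random.randrange(3), random.randrange(3)
    if strategy == "coin":
        do_switch = random.random() < 0.5
    else:
        do_switch = strategy == "switch"
    # Switching wins exactly when the initial pick missed the car.
    return pick != car if do_switch else pick == car

trials = 100_000
rates = {s: sum(play_strategy(s) for _ in range(trials)) / trials
         for s in ("switch", "stay", "coin")}
print(rates)  # ≈ {'switch': 0.667, 'stay': 0.333, 'coin': 0.500}
```

    The coin-flip strategy lands on 1/2 because it averages the other two, which is exactly Zeno’s candidate source of the confusion.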

  27. on 29 May 2010 at 9:21 am Doug

    Let’s assume that a friend of the contestant walks in after the first round without any info. To him the odds are 50:50. To the contestant they are 1/3:2/3. It’s confusing how the same thing can have different odds.

  28. on 29 May 2010 at 9:55 am Thomas

    corn walker wrote: “Wait, I get it now. If you switch you will NOT win 2/3 of the time, you’ll win 1/3 of the time. If you don’t switch you will win 1/6 of the time.

    I was getting tripped up on the textual description of “winning,” that switching will cause you to win 2/3 of the time. In fact, the probability of winning is actually 1/2 as we intuitively expect it to be.”

    No. This is incorrect.

    Here’s another way of understanding this. We’re going to play two variations of the same game, only with cards. I have three cards: two 2’s and one Ace. You’re trying to find the Ace.

    In the first variation, I lay them face down in front of you and tell you to point to one but don’t look at it. You point at one of the cards. I then discard one of the other two, showing you that it’s a 2. That means that there’s a 2 and an Ace still on the table. Now, I PICK UP the two cards, shuffle them where you can’t see, then lay them back down and tell you to pick one. Obviously, you have a 50/50 chance of getting the Ace.

    Now for the second variation. I lay the three cards face down in front of you and tell you to pull one to yourself and KEEP it there, face down. I then pick up the other two cards, discard one — showing you that it’s a 2 — and lay the remaining one face down and ask you if you want to switch. In this variation, there is only 1 chance in 3 that the card in front of you was the Ace when you chose it. It’s stayed in front of you, and I haven’t done anything to it. Therefore, it’s STILL only 1/3 likely to be the Ace. No matter what I’ve done to the other SET of cards, the Ace is STILL more likely to be on my side of the table. Thus, if I am forced to get rid of a 2, whatever card I have left is 2/3 likely to be the Ace.

    Does this help clear things up?

  29. on 29 May 2010 at 1:36 pm bluefox420

    I tried to explain it to myself this way:

    Here’s the setup: G1 G2 C (Goat1 Goat2 Car)

    Scenario A: Always switch
    I chose G1, he showed G2, I switched to C -> I won
    I chose G2, he showed G1, I switched to C -> I won
    I chose C, he showed G1, I switched to G2 -> I lost

    Wins 2, Losses 1

    Scenario B: Never switch
    I chose G1, he showed G2, I didn’t switch -> I lost
    I chose G2, he showed G1, I didn’t switch -> I lost
    I chose C, he showed G1, I didn’t switch -> I won

    Wins 1, Losses 2
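    That three-row tally can be written directly as code. This sketch (Python; names are mine) enumerates the same three equally likely first picks:

    ```python
    SETUP = ["G1", "G2", "C"]  # Goat1, Goat2, Car behind the three doors

    def wins(switch):
        """Count winning first picks out of the three equally likely ones."""
        count = 0
        for pick in SETUP:
            if pick == "C":
                # Host shows either goat; switching abandons the car.
                count += 0 if switch else 1
            else:
                # Host must show the other goat; switching lands on the car.
                count += 1 if switch else 0
        return count

    # wins(True) == 2 and wins(False) == 1, the same tallies as above.
    ```

    Because the three first picks are equally likely, always switching wins 2/3 of the time and never switching wins 1/3.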

  30. on 30 May 2010 at 8:14 amCarl Malartre

    Hi Dan and Riley, the NetLogo tool helped me visualize that nice problem some years ago:

    NetLogo:
    http://ccl.northwestern.edu/netlogo/
    The Monty Hall simulation is included in the program.

    Take a lot of turtles and make them advance on a football field when they win a game. Configure some of them to always switch door, some of them to always keep the door and some of them to randomly switch.

    Now make them play the game until they reach the end of the football field. That should show clearly that switching is a great strategy. I’m not sure how useful it would be pedagogically (I’m a programmer).
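    For anyone without NetLogo handy, here is a rough text-only Python sketch of Carl’s turtle race (the names, the 50-step field length, and the three-turtle setup are my assumptions): each “turtle” plays Monty Hall with a fixed strategy and advances one step per win.

    ```python
    import random

    FIELD = 50  # length of the football field, in steps (my assumption)

    def monty(switch):
        """Play one Monty Hall game; switch may be True, False, or None (coin flip)."""
        car, pick = random.randrange(3), random.randrange(3)
        opened = random.choice([d for d in range(3) if d not in (pick, car)])
        if switch is None:              # the "randomly switch" turtle
            switch = random.random() < 0.5
        if switch:
            pick = next(d for d in range(3) if d not in (pick, opened))
        return pick == car

    def race():
        """Advance three turtles one step per win; return the first to finish."""
        positions = {"always switch": 0, "never switch": 0, "random": 0}
        strategy = {"always switch": True, "never switch": False, "random": None}
        while max(positions.values()) < FIELD:
            for name in positions:
                positions[name] += monty(strategy[name])
        return max(positions, key=positions.get)
    ```

    The always-switch turtle advances at roughly twice the rate of the never-switch turtle (2/3 vs. 1/3 per round), so it wins the overwhelming majority of races.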

    I’m also developing a probability website with simulators (it’s free), but it’s in French, it’s in beta, and it needs the latest Flash Player. It does include a Monty Hall simulator:
    http://www.netmaths.net/FeteForaine/

    Have fun and thx for the great post
    Carl from BuzzMath.com

  31. on 31 May 2010 at 10:56 amElsie Dabney

    Geez… probability has been one of my least-liked topics in statistics, but if you were my teacher, I think I’d change my mind ;)

  32. on 31 May 2010 at 5:28 pmcorn walker

    Curses, in the excitement of figuring it out I’ve gone and explained myself poorly.

    If you choose at random the second time, you have a .5 probability of winning; however, this probability is not distributed evenly. 1/6 of the time you would win by sticking with your original choice, 1/3 of the time you would win by switching, 1/3 of the time you would lose by sticking with your original choice, and 1/6 of the time you would lose by switching.

    Because this probability is not evenly distributed, you should choose to switch. Your probability of winning by staying then is 1/3 and your probability of winning by switching is 2/3.

    Now, the explanations I’ve seen didn’t do it for me. What did it for me was to think of it as sets of door combinations. Assuming you’ve picked door 1, there are three equally likely possibilities for the doors: {W, L, L}, {L, W, L}, and {L, L, W}. Monty is always going to show one of the losing doors that you didn’t pick, reducing our sets to: {W, L}, {L, W}, and {L, W}. There is one way of winning by sticking with door 1, and two ways of winning by switching to the remaining door.

    I’ve read this over three times now and think it makes sense. Please tell me I haven’t botched it again. :)
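    corn walker’s 1/6–1/3–1/3–1/6 breakdown can be verified exactly with Python’s fractions module (the variable names are mine):

    ```python
    from fractions import Fraction as F

    p_first_is_car = F(1, 3)      # original pick hides the car
    p_other_is_car = F(2, 3)      # the other unopened door hides the car
    p_stick = p_switch = F(1, 2)  # second choice made by coin flip

    win_by_sticking   = p_stick  * p_first_is_car  # 1/6
    win_by_switching  = p_switch * p_other_is_car  # 1/3
    lose_by_sticking  = p_stick  * p_other_is_car  # 1/3
    lose_by_switching = p_switch * p_first_is_car  # 1/6

    total_win = win_by_sticking + win_by_switching  # 1/2 overall
    ```

    So the coin-flipper does win half the time overall, but two thirds of those wins come from the switch branch, which is exactly why committing to the switch lifts you from 1/2 to 2/3.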

  33. on 31 May 2010 at 10:59 pmthe prom queen wannabe

    I have to agree with Elsie… *giggle* Though I wasn’t the kid that sucked at math, probability was a bit different from the part of math that I love. Exams in probability just had too much stuff to read and memorize… that’s the part that I hate a lot.

  34. on 01 Jun 2010 at 12:52 pmblaw003

    Fun probability problems, and jeez, there are so many more. Doug speaks to the counter-intuitive nature of so many great questions.

    I am ultimately curious why my first interpretation of Dan Meyer’s comment, “And then he kills the probability lesson anyway,” was the opposite of his intent. It seems to me the measure of good teaching strategy is clarity of explanation combined with slick technology use.

    Am I off? I would have thought good teaching might be measured with something like quantifying (or characterizing in some form) student learning. Engaging kids is clearly a huge part of the battle. But creating mathematical ways of knowing & understanding for learners is the pedagogical hurdle.

    Else, it is merely another, slicker, snake oil sale.

    No hate intended to all commenters. I am a fan, and truly do dig the multimedia use and the impressive ideas for engaging kids.

  35. on 01 Jun 2010 at 1:31 pmDan Meyer

    For the record, I meant “kills” in the most flattering sense of the word. As in, “he nailed it.”

    To your question, I suppose I’m a little unclear what technology Riley has used in his lesson. Plastic bowls? Candy? Whiteboard markers?

    My appreciation of the lesson is rooted in one particular moment, where Riley takes a student rebuttal and runs with it, drawing fifty marks on the board and asking a pointed question that helps the student strengthen his own conceptual framework for the Monty Hall problem.

    That took courage, content knowledge, and empathy; not much technology. But that’s just my interpretation of the whole thing.

  36. on 03 Jun 2010 at 3:58 amrichcatt

    The comments and clarifications and questions above are why I introduce the ideas of probability to my year 10 students by getting them to role play a simulation of the Monty Hall problem, and then to go out in the playground in groups of 4 or 5 and repeat the simulation plenty of times. They get interested.
    What I’d love are problems that get them interested in the ideas of algebra (because, despite the great ideas on Dan’s blog, we maths teachers are constrained to impart a set body of knowledge at present, and I’m not really free to just have fun with problems until my year 13 Calculus class).
    Keep up the great work all of you.
    Greetings from Middle Earth (where the set is still up!).