
Pool table math is a common feature of a lot of geometry textbooks. Billiards hit a cushion and leave it at about the same angle. We have a real-world application! But as we’ll see in this week’s WWIB installment, not all treatments of that application are equal. In fact, commenters found them all wanting in various ways. I invite you to click through to this week’s three contestants:

  1. Discovering Geometry
  2. CME Project
  3. College Preparatory Mathematics

What You Said

In the preview post, commenters called out the following turn-offs in different versions.

  • “It jumps to the math notation too quickly.”
  • “There is a ton of language in these problems.”
  • “Two of the books just state that the angle of incidence and angle of reflection are the same and the other just expects students to know that.”
  • “I feel like if I sat down and solved the problem that follows their explanation, I’d be copying their steps rather than really thinking it out for myself in a way that would make sense of it.”

On Twitter, Rose Roberts urges us to be careful here as, “Problems involving pool and mini-golf were the reason I decided I hated geometry in 8th grade. The sole reason.”

I’ll try to summarize the critiques using language that’s common to this blog without putting too many words in my commenters’ mouths. These textbook treatments rush to a formal level of abstraction too quickly. They don’t do a sufficient job developing the question for which “angle of incidence = angle of reflection” is the answer, or helping students develop an intuition about that answer.

In Discovering Geometry, for example, the formal equivalence statement is given and then the text asks students to apply it with their protractor.


A number of my commenters offer variations on, “Just take ’em to the pool hall!” This idea sounds great and will scan to many as suitably progressive, inquiry-based, student-centered, etc. But I’m unsatisfied. Mr. Bishop took us to the pool hall when I was a high school student and let us watch a local pro knock down a rack. I think he let us shoot a bit ourselves. I remember enjoying myself. I don’t remember learning more math than I did in his classroom lesson.

Pro pool players don’t use protractors.

For one, they’ve internalized that mathematics through practice. For another, the player can’t measure the angle of the ball in real time. The ball moves too quickly, and the pool player’s eye-level view of the pool table is unlike the bird’s-eye view that would allow her to measure that angle.

This is a problem.

What I Need

Here is the resource I need. I’d like students to experience mathematical analysis as power, rather than punishment.

So let’s start with a tool that comes easily to students: their intuition. Let’s invite them to use their intuition in the context of a pool table. And let’s establish the context so that their intuition fails them, or at most earns a C-.

Then, let’s help students learn how to analyze the path of the pool ball mathematically. We’ll repeat the previous exercise and point at the end to the superior results that accrue when students analyze the pool table mathematically instead of intuitively. (If superior results don’t accrue, we should either re-design the context to better highlight math’s power on a pool table or admit to ourselves we were wrong about math’s power.)
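For concreteness, here is a minimal sketch of that bird’s-eye analysis. This is my own illustration, not a method from any of the three textbooks: to bank a ball off a cushion, reflect the target across the cushion line and aim straight at the mirror image. Equal angles of incidence and reflection fall out automatically.

```python
# Sketch (mine, not from the textbooks): bank a ball off the cushion
# along y = 0 by aiming at the target's mirror image below the cushion.

def bank_aim_point(ball, target):
    """Return the point (x, 0) on the cushion y = 0 where the ball
    should strike so that angle of incidence = angle of reflection."""
    bx, by = ball
    tx, ty = target
    # Mirror the target across the cushion to (tx, -ty). The straight
    # line from the ball to the mirror image crosses y = 0 at the aim
    # point, and the reflected path continues to the real target.
    x = bx + (tx - bx) * by / (by + ty)
    return (x, 0.0)

print(bank_aim_point(ball=(1.0, 2.0), target=(5.0, 2.0)))
# (3.0, 0.0) - a symmetric shot banks at the midpoint, as intuition says
```

The mirror trick is exactly the “angle of incidence = angle of reflection” fact restated as geometry a student can compute with, rather than a formula to memorize.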


John Golden gets us close to that resource, inviting teachers to pull out still frames from this video of billiard shots for student analysis. But that analysis is far more complex than anything in the textbooks we’re critiquing today. Billiards ricochet off other billiards in that video.

The resource I need doesn’t seem to exist yet, so I’ll try to build it. I’ll start with this game. Stay tuned.

Larry Cuban has spent the last year observing and documenting the practices of schools that are known for successful technology implementation.

Here are eight different yet interacting moving parts that I believe have to go into any reform aimed at creating a high-achieving school using technology to prepare children and youth to enter a career or complete college (or both).

Notably, none of them are explicitly about technology.

I have a recurring happy dream that I’m on Jeopardy. It’s the final round. The Trebekbot 2000 reads the final clue:

“These are the dimensions of the rectangle that has the largest area given a fixed perimeter.”

“WHAT IS A SQUARE!” I yell out while my competitors are still thinking quietly. I have disqualified myself and ruined the round, but I don’t care. I start high-kicking around the set while security tries to wrangle me away and I still don’t care because I finally found some use for this fact that takes up a significant chunk of my brain’s random access memory.
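The fact itself is a one-line optimization, and a brute-force check makes it concrete. This is a hypothetical sketch of my own, not drawn from any of the three lessons: a rectangle of width w and perimeter P has area w * (P/2 - w), which peaks when w = P/4, a square.

```python
# Hypothetical check of the Jeopardy fact: for fixed perimeter P,
# a rectangle of width w has length P/2 - w and area w * (P/2 - w).

def area(width, perimeter):
    return width * (perimeter / 2 - width)

P = 40.0
widths = [w / 10 for w in range(1, 200)]  # sample widths 0.1 .. 19.9
best = max(widths, key=lambda w: area(w, P))
print(best)  # 10.0, which is P/4: the winning rectangle is a square
```

The same conclusion falls out of the vertex of the quadratic, which is where the textbook versions of the task are all headed.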

It’s a question you’ll find in every quadratics unit, every textbook, everywhere. I could have selected this week’s Who Wore It Best contestants from any print textbook, but instead I’d like to compare digital curricula. I have included links and attachments below to versions of the same task from GeoGebra, Desmos, and Texas Instruments, three thoughtful companies all doing interesting work in math edtech. (Disclosure: I work for Desmos, but don’t let that fact sweeten your remarks about the Desmos version or sour your remarks about the others. Just be thoughtful.)

So: who wore it best?

Click each image for the full version.

Version #1 – GeoGebra


Version #2 – Desmos


Version #3 – Texas Instruments


Steve Phelps suspects I stacked the deck in favor of Desmos here, taking full advantage of our platform while taking only partial advantage of GeoGebra and the Nspire. John Golden concurs, hypothesizing that “there would be a worksheet to go with the GeoGebra sketch.”

So a note on sampling: the GeoGebra example is the most viewed lesson on the subject I could find at their Materials site. The Texas Instruments lesson is the only lesson on the subject I could find at their Activities site. I told Steve, and I’ll tell you, that if anybody can come up with a better lesson on either platform, I’ll be happy to feature it. This isn’t much fun for me (or useful to Desmos) if I stack the deck.

Both Lisa Bejarano and John Golden call out the Desmos lesson as “too helpful” – they know how to make it sting – in the transition from screen 5 (“Collecting data!”) to screen 6 (“Here! We’ll represent the data as a graph for you.”).


I’ll grant that it seems abrupt. I don’t think this kind of help is necessarily counterproductive, but it doesn’t seem as though we’ve developed the question well enough that the answer – “graph the data!” – is sensible. The Texas Instruments version has a solution to that problem I’ll attend to in a moment.

My concern with the GeoGebra applet is that the person who made the applet has done the most interesting mathematical thinking. I love creating GeoGebra applets. I generally don’t have a good story for what students do with those applets, though. In this example, I suspect the student will drag the slider backwards and forwards, watching for when the numbers go from small to big and then small again, and then notice that the rectangle at that point is a square. The person who made the applet did much more interesting work.

Let me close with one item I prefer about the Desmos treatment and one item I prefer about the Texas Instruments treatment.

First, my understanding of Lisa Kasmer’s research into estimation and Paul Silvia’s research into interest led me to create this screen where I ask students “Which of these three fields has the biggest perimeter?” knowing full well they all have the same perimeter:


Still later, I ask students to estimate a rectangle they think will have the greatest area. That kind of informal cognitive work is largely absent from the TI version, which starts much more formally by comparison.


TI does have a technological advantage when they allow students to sample lots of rectangles and quickly capture data about those rectangles in a table.


Desmos is working on its own solution there, but for now, we punt and include prefabricated data, which I think both companies would agree is less interesting, less useful, and more abrupt, as I mentioned above.

That’s my analysis of these three computer-based approaches to the same problem. What’s your analysis? And it’s also worth asking, “Would a non-computer-based approach be even better?” Is the technology just getting in the way of student learning?

You can also pitch your thoughts in on next week’s installment: Pool Table Math.

2016 Jul 8. Steve Phelps has created a different GeoGebra applet, as has Scott Farrar.

2016 Jul 9. Harry O’Malley uploads another GeoGebra interpretation, one that strikes a very interesting balance between print and digital media.

Every week this summer I’m posting three versions of the same real-world task. Please tell me: who wore it best?

  • In what ways are they different?
  • What do their differences say about their authors’ beliefs about students, learning, and math?
  • Would you make changes? Which and why?

Every secondary teacher and secondary textbook author knows that parabolas are #realworld because they describe the path of projectiles subject to gravity. Forgive me. “Projectiles” are not #realworld. “Baseballs” are #realworld.

But let’s not relax simply because we’ve drawn a line between the math inside the classroom and the student’s world outside the classroom. Three different textbooks will treat that application three different ways.

Click each image for a larger version.

Version #1


Version #2


Version #3


Chris Hunter claims, “The similarities here overwhelm any differences.” That’s probably true. So let’s talk about some of those similarities and what we can do about them.

My Least Favorite Phrase in Any Math Textbook

They each include the phrase “is modeled by,” which is perhaps my least favorite phrase in any math textbook. Whenever you see that phrase, you know it is preceded by some kind of real-world phenomenon and followed by some kind of algebraic representation of that phenomenon, a representation that’s often incomprehensible and likely a lie. E.g., the quartic equation that models snowboarding participation. No.


Chris Hunter notes that the equations “come from nowhere” and seem like “magic.” True.

@dmcimato and John Rowe point out that what normal people wonder about baseball and what these curriculum authors wonder about baseball are not at all the same thing.


That isn’t necessarily a problem. Maybe we think we should ask the authors’ questions anyway. As John Mason wrote in a comment on this very blog on the day that I now refer to around the house as John Mason Wrote a Comment on My Blog Day:

Schools as institutions are responsible for bringing students into contact with ideas, ways of thinking, perceiving etc. that they might not encounter if left to their own devices.

But these questions are really strange and feel exploitative. If we’re going to use, rather than exploit, baseball as a context for parabolic motion, let’s ask a question like: “Will the ball go over the fence?”

And let’s acknowledge that during the game no baseball player will perform any of those calculations. This is not job-world math. So the pitch I’d like to make to students (heh) is that, yes, your intuition will serve you pretty well when it comes to answering both of those questions above, but calculations will serve you even better.

Ethan Weker suggests using a video, or some other visual. I think this is wise, not because “kids like YouTubes,” but because it’s easier to access our intuition when we see a ball sailing through the air than when we see an equation describing the same motion.

Here’s what I mean. Guess which of these baseballs clears the fence:


Now guess which of these baseballs clears the fence:


They’re different representations of the same baseballs – equations and visuals – but your intuition is more accessible with the visuals.

We can ask students to solve by graphing or, if we’d like them to use the equations, we can crop out the fence. If we’d like students to work with time instead of position, we can add an outfielder and ask, “Will the outfielder catch the ball before it hits the ground?”
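If we do want students to check a graph-based answer against the equations, the calculation is small. Here is a hypothetical sketch; the coefficients are invented for illustration and aren’t taken from any of the three textbooks: model the ball’s height (in feet) as a quadratic in horizontal distance, then compare its height at the fence to the fence’s height.

```python
# Hypothetical "will it clear the fence?" check. The quadratic
# coefficients below are made up for illustration only.

def height(x, a=-0.003, b=1.2, c=3.0):
    """Ball height (ft) at horizontal distance x (ft) from home plate."""
    return a * x * x + b * x + c

def clears_fence(fence_x, fence_height, **coeffs):
    """True if the ball is still above the fence when it reaches it."""
    return height(fence_x, **coeffs) > fence_height

print(clears_fence(fence_x=350, fence_height=10))  # True for this ball
```

That’s the whole calculation the textbook versions dress up in several parts, which is part of why cropping out the fence (or adding the outfielder) changes what students actually have to do with the equation.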


This has turned into more of a Makeover Monday than a Who Wore It Best Wednesday and I shall try in the future to select examples of problems that differ in more significant ways than these. Regardless, I love how our existing curricula offer us so many interesting insights into mathematics, learning, and curriculum design.

Featured Comment

Karim Ani:

I’ll throw ours into the ring: In which MLB park is it hardest to hit a home run?

The Hechinger Report asks, “Is it better to teach pure math instead of applied math?”:

In the report, “Equations and Inequalities: Making Mathematics Accessible to All,” published on June 20, 2016, researchers looked at math instruction in 64 countries and regions around the world, and found that the difference between the math scores of 15-year-old students who were the most exposed to pure math tasks and those who were least exposed was the equivalent of almost two years of education.

The people you’d imagine would crow about these findings are, indeed, crowing about them. If I were the sort of person inclined to ignore differences between correlation and causation, I might take from this study that “applied math is bad for children.” A less partisan reading would notice that OECD didn’t attempt to control the pure math group for exposure to applied math. We’d expect students who have had exposure to both to have a better shot at transferring their skills to new problems on PISA. Students who have only learned skills in one concrete context often don’t recognize when new concrete contexts ask for those exact same skills.

If you wanted to conclude that “applied math is bad for children” you’d need a study where participants were assigned to groups where they only received those kinds of instruction. That isn’t the study we have.

The OECD’s own interpretations are much more modest and will surprise very few onlookers:

  • “This suggests that simply including some references to the real-world in mathematics instruction does not automatically transform a routine task into a good problem” (p. 14).
  • “Grounding mathematics using concrete contexts can thus potentially limit its applicability to similar situations in which just the surface details are changed, particularly for low-performers” (p. 58).

BTW. I was asked about the report on Twitter, probably because I’m seen as someone who is super enthusiastic about applied math. I am that, but I’m also super enthusiastic about pure math, and I responded that I don’t tend to find categories like “pure” and “applied” math all that helpful. I try to wonder instead, what kind of cognitive and social work are students doing in those contexts?

BTW. A while back I wrote that, “At a time when everybody seems to have an opinion or a comment [about mathematics education], it’s really hard for me to locate NCTM’s opinion or comment.” So credit where it’s due: it was nice to see NCTM Past President Diane Briars pop up in the article for an extended response.

Featured Comment:

Chris Shore:

What is often overlooked in these kinds of studies is the students who are enrolled in the various courses. The correlation between pure math courses and higher-level math exists because higher-achieving students are placed in the pure math classes, while lower-performing students are placed in applied math.

Same thing is true for studies that claim that students who take calculus are the most likely to succeed in college. No duh! That is because those who are most likely to succeed in college take calculus.

The course work does not cause the discrepancy, the discrepancy determines the course work.
