- In what ways are they different?
- What do their differences say about their authors’ beliefs about students, learning, and math?
- Would you make changes? Which and why?

Every secondary teacher and secondary textbook author knows that parabolas are #realworld because they describe the path of projectiles subject to gravity. Forgive me. “Projectiles” are not #realworld. “Baseballs” are #realworld.

But let’s not relax simply because we’ve drawn a line between the math inside the classroom and the student’s world outside the classroom. Three different textbooks will treat that application three different ways.

Click each image for a larger version.

**Version #1**

**Version #2**

**Version #3**

Chris Hunter claims, “The similarities here overwhelm any differences.” That’s probably true. So let’s talk about some of those similarities and what we can do about them.

**My Least Favorite Phrase in Any Math Textbook**

They each include the phrase “is modeled by,” which is perhaps my least favorite phrase in any math textbook. Whenever you see that phrase, you know it is *preceded* by some kind of real-world phenomenon and *followed* by some kind of algebraic representation of that phenomenon, a representation that’s often incomprehensible and likely a lie. E.g., the quartic equation that models snowboarding participation. No.

Chris Hunter notes that the equations “come from nowhere” and seem like “magic.” True.

@dmcimato and John Rowe point out that what *normal people* wonder about baseball and what *these curriculum authors* wonder about baseball are not at all the same thing.

That isn’t *necessarily* a problem. Maybe we think we should ask the authors’ questions anyway. As John Mason wrote in a comment on this very blog on the day that I now refer to around the house as John Mason Wrote a Comment on My Blog Day:

Schools as institutions are responsible for bringing students into contact with ideas, ways of thinking, perceiving etc. that they might not encounter if left to their own devices.

But these questions are *really* strange and feel exploitative. If we’re going to *use*, rather than *exploit*, baseball as a context for parabolic motion, let’s ask a question like: “Will the ball go over the fence?”

And let’s acknowledge that *during the game* no baseball player will perform *any* of those calculations. This is not *job-world math*. So the pitch I’d like to make to students (heh) is that, yes, your intuition will serve you pretty well when it comes to answering both of those questions above, but *calculations* will serve you even better.

Ethan Weker suggests using a video, or some other visual. I think this is wise, not because “kids like YouTubes,” but because it’s easier to access our intuition when we see a ball sailing through the air than when we see an equation describing the same motion.

Here’s what I mean. Guess which of these baseballs clears the fence:

Now guess which of *these* baseballs clears the fence:

They’re different representations of the *same* baseballs – equations and visuals – but your intuition is more accessible with the visuals.

We can ask students to solve by graphing or, if we’d like them to use the equations, we can crop out the fence. If we’d like students to work with time instead of position, we can add an outfielder and ask, “Will the outfielder catch the ball before it hits the ground?”

This has turned into more of a Makeover Monday than a Who Wore It Best Wednesday and I shall try in the future to select examples of problems that differ in more significant ways than these. Regardless, I love how our existing curricula offer us so many interesting insights into mathematics, learning, and curriculum design.

**Featured Comment**

I’ll throw ours into the ring: In which MLB park is it hardest to hit a home run?

In the report, “Equations and Inequalities: Making Mathematics Accessible to All,” published on June 20, 2016, researchers looked at math instruction in 64 countries and regions around the world, and found that the difference between the math scores of 15-year-old students who were the most exposed to pure math tasks and those who were least exposed was the equivalent of almost two years of education.

The people you’d imagine would crow about these findings are, indeed, crowing about them. If I were the sort of person inclined to ignore differences between correlation and causation, I might take from this study that “applied math is bad for children.” A less partisan reading would notice that OECD didn’t attempt to control the pure math group for *exposure to applied math*. We’d expect students who have had exposure to *both* to have a better shot at transferring their skills to new problems on PISA. Students who have only learned skills in one concrete context often don’t recognize when new concrete contexts ask for those exact same skills.

If you wanted to conclude that “applied math is worse for children than pure math” you’d need a study where participants were assigned to groups where they *only* received those kinds of instruction. That isn’t the study we have.

The OECD’s own interpretations are much more modest and will surprise very few onlookers:

- “This suggests that simply including some references to the real-world in mathematics instruction does not automatically transform a routine task into a good problem” (p. 14).
- “Grounding mathematics using concrete contexts can thus potentially limit its applicability to similar situations in which just the surface details are changed, particularly for low-performers” (p. 58).

**BTW**. I was asked about the report on Twitter, probably because I’m seen as someone who is super enthusiastic about applied math. I *am* that, but I’m also super enthusiastic about *pure* math, and I responded that I don’t tend to find categories like “pure” and “applied” math all that helpful. I try to wonder instead, what kind of cognitive and social work are students *doing* in those contexts?

**BTW**. A while back I wrote that, “At a time when everybody seems to have an opinion or a comment [about mathematics education], it’s really hard for me to locate NCTM’s opinion or comment.” So credit where it’s due: it was nice to see NCTM Past President Diane Briars pop up in the article for an extended response.

**Featured Comment**:

What is often overlooked in these kinds of studies is the students who are enrolled in the various courses. The correlation between pure math courses and higher-level math exists because higher-achieving students are placed in the pure math classes, while lower-performing students are placed in applied math.

Same thing is true for studies that claim that students who take calculus are the most likely to succeed in college. No duh! That is because those who are most likely to succeed in college take calculus.

The course work does not cause the discrepancy, the discrepancy determines the course work.

In spite of following 150 creative math teachers on Twitter and subscribing to 750 creative math teacher blogs (including one blog that’s dedicated *exclusively* to creative math), I’m only *now* learning about Gordon Hamilton’s Unsolved K-12 Project. It’s creative. It’s math. It’s almost *three years old*!

Better late than never.

See, Hamilton convened a bunch of creative math types in Banff in November 2013 to a) select unsolved math problems and b) adapt them for use at every grade in K-12. Not a simple task, and I’m enormously impressed by their results. You can watch videos introducing the problems at this page or read about them in these slides.

Here are two of my favorites. (Click for larger.)

**Grade 3: Graceful Tree Conjecture**

**Grade 10: Imbedded Square**

These two problems have the capacity to develop fluency just as well as any worksheet or worksite. In working out their solutions, students will perform the same operation *dozens* of times – subtracting whole numbers in the third grade task and calculating slope and distance in the Cartesian plane in the tenth grade task. But these problems ask students to think strategically and systematically *in addition to* practicing efficiently and accurately. That’s no easy feat, but Hamilton and his team pulled it off thirteen times in a row.

**Related**:

If you tell me you’re a fan of real-world math, I know almost nothing about what goes on in your classroom. That’s because there’s enormous variation *within* real-world tasks. Almost as much as there is *between* real and “fake” tasks. (I’ve written about this before.)

This summer we’ll interrogate that thesis. Every week I’ll post three versions of the same real-world task.

Please tell me: who wore it best?

- In what ways are they different?
- What do their differences say about their authors’ beliefs about students, learning, and math?
- Would you make changes? Which and why?

We’ll begin with Barbie Bungee, a lesson which is as old as math teaching itself. (The earliest reference I found in an exhaustive #lazyweb search on Twitter was this 1993 *Mathematics Teacher* article. Thanks, Norma.) If you’ve never heard of it, here is a video summary from Teaching Channel.

Click each image for a larger version. Or click through for the PDF.

I previewed these problems on Twitter a week ago.

A number of people noticed that Version 3 asks for a lot of literacy *in addition to* numeracy. “Example 3 is too wordy for me and the students that I work most closely with,” said Bridget Dunbar.

I’m sympathetic. I was initially repelled by the dense text, but several educators I respect came along and noted that Version 3 leaves a lot of room for the teacher to develop the question along with students. Andrew Morrison said that “the structure of the activity is a lot more open ended than I expected based on the amount of text I initially saw.” Paul Jorgens used some of my favorite advice to support Version 3: “You can’t subtract but you can always add,” continuing to say that “the third one seems the easiest to start thin and add as necessary.”

“A thin start.” Great description.

A number of participants in the discussion said variations on, “It depends on the student.” That seems hard to falsify. Even if it’s true with these three worksheets, though, I don’t think that advice extends to *any* version of this task. Some are probably just bad.

For my part, I look at each version and try to imagine the *verbs*, the mental work students do. In each version of the task, the work becomes formal and operational *very* quickly. Version 1 has students *measuring* precisely in its first step. Version 2 has students *graphing* precisely in its first step. That’s important work but once the task has been formalized like that, it’s very difficult to ask students to do informal, imprecise work, which is just as important and often more interesting.

Like *wondering* what kinds of questions a bungee jump operator would wonder.

Like *estimating* how many rubber bands would be ideal for a given bungee jumping scenario. (Bridget Dunbar with the eagle eyes: “Version 2 misses estimation while Version 1 asks for it, but at the end of all of the directions.”)

Like *abstracting* the world of bungee jumping into a few manageable pieces of data which we can measure and track. (E.g., the temperature outside probably doesn’t matter. The number of rubber bands probably does.)

Like *sketching* the relationship between rubber bands and fall height before *graphing* it.

It’s difficult to load all of those tasks onto the same piece or pieces of paper. Perhaps impossible, as later tasks will provide the answer to earlier tasks. My ideal Barbie Bungee task (and modeling task, in general) requires a dialog between teacher and students, with the teacher adding context, questions, and help, as the situation and students require it.

*Watch Twitter for next week’s preview. You should find three versions of a task and play along at home.*

**Featured Comments**

I had a similar reaction to Version #3: all that text. Marc often asks “Who’s doing the math?” This, like “You can always add. You can’t subtract,” rattles around my head when designing or evaluating tasks. Version #3, for all that text, probably wore it best; in both Versions 1&2, the answer to Marc’s question is “The author.”

I wonder if the “step by step” worksheets that educators can be so fond of stem from the fear that students wouldn’t know what to do to solve the problem or from the idea that teachers have a vision for what the students’ output should look like. In versions 1 and 2, students will most likely have much cleaner products than those of version 3. I assume it’s some combination of both.

I wouldn’t say that it’s so much the case that “it depends on the student” as that it depends on the classroom culture. Teachers that have cultivated a culture of risk-taking, serious inquiry, and other habits of mind and practices that draw on the natural curiosity and need to know and understand that we all have (before it’s schooled out of us) are going to have enormous flexibility in designing or adapting tasks like this one so as to not wind up stifling individual thinking and productive struggle.

Ann Shannon asks teachers to avoid “GPS-ing” their students:

When I talk about GPSing students in a mathematics class I am describing our tendency to tell students—step-by-step—how to arrive at the answer to a mathematics problem, just as a GPS device in a car tells us – step-by-step – how to arrive at some destination.

Shannon writes that when she used her GPS, “I usually arrived at my destination having learned little about my journey and with no overview of my entire route.”

True to the contested nature of education, we will now turn to someone who advocates exactly the opposite. Greg Ashman recommends novices learn new ideas and skills through explicit instruction, one facet of which is step-by-step worked examples. Ashman took up the GPS metaphor recently. He used his satellite navigation system in new environs and found himself able to re-create his route later without difficulty.

What can we do here? Shannon argues from intuition. Ashman’s study lacks a certain rigor. Luckily, researchers have actually studied what people learn and don’t learn when they use their GPS!

In a 2006 study, researchers compared two kinds of navigation. One set of participants used traditional, step-by-step GPS navigation to travel between two points in a zoo. Another group had to construct their route between those points using a map and then travel segments of that route from memory.

Afterwards, the researchers assessed the route knowledge and survey knowledge of their participants. Route knowledge helps people navigate between landmarks directly. Survey knowledge helps people understand spatial relationships *between* those landmarks and plan new routes. At the end of the study, the researchers found that map users had better survey knowledge than GPS users, which you might have expected, but map users outperformed the GPS users on measures of *route* knowledge as well.

So your GPS does an excellent job transporting you efficiently from one point to another, but a poor job helping you acquire the survey knowledge to understand the terrain and adapt to changes.

Similarly, our step-by-step instructions do an excellent job transporting students efficiently from a question to its answer, but a poor job helping them acquire the domain knowledge to understand the deep structure in a problem set and adapt old methods to new questions.

I’ll take that trade with my GPS, especially on a dull route that I travel infrequently, but that isn’t a good trade in the classroom.

The researchers explain their results from the perspective of active learning, arguing that travelers need to do something *effortful* and *difficult* while they learn in order to remember both route and survey knowledge. Designing learning for the right kind of effort and difficulty is one of the most interesting tasks in curriculum design. Too *much* effort and difficulty and you’ll see our travelers try to navigate a route without a GPS *or* a map. While blindfolded. But the GPS offers too *little* difficulty, with negative consequences for drivers and even worse ones for students.

**2016 Jun 17**. The two most common critiques of this post have been, one, that I have undervalued step-by-step instructions in math, and two, that this GPS study offers very few insights into math education. I respond to both critiques in this comment.

A poker face? A bit of malice? Nitsa Movshovitz-Hadar argues [pdf] that it requires only the ability to trick yourself into forgetting that you know every triangle has the *same* interior angle sum. “Suppose we do not know it,” she writes, which is easier said than done.

The premise of her article is that “… all school theorems, except possibly a very small number of them, possess a built-in surprise, and that by exploiting this surprise potential their learning can become an exciting experience of intellectual enterprise to the students.”

This is such a delightful paper – extremely readable and eminently practical. Without knowing me, Movshovitz-Hadar took several lessons that I love, but which seemed to me totally disparate, and showed me how they connect, and how to replicate them. I’m pretty sure I was grinning like an idiot the whole way through this piece.

[via Danny Brown]

**Featured Tweets**

@rawrdimus i.e. think less like a math teacher who knows how to write a circle equation

— Dan Anderson (@dandersod) June 10, 2016

Not easy for math teachers to do!

@ddmeyer I did a similar thing with my Year 9 students and the trig ratios!!! Heaps of fun and surprise!

— David Ross Lang (@Davidinho_78) June 12, 2016

What if you asked two questions: which triangle has the longest perimeter and which triangle has the largest angle sum? It might clarify what can change in a triangle and what cannot. Also it hides the surprise better. If you teach via surprise consistently, kids start looking for the punchline.

**Featured Comments**

Jo:

Elementary may actually have an advantage here! We play these games all the time. Some favorites:

Draw me a two-sided quadrilateral

Draw me a triangle with three right angles (or three obtuse angles)

(We have a manipulative that consists of little plastic sticks that snap together to build things.) Build me a triangle with the red stick (6″), the purple stick (1″), and the green stick (2″). Once the whole class is convinced they can’t, you can get at why and then write a rule for it. There is nothing an 8-year-old likes better than proving the teacher wrong.

Theorems and formulae in textbooks should be marked with a “spoiler alert”.

Ok #MTBoS. What's your favorite first day of school activity? For high school. Something that helps me get to know my students??

— Ali Grace (@AGEiland) June 9, 2016

My contributions:

- Personality Coordinates Icebreaker.
- Who I Am. (Return these at the end of the year for more fun.)
- The Collaborative Icosahedron.
- Stacking Cups.

Help the rest of us out in the comments. What do you do on the first day of school?

**2016 Jul 27**. A Collection of First Week Activities.

To say that the community repository model has done wonders for open source software is a massive understatement. To what extent that success translates to curriculum I’m obviously unsure, but I have randomly-ordered reasons to suspect it’s appreciable.

I attended EdFoo earlier this year, an education conference at Google’s campus attended by lots of technologists. Speakers posed problems about education in their sessions and the solutions were often techno-utopian, or techno-optimistic at the very least.

One speaker wondered why teachers spend massive amounts of time creating lessons plans that don’t differ *all that much* from plans developed by another teacher several states away or several doors down the hall. Why don’t they just build it once, share it, and let the community modify it? Why isn’t there a GitHub for lesson plans?

I’m not here to say that’s a bad idea in theory, just to say that the idea very clearly hasn’t caught on in practice.

Exhibit A: BetterLesson, which pivoted from its original community lesson repository model to a lesson repository *stocked by master teachers* and now to professional development. (Its lesson repository is currently a blink-and-you’ll-miss-it link in the footer of its homepage.) The idea has failed to catch on with secondary educators to such a degree that it’s worth asking them why they don’t seem to want it.

Our room at EdFoo was notably absent of practicing secondary teachers so I went on Twitter to ask a few thousand of them, “Why don’t you use lesson download sites?” (I asked the same question two years ago as well.) Here are helpful responses from actual, really real current and former secondary teachers:

Using someone else’s lesson plan is like wearing a friend’s underwear. It may do the job but ultimately doesn’t fit quite right.

Their wheels aren’t the right size for my car.

Linux works because code compiles. Syllabi don’t compile. If I add a block/lesson, I never know who it helps.

I don’t require a script, just decent ideas now and then.

I’m not sure they solve for the problems they think they’re trying to solve. It takes time to read / internalize / modify others’ plans.

It’s challenging to sequence, connect, plan, and enact someone else’s lesson.

The plan itself is the least important element. The planning is what’s critical.

**2016 Jun 11**. Dwight Eisenhower:

Plans are worthless, but planning is everything.

In sum: “Small differences between lessons plans are enormously important, enormously time-consuming to account for and fix, and whatever I already have is probably good enough.” It turns out that even if two lesson plans don’t differ *all that much* they already differ *too much*.

Any lesson sharing site will have to account for that belief before it can offer teachers even a fraction of GitHub’s value to programmers.

**2016 Jun 8**. Check out Bob Lochel’s tweet above and Julie Reulbach’s tweet below. Both express a particular sentiment that the nuts and bolts of a lesson plan are less important than the chassis. (I don’t know a thing about cars.)

I was chatting with EdSurge’s Betsy Corcoran about that idea at EdFoo and she likened it to “the head” in jazz music. (I don’t know a thing about jazz music.) The head contains crucial information about a piece of music – the key, the tempo, the chord changes. Jazz musicians memorize the head but they’ll build and develop the performance off of it. The same head may result in *several* different performances. What I want – along with Bob and Julie and many others – is a jazz musician’s fake book – a repository of creative premises I can easily riff off of.

(Of course, it’s worth noting here that many people believe that teachers should be less like jazz musicians and more like player pianos.)

**Featured Tweets**

@ddmeyer @rawrdimus yes. We need ideas. Not lessons. That's our speciality.

— Julie (@jreulbach) June 8, 2016

@ddmeyer GitHub for lessons doesn’t appeal to me, but GitHub for *planning* does.

— Ben Samuels-Kalow (@bensk) June 8, 2016

@ddmeyer you save time and learn new often better approaches by using someone else's plan, AND it still requires a lot of your own planning

— Paul Edelman (@TpTFounder) June 9, 2016

**Featured Comments**

There seems to be a general distrust of “other people’s lessons.” Which I get. But nothing about this model would change the extent to which you do or do not teach other people’s lessons, or the fidelity with which you do it. Again, the whole thing that got me thinking in this vein was the problem of managing, in some kind of coherent way, all the changes that teachers already make as a matter of course. If you’re starting with an existing curriculum, then you’re using other people’s stuff to some extent. And once you alter that extent, it might be nice to track it, for all sorts of reasons. Maybe classroom teachers don’t find that interesting, but somebody in the chain between publisher and implementer certainly does. Not totally sure who the best target audience might be.

Jo:

As an elementary math coach I don’t want a repository of lesson plans either but my teachers long for one. However, when given pre-written lesson plans they’re not happy with them–for all the reasons listed above.

The hardest thing about elementary math is that most elementary teachers go into teaching because they love reading and they want to share that. They rarely feel that way about math. So, they want a guided lesson that will teach the requisite skills. Unfortunately, it doesn’t work for them any better than it works for secondary teachers.

Even in elementary it’s the process of planning that’s important. My brain needs to go through the work of planning–what leads to what, what is going to confuse the kids, what mistakes are they likely to make, what false paths are they likely to follow. The only way to deeply understand the material and how to present it is to plan it. The only way to truly understand the standard is to wrestle with what it really means.

Planning is the work; teaching is just the performance.

I get a lot out of reading other lesson plans/approaches to teaching/ideas, and steal activities fairly regularly, but my actual lesson plans aren’t copies of others’. It’s more like they’re inspired by what other people do. This is where the artistry of teaching comes in.

I get it – we don’t want a repository of lessons, but what happens once those lessons get downloaded and re-worked? Right now there isn’t a way to see derivatives of those lessons, which could be very important.

Brandon, I love that idea. Recipe websites do this: what can be substituted for what, how different teachers with different ingredients and different tools in different places can adapt. These are good parallels for the teaching world.

**2016 Jun 13**. Mike Caulfield offers an illustration of the value of *planning* relative to *plans*.

Danny Brown has expressed an interest in teaching mathematics that is relevant to students, relevant in important, sociological ways especially. This puts him in a particular bind with mathematics like Thales’ Theorem, which seems neither important nor relevant.

Here is Thales’ theorem. Every student in the UK must learn this theorem as part of the Maths GCSE. You are explaining Thales’ theorem, when one of the students in your class asks, “When will we ever need this in real life?” How might you respond?

He proceeds to offer several possible responses and then, with admirable empathy for teenagers, rebut them. Brown finds none of our best posters for math particularly compelling. You know the ones.

- Math is everywhere.
- Math develops problem solving skills.
- Math is beautiful.
- Etc.

So instead of fixing our posters, let’s fix the product itself.

Brown’s premise is that students are listening to him “explaining Thales’ theorem.” Let’s question that premise for a moment. Is that the only or best way to introduce students to that proof? [**2016 Jun 3**. Brown has informed me that explanation is not his preferred pedagogy around proof and I have no reason not to take him at his word. So feel free to swap out “Brown” in the rest of this post with your recollection of nearly every university math professor you’ve ever had.]

Among other purposes, every proof is the answer to a question. Every proof is the rejection of doubt. It isn’t clear to me that Brown has developed the question or planted the doubt such that the answer and the explanation seem *necessary* to students.

So instead of starting with the explanation of an answer, let’s develop the question instead.

Let’s ask students to *create* three right triangles, each with the same hypotenuse. Thales knows what our students might not: that a circle will pass through all of those vertices.

Let’s ask them to *predict* what they think it will look like when we lay all of our triangles on top of each other.

Let’s reveal what several hundred people’s triangles look like and ask students to *wonder* about them.

My hypothesis is that we’ll have provoked students to *wonder* more here than if we simply ask students to listen to our explanation of *why it works*.
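The geometric claim behind the activity is easy to sanity-check. The sketch below is my own construction, not part of the Desmos activity: it builds right triangles on a fixed hypotenuse *without* invoking the theorem (by dropping a perpendicular from one endpoint onto a randomly chosen leg), then verifies that every right-angle vertex lands on the circle whose diameter is the hypotenuse.

```python
import math
import random

# Hypotenuse endpoints (arbitrary choices for this sketch).
A, B = (-1.0, 0.0), (1.0, 0.0)
M = ((A[0] + B[0]) / 2, (A[1] + B[1]) / 2)  # midpoint of the hypotenuse
r = math.dist(A, B) / 2                     # half the hypotenuse

for _ in range(1000):
    # Build a right triangle on AB without using the theorem:
    # pick a leg direction d out of A, then drop the perpendicular
    # from B onto that leg. The foot C has a right angle at C.
    t = random.uniform(0.1, math.pi - 0.1)
    d = (math.cos(t), math.sin(t))
    proj = (B[0] - A[0]) * d[0] + (B[1] - A[1]) * d[1]
    C = (A[0] + proj * d[0], A[1] + proj * d[1])
    # Thales' claim: every such C sits on the circle centered at M.
    assert math.isclose(math.dist(C, M), r), C
```

A thousand triangles, a thousand vertices on the same circle, which is roughly what the overlaid class responses show.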

**“Methods”**

To test that hypothesis, I ran an experiment that uses Twitter and the Desmos Activity Builder and is pretty shot through with methodological flaws, but which is *suggestive* nonetheless, and which is also way more than you oughtta expect from a quickie blog post.

I asked teachers to send their students to a link. That link randomly sends students to one of two activities. In the control activity, students click slide by slide through an explanation of Thales’ theorem. In the experimental activity, students create and predict like I’ve described above.

At the end of both treatments, I asked students “What questions do you have?” and I coded the resulting questions for any relevance to mathematics.

77 students responded to that final prompt in the experimental condition next to 47 students in the control condition. 47% of students in the experimental group asked a question next to 30% of students in the control group. (See the data.)

This *suggests* that interest in Thales’ theorem doesn’t depend strictly on its social relevance. (Both treatments lack social relevance.) Here we find that interest depends on what students *do* with that theorem, and in the experimental condition they had more interesting options than simply listening to us explain it.

So let’s invite students to stand in Thales’ shoes, however briefly, and experience similar questions that led Thales to sit down and wonder “*why*.” In doing so, we honor our students as sensemakers and we honor math as a discipline with a history and a purpose.

**BTW**. For another example of this pedagogical approach to proof, check out Sam Shah’s “blermions” lesson.

**BTW**. Okay, study limitations. (1) I have no idea who my participants are. Some are probably teachers. Luckily they were randomized between treatments. (2) I realize I’m testing the *converse* of Thales’ theorem and not Thales’ theorem itself. I figured that seeing a circle emerge from right triangles would be a bit more fascinating than seeing right triangles emerge from a circle. You can imagine a parallel study, though. (3) I tried to write the explanation of Thales’ theorem in conversational prose. If I wrote it as it appears in many textbooks, I’m not sure anybody would have completed the control condition. Some will still say that interest would improve enormously with the addition of call and response questions throughout, asking students to repeat steps in the proof, etc. Okay. Maybe.

**Featured Comments**

Danny Brown responds in the comments.

Michael Ruppel responds to the charge that Thales theorem isn’t important mathematics:

As to the previous commenter, Thales’ theorem is not a particularly important piece of content in and of itself, but it’s one of my favorite proofs for students to build. It requires careful attention to definitions and previously learned theorems as well as a bit of creativity. (Drawing that auxiliary line.) Personally, my favorite part of the proof is that students don’t solve for a or b, and in fact have no knowledge of what a and b are, but they prove that a + b = 90. The proof is a different flavor than they are used to.

Are your students overgeneralizing their models? After working exclusively with proportional relationships for the last month, are they describing *every* new relationship as proportional?

This isn’t a task, or a lesson, or anything of that scope. It’s a resource, a provocation, one that gives students the chance to check their assumptions about *what’s going on*.

Play this video and pause it periodically, asking students to decide for themselves, and then tell a neighbor, what’s coming next.

10 marbles weigh 350 grams. So 20 marbles should weigh how much? I’m curious which students will say the answer is less than, exactly, or more than 700 grams. I’m curious which students will say it’s impossible to know.

Reveal the answer.

That will be surprising for some. Now invite them to speculate about 30 marbles. 40 marbles. And 0 marbles.
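The surprise hinges on the relationship being linear rather than proportional: the glass on the scale contributes a constant mass that doesn’t double when the marbles do. A minimal sketch of the two competing models. Only “10 marbles weigh 350 grams” comes from the video; the per-marble and glass masses below are assumptions for illustration.

```python
# Assumed values for illustration; together they reproduce
# the one fact from the video: 10 marbles read 350 g.
MARBLE = 30.0  # grams per marble (assumption)
GLASS = 50.0   # grams for the empty glass (assumption)

def proportional(n):
    # what a student overgeneralizing proportionality predicts:
    # double the marbles, double the scale reading
    return n * 350.0 / 10

def linear(n):
    # what the scale shows if the glass is sitting on it too
    return MARBLE * n + GLASS

assert proportional(10) == linear(10) == 350.0  # models agree at the anchor
assert proportional(20) == 700.0
assert linear(20) == 650.0  # less than 700: the glass's mass isn't doubled
assert linear(0) == 50.0    # zero marbles still weigh something
```

The 0-marble case is the sharpest probe: a proportional model predicts 0 grams, and the reveal says otherwise.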

Let me end with three notes.

**First**, my thanks to Kevin Hall who had the fine idea for the video and encouraged me to make it. I’ve never met Kevin. That’s the kind of internet collaboration that makes my week.

**Second**, the stacking cups lesson offers a similar moment of dissonance. Can you find it?

**Third**, here’s Hans Freudenthal on technology in 1981:

What I seek is neither calculators and computers as educational technology nor as technological education but as a powerful tool to arouse and increase mathematical understanding.

**Featured Tweet**

I always like creating a proportional reasoning speed bump by giving these types of questions.

**Featured Comment**

Hey! Nice idea for helping kids make the turn from proportional to linear relationships. There were two things I wanted to change:

• the discrete nature of the domain

• the way it’s not clear in the still images whether we are being shown the mass of just the marbles or the mass of the marbles + the glass together (the brief shot of the balance scale with the glass on it at the beginning of the video wasn’t doing it for me).

So I made a video! Here! It was shot on my phone using a jar of cumin to stabilize, so it could certainly be professionalized.