I raved for a minute on Twitter last week about this New York Times article. You should read it (play it? experience it?) and then come back so I can explain why it’s what math curriculum could and should become.

**The lesson asks for an imprecise sketch rather than a precise graph.**

This is so rare. More often than not, our curricula rush past the lower, imprecise, informal, concrete rungs on the ladder of abstraction, straight for the highest, most precise, most formal, most abstract ones. That’s a disservice to our learners and to the process of learning.

You can always ask a student to move higher, but it’s difficult to ask a student to move lower, to forget what they’ve already seen. You can always ask for precisely plotted points of a model on a coordinate plane. But once you ask for them you can’t *unask* for them. You can’t then ask the question, “What *might* the model look like?” Because they’re *looking* at what the model looks like. So the Times asks you to sketch the relationship before showing you the precise graph.

Their reason is exactly right:

> We asked you to take the trouble to draw a line because we think doing so makes you think carefully about the relationship, which, in turn, makes the realization that it’s a line all the more astonishing.

That isn’t just their intuition about learning. It’s Lisa Kasmer’s research. And it won’t happen in a print textbook. We eventually *need* students to see the answer graph and whereas the Times webpage can *progressively disclose* the answer graph, putting up a wall until you commit to a sketch, a paper textbook lacks a mechanism for preventing you from moving ahead and seeing the answer.

This isn’t just great *digital* pedagogy, it’s great *pedagogy*. You can and should ask students to sketch relationships without any technology at all. But the *digital* sketch offers some incredible advantages over the same sketch in pencil.

For instance:

**The lesson builds your thinking into its instruction.**

Once it has your guess — a sketch representing your best thinking about the relationship between income and college participation — it tailors its instruction to that sketch. (See the highlighted sentences.)

The lesson is the same but it is *presented* differently and responsively from student to student. All the highlighted material is tailored to my graph. I watched an adult experience this lesson yesterday, and while she read the personalized paragraph with interest, she only *skimmed* the later prefabricated paragraphs. It should go without saying that print textbooks are entirely prefabricated.

**It makes your classmates’ thinking visible.**

The lesson makes my classmates’ thinking visible in ways that print textbooks and flesh-and-blood teachers cannot. At the time of this posting, 70,000 people have sketched a graph. It’s interesting for me to know how much more accurate my sketch is than my classmates’. It’s interesting to see the heatmap of their sketches. And it’s interesting to see the heatmap converge around the point that the lesson gave us for free, a point where there is much less doubt.
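
The Times doesn’t publish its aggregation code, but the core move — binning thousands of sketched curves into one density grid — is simple. Here’s a minimal sketch in Python/NumPy; the function name, bin counts, and input format are all my own assumptions, not anything from the article:

```python
import numpy as np

def sketch_heatmap(sketches, x_bins=50, y_bins=50, y_range=(0.0, 1.0)):
    """Aggregate many sketched curves into a 2D density grid.

    Each sketch is a sequence of y-values sampled at the same x positions
    (one per x bin), e.g. predicted college attendance by income percentile.
    """
    grid = np.zeros((y_bins, x_bins), dtype=int)
    lo, hi = y_range
    for ys in sketches:
        # Map each y-value to a row index, then count it in that (row, col) cell.
        rows = np.clip(((np.asarray(ys) - lo) / (hi - lo) * y_bins).astype(int),
                       0, y_bins - 1)
        grid[rows, np.arange(x_bins)] += 1
    return grid

# Two toy "classmates": one linear guess, one curved guess.
xs = np.linspace(0, 1, 50)
heat = sketch_heatmap([xs, xs**2])
```

Cells where many curves pass through get large counts, which is exactly what renders as the bright band in the Times heatmap — and why the band pinches tight around the point everyone was given for free.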

In a version of this article designed for the classroom, students would sketch their graphs and the textbook would adaptively pair one group of students with another when their graphs indicated disagreement. Debate it.
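
That adaptive pairing doesn’t exist yet, but one plausible mechanism is easy to sketch: score every pair of sketches by how far apart the curves are, then greedily pair off the biggest disagreements. Everything below — the names, the mean-absolute-gap metric — is my own assumption, not any real textbook’s algorithm:

```python
import numpy as np

def pair_by_disagreement(sketches):
    """Greedily pair students whose sketches disagree the most.

    `sketches` maps a student name to a list of sketched y-values.
    Returns (name, name) pairs, most-disagreeing pair first.
    Disagreement here = mean absolute gap between the two curves.
    """
    names = list(sketches)
    # Score every possible pair by how far apart the two curves are.
    scored = sorted(
        ((np.abs(np.asarray(sketches[a]) - np.asarray(sketches[b])).mean(), a, b)
         for i, a in enumerate(names) for b in names[i + 1:]),
        reverse=True)
    pairs, used = [], set()
    for _, a, b in scored:
        if a not in used and b not in used:
            pairs.append((a, b))
            used.update((a, b))
    return pairs

guesses = {"ana": [0.0, 0.0, 0.0], "bo": [1.0, 1.0, 1.0],
           "cy": [0.1, 0.1, 0.1], "di": [0.9, 0.9, 0.9]}
pairs = pair_by_disagreement(guesses)
```

Here the flat sketchers get matched with the steep sketchers — the pairs with the most to argue about.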

I’m not saying any of this is easy. (“Sure! Do that for factoring trinomials!”) But we aren’t exactly drowning in great examples of instruction enhanced by technology. Take a second and appreciate this one. Then let me know where else you think this kind of technology would be helpful to you in your teaching.

**Featured Comment**

Avery:

> And as far as I know, even with Apple proclaiming “Textbooks that go beyond the printed page” since 2012, there isn’t a single digital math textbook doing this yet.

## 17 Comments

## jim cibulka

June 1, 2015 - 11:12 am

This was GREAT! I see you had basically the same curve I drew . . . glad to see you post some new material as well . . . now, tell me how I can use VPython in my 9th grade physics class next year!

## Derek Oldfield

June 1, 2015 - 11:22 am

@mr_oldfield on Twitter:

Dan, this reminded me of [this lesson of yours (Joulies)](https://app.nearpod.com/#/?pin=7878377978656BE2858B26930E2EDE5A-0), which I basically inserted into a Nearpod lesson to give my students the opportunity to draw the graphs, share some of their drawings, provide the feedback, etc. Check out the lesson, though I’m realizing I could have offered more opportunities for the students to write about why their graph predictions look the way they do.

## Mark

June 1, 2015 - 12:19 pm

Dan, I like your point about it being difficult to start with the most abstract and come back down to the more concrete discoveries within a problem. Do you know of any research that backs up this assertion? I’m just curious because I fear I often go to the most abstract levels too quickly myself, and I want to see whether what I’ve done is irreversible or whether there are any benefits.

## Chris Goedde

June 1, 2015 - 12:44 pm

Are you familiar with Eric Mazur’s Learning Catalytics (https://learningcatalytics.com)? You can use it to do what you suggest: if you have students make a prediction, it will match students for discussion. I haven’t used it personally, but I think others in my department have.

## Travis

June 1, 2015 - 1:13 pm

Right off the bat: multiple transformations, with either

1. mapping rules, or
2. dynamic geometry commands.

Is it in the correct quadrant? Proximate in size? Orientation?

## Avery

June 1, 2015 - 3:48 pm

And as far as I know, even with Apple proclaiming “Textbooks that go beyond the printed page” since 2012, there isn’t a single digital math textbook doing this yet.

## Dan Meyer

June 1, 2015 - 5:28 pm

@Avery, the iBooks Author platform requires some serious hacking to share work between students. It’s a pretty grim landscape. (Or full of opportunity, depending on how you look at it.)

@Chris, Learning Catalytics allows for the aggregation of sketches (and other inputs), but it isn’t embedded in curriculum, AFAIK. It’s a teaching tool, not a textbook. Now that Pearson owns it, one would think they might try to integrate it.

@Derek Oldfield, love that Joulies Nearpod treatment. Now if that were embedded in the instruction, and the instruction responded to the Joulies sketch, that would be something new entirely.

@Mark, I recommend teachers start with Dan Schwartz’s work on Invention as Preparation for Future Learning.

## Zack Miller

June 1, 2015 - 8:04 pm

Peardeck is the best classroom tool I know for stuff like this. I often use Peardeck to help start math tasks as close to the bottom rung of the ladder of abstraction as possible. It lets me bring up student work quickly and easily on the projector, or show an aggregate of the class the way the NY Times did. Spending class minutes on activities like this has really paid off in my class (in terms of learning as well as student engagement and participation).

## Xavier

June 1, 2015 - 10:58 pm

Another graph that I think is useful for understanding the economy, and for having students “read” information, is [this](http://www.nytimes.com/interactive/2015/03/19/upshot/3d-yield-curve-economic-growth.html?abt=0002&abg=0).

## jkern

June 2, 2015 - 4:58 am

@Zack Miller, thanks for the heads-up on Peardeck.

It would be nice if it were set up to compile lines instead of just dots. I’ve seen MathChat as an option for sharing individual responses for graphs/drawings/writing, but it doesn’t compile student responses into the full feedback about where each student lands in relation to the class, like we see in the Times article. That is the app we really need. Does it exist anywhere?

## Bradley

June 2, 2015 - 6:24 am

I would love to have a tool like this for all the graphing we do in physics. I can imagine having kids graph where they think a falling object will be after 1 second, 2 seconds, 5 seconds… Then comparing and contrasting each other’s graphs could lead to great discussions! I think students could discuss their graphs before seeing the real data. In physics, then, we could actually do the research to come up with the “real” graph, maybe using data from all sections of physics at my school.

## jkern

June 2, 2015 - 6:36 am

@Bradley, the best tool I had for this in physics was just whiteboards or papers, so they could compare with each other and draw the “real” graph over their prediction when the demonstration happened.

What I added in was a betting system to get students to be a bit more excited about making predictions. They got 2 meaningless points per question. They could bet 0, 1, or 2, depending on their confidence. So students who were unsure could bet 0 and still end up with 2 points no matter what. Confident students who bet both of their points could end up with 0 or 4, so they really had to assess their confidence levels and understanding if they wanted to “win.” Some liked the 0-bet safety net. Some liked the 2-bet risk. I would usually make them bet on a set of problems, then we would compare, demo, and discuss. Then they would have a chance to bet on another set of problems. This was using the Physics Modeling curriculum from Arizona State, so each problem was a little bit different, keeping it challenging throughout.
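
The payoff scheme jkern describes reduces to a one-line rule: a correct prediction earns 2 + bet points, a wrong one earns 2 − bet. A small Python sketch of it (the function name is my own, not jkern’s):

```python
def bet_score(bet, correct):
    """Score one question in the prediction-betting game.

    Every question starts worth 2 points; a student wagers 0, 1, or 2 of them.
    A correct prediction wins the wager on top; a wrong one forfeits it.
    """
    if bet not in (0, 1, 2):
        raise ValueError("bet must be 0, 1, or 2")
    return 2 + bet if correct else 2 - bet

# The safety net and the risk, as described:
bet_score(0, False)  # unsure, bet nothing: 2 points no matter what
bet_score(2, True)   # confident and right: 4 points
bet_score(2, False)  # confident and wrong: 0 points
```

The nice property is that a bet of 0 is a true safety net (the payoff is 2 either way), so the wager only carries risk when a student chooses to take it on.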

## Cathy Yenca

June 3, 2015 - 4:59 am

I’ve also used Nearpod to attempt to provide students with a platform for graphing something on their own, then submitting it digitally to be shared, compared, and affirmed or denied. An example of this was transformations of 2D figures in the coordinate plane. While students knew the transformation vocabulary, transferring these ideas to figures with vertices at specific coordinates was not intuitive.

Once I had all students’ initial work images in Nearpod, I clicked through the graphs quickly using my Smart Board. In doing so, each student’s graph flashed across the screen briefly, creating somewhat of an animation experience. Variations in student work became efficiently apparent. Though the software itself didn’t provide tailored feedback, my students and I took care of that part.

After clicking through student graphs for several cycles, my next “slide” was a [SILENT SOLUTION] animation/video using Keynote that reads like a solution cartoon, showing the correct image emerging from the preimage in motion. As that animation occurred, the class erupted, with students either feeling very affirmed in their previously submitted graph, or denied in a discrepant event of “what the heck?” (Which is a prime moment for learning!)

Here’s a post that features a bit of this so you can see what I’m saying: http://www.mathycathy.com/blog/2015/02/transformations-and-animations/

I find it curious that I also got the exact same feedback from the New York Times graphing experience as Dan… Now I want to break it. How many possible lists of feedback does that graphing tool *really* give? Is it as precise as it seems at first glance, or is it like one of those Facebook things where you type in your first name, and it tells you what your name means… except you get the same name-meaning for your actual first name as you do when you type in “Potato”? (P.S. Yes, I actually typed in Potato to test it.)

## Cathy Yenca

June 3, 2015 - 7:26 am

P.S. The graphing tool on the New York Times site passed the “Potato Test” (see comment above). I tried to break it, and I still got pretty precise feedback each time.

## Dan Meyer

June 3, 2015 - 1:22 pm

@Cathy, someone else was on your exact frequency.

## Cathy Yenca

June 3, 2015 - 3:14 pm

@Dan INDEED! :-)

## Vishakha

June 3, 2015 - 4:00 pm

@Bradley and @jkern, check out the work the Concord Consortium is doing with its CODAP project. It’s actually an acronym for Collaborative Data Analysis Platform. ;-)

http://codap.portal.concord.org