In the early 20th century, Karl Groos claimed in The Play of Man that “the joy in being a cause” is fundamental to all forms of play. One hundred years later, Phil Daro would connect Groos’s theory of play to video gaming:
Every time the player acts, the game responds [and] tells the player your action causes the game action: you are the cause.
Most attempts to “gamify” math class learn the wrong lessons from video games. They import leaderboards, badges, customized avatars, timed competitions, points, and many other stylistic elements from video games. But gamified math software has struggled to import this substantial element:
Every time the player acts, the game responds.
When the math student acts, how does math class respond? And how is that response different in video games?
Watch how a video game responds to your decision to jump off a ledge.
Now watch how math practice software responds to your misinterpretation of “the quotient of 9 and c.”
The video game interprets your action in the world of the game. The math software evaluates your action for correctness. One results in the joy in being the cause, a fundamental feature of play according to Groos. The other results in something much less joyful.
To see the difference, imagine if the game evaluated your decision instead of interpreting it.
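The evaluate-versus-interpret distinction can be made concrete in software. Here is a minimal, hypothetical sketch (all function names are invented for illustration) contrasting the two kinds of response to a student's translation of “the quotient of 9 and c”:

```python
# Hypothetical sketch: two ways software might respond to a student's
# expression for "the quotient of 9 and c". Names are illustrative only.

def evaluative_feedback(student_expression: str) -> str:
    """Judge the answer: respond only with right or wrong."""
    return "Correct!" if student_expression == "9/c" else "Incorrect. Try again."

def interpretive_feedback(student_expression: str, c: float = 3.0) -> str:
    """Act on the answer: show what the student's expression actually does,
    so a misinterpretation like 'c/9' produces a visibly different result."""
    value = eval(student_expression, {"__builtins__": {}}, {"c": c})
    return f"When c = {c}, your expression gives {value}."
```

With the evaluative response, `c/9` earns only “Incorrect. Try again.” With the interpretive response, the student sees what their expression produces and can compare it against what they expected, which is closer to the game's “you are the cause” feedback loop.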
I doubt anyone would argue with the goals of making math class more joyful and playful, but those goals are more easily adapted to a poster or conference slide deck than to the actual experience of math students and teachers.
So what does a math class look like that responds whenever a student acts mathematically, that interprets rather than evaluates mathematical thought, that offers students the joy in being the cause of something more than just evaluative feedback?
“Have students play mathematical or reasoning games” is certainly a fair response, but bonus points if you have recommendations that apply to core academic content. I will offer a few examples and guidelines of my own in the comments later tomorrow.
I feel like a lot of the best Desmos activities do that, because they can interpret (some of) what the learner inputs. When you do the pool border problem, it doesn’t tell you that your number of bricks is wrong – it just makes the bricks, and you can see if that is too many, too few, or just right.
In general, a reaction like “Well, let’s see what happens if that were true” seems like a good place to start.
My favorite example of this is when Cannon Man’s body suddenly multiplies into two or three bodies if a student draws a graph that fails the vertical line test.
I am so intrigued by the word interpret. “Interpret” is about translating, right? Sometimes when we try to interpret, we (unintentionally) make assumptions based on our own experiences. Recently, I have been pushing myself to linger in observing students as they work, postponing interpretations. I have even picked up a pencil and “tried on” their strategies, particularly ones that are seemingly not getting to a correct solution. I have consistently been joyfully surprised by the math my students were playing with. I’m wondering how this idea of “trying on” student thinking fits with technology. When/how does technology help us try on more student thinking?
I think that many physical games give clear [evaluative] feedback as well, insofar as you test out a strategy, and see if you win or not. Adults can ruin these for children by saying, “are you sure that’s the right move?” rather than simply beating them so they can see what happens when they make that move. The trick there is that some games you improve at simply by losing (I’d put chess in this column, even though more focused study is essential to get really good), where others require more insight to see what you actually need to change.