
Learning the Wrong Lessons from Video Games

[This is my contribution to The Virtual Conference on Mathematical Flavors, hosted by Sam Shah.]

In the early 20th century, Karl Groos claimed in The Play of Man that “the joy in being a cause” is fundamental to all forms of play. One hundred years later, Phil Daro would connect Groos’s theory of play to video gaming:

Every time the player acts, the game responds [and] tells the player your action causes the game action: you are the cause.

Most attempts to “gamify” math class learn the wrong lessons from video games. They import leaderboards, badges, customized avatars, timed competitions, points, and many other stylistic elements from video games. But gamified math software has struggled to import this substantial element:

Every time the player acts, the game responds.

When the math student acts, how does math class respond? And how is that response different in video games?

Watch how a video game responds to your decision to jump off a ledge.

Now watch how math practice software responds to your misinterpretation of “the quotient of 9 and c.”

The video game interprets your action in the world of the game. The math software evaluates your action for correctness. One results in the joy in being the cause, a fundamental feature of play according to Groos. The other results in something much less joyful.

To see the difference, imagine if the game evaluated your decision instead of interpreting it.

I doubt anyone would argue with the goals of making math class more joyful and playful, but those goals are more easily adapted to a poster or conference slidedeck than to the actual experience of math students and teachers.

So what does a math class look like that responds whenever a student acts mathematically, that interprets rather than evaluates mathematical thought, that offers students joy in being the cause of something more than just evaluative feedback?

“Have students play mathematical or reasoning games,” is certainly a fair response, but bonus points if you have recommendations that apply to core academic content. I will offer a few examples and guidelines of my own in the comments tomorrow.

Featured Comments

James Cleveland:

I feel like a lot of the best Desmos activities do that, because they can interpret (some of) what the learner inputs. When you do the pool border problem, it doesn’t tell you that your number of bricks is wrong — it just makes the bricks, and you can see if that is too many, too few, or just right.

In general, a reaction like “Well, let’s see what happens if that were true” seems like a good place to start.

Kevin Hall:

My favorite example of this is when Cannon Man’s body suddenly multiplies into two or three bodies if a student draws a graph that fails the vertical line test.

Sarah Caban:

I am so intrigued by the word interpret. “Interpret” is about translating, right? Sometimes when we try to interpret, we (unintentionally) make assumptions based on our own experiences. Recently, I have been pushing myself to linger in observing students as they work, postponing interpretations. I have even picked up a pencil and “tried on” their strategies, particularly ones that are seemingly not getting to a correct solution. I have consistently been joyfully surprised by the math my students were playing with. I’m wondering how this idea of “trying on” student thinking fits with technology. When/how does technology help us try on more student thinking?

Dan Finkel:

I think that many physical games give clear [evaluative] feedback as well, insofar as you test out a strategy, and see if you win or not. Adults can ruin these for children by saying, “are you sure that’s the right move?” rather than simply beating them so they can see what happens when they make that move. The trick there is that some games you improve at simply by losing (I’d put chess in this column, even though more focused study is essential to get really good), where others require more insight to see what you actually need to change.

The Four Questions I Always Ask About New Technology in Education

A tweet where someone asks my impressions about Graspable Math.

A tool called Graspable Math found an audience on Twitter late last week, and a couple of people asked me for my opinion. I’ll share what I think about Graspable Math, but I’ll find it more helpful to write down how I think about Graspable Math, the four questions I ask about all new technology in education. [Full disclosure: I work in this field.]

1. What does it do?

That question is easier for me to answer with basic calculators and graphing calculators than with Graspable Math. Basic calculators make it easy to compute the value of numerical expressions. Graphing calculators make it easy to see the graphical representation of algebraic functions.

Graspable Math’s closest cousins are probably the Dragonbox and Algebra Touch apps. All of these apps offer students a novel way of interacting with algebraic expressions.

Drag a term to the opposite side of an equality and its sign will change.

Move a term from one side of the equation to the other.

Double click an operation like addition and it will execute that operation, if it’s legal.

Click to perform an operation like addition.

Drag a coefficient beneath the equality and it will divide the entire equation by that number.

Drag to divide by a coefficient.

Change any number in that sequence of steps and it will show you how that change affects all the other steps.

Change a number in one place in the sequence of steps and it will change it everywhere else.

You can also link equations to a graph.

Connect the equation with a graph.

2. Is that a good thing to do?

No tool is good. We can only hope to figure out when a tool is good and for whom and for what set of values.

For example, if you value safety, an arc torch is a terrible tool for a toddler but an amazing tool for a welder.

I value a student’s conviction that “Mathematics makes sense” and “I am somebody who can make sense of it.”

So I think a basic calculator is a great tool for students who have a rough sense of the answer before they enter it. (e.g. I know that 125 goes into 850 six-ish times. A basic calculator is perfect for me here.)

A graphing calculator is a great tool for a student who understands that a graph is a picture of all the x- and y-values that make an algebraic statement true, a student who has graphed lots of those statements by hand already.

Basic and graphing calculators can both contribute to a student’s idea that “Mathematics doesn’t make a dang bit of sense” and “I cannot make sense of it without this tool to help me” if they’re used at the wrong time in a student’s development.

The Graspable Math creators designed their tool for novice students early in their algebraic development. Is it a good tool for those students at that time? I’m skeptical for a few reasons.

First, I suspect Graspable Math is too helpful. It won’t let novice students make computational errors, for example. Every statement you see in Graspable Math is mathematically true. It performs every operation correctly. But it’s enormously helpful for teachers to see a student’s incorrect operations and mathematically false statements. Both reveal the student’s early understanding of really big ideas about equivalence.

In one of their research papers, the Graspable Math team quotes a student as saying, “[Graspable Math] does the math for you — you don’t have to think at all!” which is a red alert that the tool is too helpful, or at least helpful in the wrong way.

Second, Graspable Math’s technological metaphors may conceal important truths about mathematics. “Drag a term to the opposite side of an equality and its sign will change” isn’t a mathematical truth, for example.

Move a term from one side of the equation to the other.

It’s a technological metaphor for the mathematical truth that you can add the same number (3 in this case) to both sides of an equal sign and the new equation will have all the same solutions as the first one. That point may seem technical but it underpins all of Algebra and it isn’t clear to me how Graspable Math supports its development.

Third, Graspable Math may persuade students that Algebra as a discipline is very concerned with moving symbols around based on a set of rules, rather than with understanding the world around them, developing the capacity for conjecturing, or some other concern. I’m speaking about personal values here, but I’m much more interested in helping students turn a question into an equation and interpret the solutions of that equation than I am in helping them solve the equation, which is Graspable Math’s territory.

These are all tentative questions, skepticisms, and hypotheses. I’m not certain about any of them, and I’m glad Graspable Math recently received an IES grant to study their tool in more depth.

3. What does it cost?

While Graspable Math is free for teachers and students, money isn’t the only way to measure cost. Free tools can cost teachers and students in other ways.

For instance, Graspable Math, like all new technology, will cost teachers and students time as they try to understand how it works.

I encourage you to try to solve a basic linear equation with Graspable Math, something like 2x – 3 = 4x + 7. Your experience may be different from mine, but I felt pretty silly at several points trying to convince the interface to do for me what I knew I could do for myself on paper. (Here’s a tweet that made me feel less alone in the world.)

Graspable Math performs algebraic operations correctly and quickly but at the cost of having to learn a library of gestures first, effectively trading a set of mathematical rules for a set of technological rules. (There is a cheat sheet.) That kind of cost is at least as important as money.

2018 Aug 8. Elizabeth Hernandez writes in the comments:

One thing I might add to the section about cost is that it is so important to find out how student data is being used. Resources that are labeled as “free” often make students and teachers pay with their data. That is unethical if the vendor doesn’t provide information about what data they collect and how it is used. Graspable Math is a no-go for me because I can’t find their terms of use or privacy policy. The only information I saw about data collected was one vague sentence when I clicked login.

4. What do other people think about this?

I spent nearly as much time searching Twitter for mentions of Graspable Math as I did playing with the tool itself. Lots of people I know and respect are very excited about it, which gives me lots of reasons to reconsider my initial assessment.

While I find teachers on Twitter are very easily excited about new technology, I don’t know a single one who is any less than completely protective of their investments of time and energy on behalf of their students. Graspable Math may have value I’m missing and I’m looking forward to hearing about it from you folks here and on Twitter.

BTW. Come work with me at Desmos!

If you find these questions interesting and you’d like to chase down their answers with me and my amazing colleagues at Desmos, please consider applying for our teaching faculty, software engineering, or business development jobs.

2018 Jun 11. Cathy Yenca pulls out this helpful citation from Nix the Tricks (p. 54).

An image showing the page from Nix the Tricks.

2018 Jun 11. The Graspable Math co-founders have responded to some of the questions I and other educators have raised here. Useful discussion!

Must Read: Larry Berger’s Confession & Question About Personalized Learning

Larry Berger, CEO of Amplify, offers a fantastic distillation of the promises of digital personalized learning and how they are undone by the reality of learning:

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

If you’re anywhere adjacent to digital personalized learning — working at an edtech company, teaching in a personalized learning school, in a romantic relationship with anyone in those two categories — you should read this piece.

Berger closes with an excellent question to guide the next generation of personalized learning:

What did your best teachers and coaches do for you–without the benefit of maps, algorithms, or data–to personalize your learning?

My best teachers knew what I knew. They understood what I understood about whatever I was learning in a way that algorithms in 2018 cannot touch. And they used their knowledge not to suggest the next “learning object” in a sequence but to challenge me in whatever I was learning then.

“Okay you think you know this pretty well. Let me ask you this.”

What’s your answer to Berger’s question?

BTW. It’s always the right time to quote Begle’s Second Law:

Mathematics education is much more complicated than you expected even though you expected it to be more complicated than you expected.

Featured Comment

SueH:

I have come to believe that all learning is personalized not because of what the teacher does but because of what’s happening inside the learner’s brain. Whatever pedagogical choices a teacher makes, it’s the student’s work that causes new neural networks to be created and pre-existing ones to be augmented or strengthened or broken or pruned.

Scott Farrand:

I’ll accept the risk of stating the obvious: my best teachers cared about me, and I felt that. Teaching is an act of love. A teacher who cares about each student is much more likely to, in that instant after a student responds to a question, find the positive value in the response and communicate encouragement to the student, verbally and nonverbally. And students who feel cared for are more likely to have good things going on in their brains, as described by SueH.

1,000 Math Teachers Tell Me What They Think About Calculators in the Classroom

Yesterday, I asked teachers on Twitter about their classroom calculator policy and 978 people responded.

I wanted to know if they allow calculators a) during classwork, b) during tests, and also which kinds of calculators:

  • Hardware calculators (like those sold by Texas Instruments, Casio, HP, etc.).
  • Mobile phone calculators (like those you can download on your Android or iOS phone).

(Full disclosure: I work for a company that distributes a free four function, scientific, and graphing calculator for mobile phones and other devices.)

I asked the question because hardware calculators don’t make a lot of financial sense to me.

Here are some statistics for high-end HP and Texas Instruments graphing calculators along with a low-end Android mobile phone. (Email readers may need to click through to see the statistics.)

[table id=6 /]

You pay less than 2x more for the mobile phone and you get hardware that is between 30x and 300x more powerful than the hardware calculators. And the mobile phone sends text messages, takes photos, and accesses webpages. In many cases, the student already has a mobile phone. So why spend the money on a second device that is much less powerful?

1,000 teachers gave me their answer.


The vast majority of respondents allow hardware calculator use in their classes. I suspect I’m oversampling for calculator-friendly teachers here, by virtue of drawing that sample from a digital medium like Twitter.

734 of those teachers allow a hardware graphing calculator but not a mobile phone on tests. 366 of those teachers offered reasons for that decision. They had my attention.

Here are their reasons, along with representative quotes, ranked from most common to least.

Test security. (173 votes.)

It’s too easy for students to share answers via text or picture.

Internet access capabilities and cellular capabilities that make it way too easy for the device to turn from an analysis/insight tool to the CheatEnable 3000 model.

School policy. (68 votes.)

School policy is that phones are in lockers.

It’s against school policy. They can use them at home and I don’t have a problem with it, but I’m not allowed to let them use mobile devices in class.

Distraction. (67 votes.)

Students waste time changing music while working problems, causing both mistakes due to lack of attention and inefficiency due to electronic distractions.

We believe the distraction factor is a negative impact on learning. (See Simon Sinek’s view of cell phones as an “addiction to distraction.”)

Test preparation. (54 votes.)

I am also preparing my students for an IB exam at the end of their senior year and there is a specific list of approved calculators. (Phones and computers are banned.)

Basically I am trying to get students comfortable with assessments using the hardware so they won’t freak out on our state test.

Access. (27 votes.)

Our bandwidth is sometimes not enough for my entire class (and others’ classes) to be online all at once.

I haven’t determined a good way so that all students have equal access.

Conclusion

These reasons all seem very rational to me. Still, it’s striking to me that “test security” dwarfs all others.

That’s where it becomes clear to me that the killer feature of hardware calculators is their lack of features. I wrote above that your mobile device “sends text messages, takes photos, and accesses webpages.” At home, those are features. At school, or at least on tests, they are liabilities. That’s a fact I need to think more about.

Featured Comments

Jennifer Potier:

I work in a BYOD school. What I have learned is that the best way to disengage students from electronic devices is to promote learning that involves student sharing of discussion, planning, thinking, and solving problems. When the students are put “centre stage,” the devices start becoming less interesting.

Chris Heddles:

The restriction on calculation aids and internet connections still stems from a serious cultural issue we have in mathematics teaching — the type of questions that we ask. While we continue to emphasise the importance of numerical calculations and algebraic manipulation in assessment, electronic aids to these skills will continue to be an issue.

Instead, we should shift the focus to understanding the situation presented, setting up the equations and then making sense of the calculation results. With this shift, the calculations themselves are relatively unimportant so it doesn’t really matter how the student processes them. Digital aids can be freely used because they are of little use when addressing the key aspects of the assessment tasks.

In many ways our current mathematics assessment approach is equivalent to a senior secondary English essay that gave 80% of the grade for neat handwriting and correct spelling. If this were the case then they too, would have to ban all electronic aids to minimise the risk of “cheating” by typing and using spell checking software.

If we change what we value in assessment then we can open up better/cheaper electronic aids for students.

2017 Mar 24. Related to Chris’s comment above, I recently took some sample SAT math tests and was struck by how infrequently I needed a calculator. Not because I’m any kind of mental math genius. Simply because the questions largely concerned analysis and formulation over calculation and solution.

Featured Tweets

Problems with Personalized Learning

A reader pointed me to this interesting article in the current Educational Leadership on “personalized learning.” She said it raised an alarm for her that she couldn’t quite put into words and she asked if I heard that same alarm and, if so, what words I’d use to describe it.

I hear a few alarms, some louder and faster than others. Let me point them out in the piece.

Here we describe a student’s typical day at a personalized learning school. The setting is the Waukesha STEM Academy-Saratoga Campus in Waukesha, Wisconsin.

You could be forgiven for not knowing, based on that selection, that one of the authors is the principal of the Waukesha STEM Academy and that the Waukesha STEM Academy is a client of the other two authors [see this exchange. -dm]. What should be disclosed in the article’s first paragraph can only be inferred from the authors’ biographies in its footer. This minimal disclosure is consistent with what I perceive to be irresponsible self-promotion on the part of the personalized learning industry. (See also: “… this robot tutor can essentially read your mind.”)

(Full disclosure: I work in education technology.)

Then, in describing a student’s school experience before personalized learning, the authors write:

… [Cal’s] planner, which looked similar to those of the other 27 students in his class, told him everything he needed to know: Math: Page 122; solve problems 2–18 (evens). [..] Each week looked about the same.

If this is truly the case, if students didn’t interact with each other or their teacher at all, if they simply opened their books and completed a textbook assignment every day, every week, we really can’t do much worse. Most alternatives will look great. This isn’t a sober analysis of available alternatives. Again, this is marketing.

[Cal] began to understand why he sometimes misses some of the things that he hears in class and finds more comfort in module-based courses, where he can fast forward and rewind videos and read instructions at his own pace.

Fast-forwarding, rewinding, and pausing instructional videos are often cited as advantages of personalized learning, not because this is necessarily good instruction, but because it’s what the technology permits.

And this isn’t good instruction. It isn’t even good direct instruction. When someone is explaining something to you and you don’t understand them, you don’t ask that person to “repeat exactly what you just said only slower.” You might tell them what you understand of what they were saying. Then they might back up and take a different approach, using different examples, metaphors, or illustrations, ideally responding using your partial understanding as a resource.

I’m describing a very low bar for effective instruction. I’m describing techniques you likely employ in day-to-day conversation with friends and family without even thinking about them. I’m also describing a bar that 2017 personalized learning technology cannot clear.

His students don’t report to class to be presented with information. Instead, they’re empowered to use a variety of learning tools. Some students, like Cal, prefer step-by-step videos; others prefer songs and catchy rhymes to help them learn concepts. [..] He opens a series of videos and online tutorials, as well as tutorials prepared by his teacher.

In the first sentence, we’re told that students like Cal aren’t presented with information. Then, in the following sentences, we’re told all the different ways that those students are presented with information.

Whether you learn concepts from a step-by-step video, a rap, or a written tutorial, you are being presented with information. And a student’s first experience with new information shouldn’t be someone on a screen presenting it, no matter the style of presentation.

Because there is work students can do before that presentation to prepare themselves to learn and enjoy learning from it.

Because the video presenter treats students as though they have the same existing knowledge and prior conceptions about that information, even though those conceptions vary widely, even though some of them are surprisingly durable and require direct confrontation.

Because these video presentations communicate to students the message that math is something you can’t make sense of unless some adult explains it to you, that learning is something you do by yourself, and that your peers have nothing to offer your understanding of that new information.

I like a lot of the ethos around personalized learning — increasing student agency and metacognition, for example — but the loudest, fastest alarm in the article is this:

The medium is the message. Personalized learning is only as good as its technology, and in 2017 that technology isn’t good enough. Its gravity pulls towards videos of adults talking about math, followed by multiple choice exercises for practice, all of which is leavened by occasional projects. It doesn’t matter that students can choose the pace or presentation of that learning. Taking your pick of impoverished options still leaves you with an impoverished option.

2017 Mar 22. There are too many interesting comments to feature them individually. I’ll single out two of them directly, however:

  • Todd Gray, the Superintendent of the School District of Waukesha.
  • Anthony Rebora, the Editor-in-Chief of Educational Leadership.