Category: tech contrarianism


Problems with Personalized Learning

A reader pointed me to this interesting article on “personalized learning” in the current Educational Leadership. She said it raised an alarm for her that she couldn’t quite put into words, and she asked if I heard that same alarm and, if so, what words I’d use to describe it.

I hear a few alarms, some louder and faster than others. Let me point them out in the piece.

Here we describe a student’s typical day at a personalized learning school. The setting is the Waukesha STEM Academy-Saratoga Campus in Waukesha, Wisconsin.

You could be forgiven for not knowing, based on that selection, that one of the authors is the principal of the Waukesha STEM Academy and that his two co-authors have financial ties to the personalized learning industry [the Waukesha STEM Academy is a client of the other two authors; see this exchange. -dm]. What should be disclosed in the article’s first paragraph can only be inferred from the authors’ biographies in its footer. This minimal disclosure is consistent with what I perceive to be irresponsible self-promotion on the part of the personalized learning industry. (See also: “… this robot tutor can essentially read your mind.”)

(Full disclosure: I work in education technology.)

Then, in describing a student’s school experience before personalized learning, the authors write:

… [Cal’s] planner, which looked similar to those of the other 27 students in his class, told him everything he needed to know: Math: Page 122; solve problems 2–18 (evens). [..] Each week looked about the same.

If this is truly the case, if students didn’t interact with each other or their teacher at all, if they simply opened their books and completed a textbook assignment every day, every week, we really can’t do much worse. Most alternatives will look great. This isn’t a sober analysis of available alternatives. Again, this is marketing.

[Cal] began to understand why he sometimes misses some of the things that he hears in class and finds more comfort in module-based courses, where he can fast forward and rewind videos and read instructions at his own pace.

Fast-forwarding, rewinding, and pausing instructional videos are often cited as advantages of personalized learning, not because this is necessarily good instruction, but because it’s what the technology permits.

And this isn’t good instruction. It isn’t even good direct instruction. When someone is explaining something to you and you don’t understand them, you don’t ask that person to “repeat exactly what you just said only slower.” You might tell them what you understand of what they were saying. Then they might back up and take a different approach, using different examples, metaphors, or illustrations, ideally responding using your partial understanding as a resource.

I’m describing a very low bar for effective instruction. I’m describing techniques you likely employ in day-to-day conversation with friends and family without even thinking about them. I’m also describing a bar that 2017 personalized learning technology cannot clear.

His students don’t report to class to be presented with information. Instead, they’re empowered to use a variety of learning tools. Some students, like Cal, prefer step-by-step videos; others prefer songs and catchy rhymes to help them learn concepts. [..] He opens a series of videos and online tutorials, as well as tutorials prepared by his teacher.

In the first sentence, we’re told that students like Cal aren’t presented with information. Then, in the following sentences, we’re told all the different ways that those students are presented with information.

Whether you learn concepts from a step-by-step video, a rap, or a written tutorial, you are being presented with information. And a student’s first experience with new information shouldn’t be someone on a screen presenting it, no matter the style of presentation.

Because there is work students can do before that presentation to prepare themselves to learn and enjoy learning from it.

Because the video presenter treats students as though they have the same existing knowledge and prior conceptions about that information, even though those conceptions vary widely, even though some of them are surprisingly durable and require direct confrontation.

Because these video presentations communicate to students the message that math is something you can’t make sense of unless some adult explains it to you, that learning is something you do by yourself, and that your peers have nothing to offer your understanding of that new information.

I like a lot of the ethos around personalized learning – increasing student agency and metacognition, for example – but the loudest, fastest alarm in the article is this:

The medium is the message. Personalized learning is only as good as its technology, and in 2017 that technology isn’t good enough. Its gravity pulls towards videos of adults talking about math, followed by multiple choice exercises for practice, all of which is leavened by occasional projects. It doesn’t matter that students can choose the pace or presentation of that learning. Taking your pick of impoverished options still leaves you with an impoverished option.

2017 Mar 22. There are too many interesting comments to feature them all individually, but I’ll single out two of them directly:

  • Todd Gray, the Superintendent of the School District of Waukesha.
  • Anthony Rebora, the Editor-in-Chief of Educational Leadership.

[Desmos Design] Why We’re Suspicious of Immediate Feedback

One of our design principles at Desmos is to “delay feedback for reflection, especially during concept development activities.” This makes us weird, frankly, in Silicon Valley where no one ever got fired for promising “immediate feedback” in their math edtech.

We get it. Computers have an enormous advantage over humans in their ability to quickly give students feedback on certain kinds of work. But just because computers can deliver immediate feedback doesn’t mean they always should.

For example, Simmons and Cope (1993) found that students were more likely to use procedural strategies like trial-and-error in a condition of immediate feedback than a condition of delayed feedback.

I think I can illustrate that for you with this activity, which has two tasks. You get immediate feedback on one and delayed feedback on the other.

I’ll ask you what I asked 500 Twitter users:

How was your brain working differently in the “Circle” challenge [delayed feedback] than the “Parabola” challenge [immediate feedback]?

Exhibit A:

The circle one was both more challenging and fun. I found myself squinting on the circle to visualize it in my head while with the parabola I mindlessly did trial and error.

Exhibit B:

With the circle, the need to submit before seeing the effect made me really think about how each part of the equation would affect the graph. This resulted in a more strategic first guess rather than a guess-and-check approach.

Exhibit C:

I could guess & check the parabola challenge. In the circle challenge I had to concentrate more about the center of the circle and the radius. Much more in fact.

Exhibit D:

I couldn’t use trial and error. I had to visualize and estimate and then make decisions. My brain was more satisfied after the circle.

Exhibit E:

I probably worked harder on [the circle] because my answer was not shown until I submitted my answer. It was more frustrating than the parabola problem – but I probably learned more.

This wasn’t unanimous, of course, but it was the prevailing sentiment. For most people, the feedback delay provoked thoughtfulness where the immediate feedback provoked trial-and-error.

We realize that the opposite of “immediate feedback” for many students is “feedback when my teacher returns my paper after a week.” Between those two options, we side with Silicon Valley’s preference for immediate feedback. But if computers can deliver feedback immediately, they can also deliver feedback almost immediately, after a short, productive delay. That’s the kind of feedback we design into our concept development activities.
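
If it helps to see the distinction in code, here is a minimal sketch (plain Python with invented names; it isn’t Desmos code) of the two feedback policies: one checks the student’s work on every edit, the other withholds the check until the student commits to a prediction.

```python
# A minimal sketch of two feedback policies, not Desmos code.
# "immediate" evaluates the student's guess on every edit;
# "delayed" withholds evaluation until the student commits.

class GraphChallenge:
    def __init__(self, target, feedback="delayed"):
        self.target = target          # e.g. circle parameters (h, k, r)
        self.feedback = feedback      # "immediate" or "delayed"
        self.attempts = 0

    def check(self, guess):
        """Compare the student's parameters against the target."""
        return guess == self.target

    def edit(self, guess):
        """Called on every keystroke or slider drag."""
        if self.feedback == "immediate":
            # Feedback arrives before the student has to think,
            # so trial-and-error is cheap and tends to dominate.
            return self.check(guess)
        return None  # delayed: no feedback yet, keep reasoning

    def submit(self, guess):
        """Called when the student explicitly commits a prediction."""
        self.attempts += 1
        return self.check(guess)


# In the delayed condition, editing tells you nothing;
# you only learn whether you were right after you commit.
challenge = GraphChallenge(target=(2, -1, 3), feedback="delayed")
print(challenge.edit((0, 0, 3)))     # None: no feedback while editing
print(challenge.submit((2, -1, 3)))  # True: feedback after the commitment
```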

BTW. For a longer version of that activity, check out Building Conic Sections, created by Dylan Kane and edited with love by our Teaching Faculty.

What’s Wrong with This Experiment?

If you’re the sort of person who helps students learn to design controlled experiments, you might offer them W. Stephen Wilson’s experiment in The Atlantic and ask for their critique.

First, Wilson’s hypothesis:

Wilson fears that students who depend on technology [calculators, specifically –dm] will fail to understand the importance of mathematical algorithms.

Next, Wilson’s experiment:

Wilson says he has some evidence for his claims. He gave his Calculus 3 college students a 10-question calculator-free arithmetic test (can you multiply 5.78 by 0.39 without pulling out your smartphone?) and divided them into two groups: those who scored an eight or above on the test and those who didn’t. At the end of the course, Wilson compared the two groups’ performance on the final exam. Most students who scored in the top 25th percentile on the final also received an eight or above on the arithmetic test. Students at the bottom 25th percentile were twice as likely to score less than eight points on the arithmetic test, demonstrating much weaker computation skills when compared to other quartiles.

I trust my readers will supply the answer key in the comments.

BTW. I’m not saying there isn’t evidence that calculator use will inhibit a student’s understanding of mathematical algorithms, or that no such evidence will ever be found. I’m just saying this study isn’t that evidence.

Featured Tweet

Got one!

Featured Comment

Scott Farrand:

The most clarifying thing that I can recall being told about testing in mathematics came from a friend in that business: you’ll find a positive correlation between student performance on almost any two math tests. So don’t get too excited when it happens, and beware of using evidence of correlation on two tests as evidence for much.
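
A toy simulation makes Scott’s point concrete. The sketch below uses invented numbers, not data from Wilson’s class: give simulated students one underlying ability, add independent noise to produce an arithmetic score and a final-exam score, and the two scores correlate anyway, with no causal arrow between them.

```python
# Toy simulation: two noisy tests of one underlying ability will
# correlate even though neither test causes the other. Invented numbers.
import random

random.seed(0)

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ability = [random.gauss(0, 1) for _ in range(1000)]          # latent "math ability"
arithmetic = [a + random.gauss(0, 0.7) for a in ability]     # calculator-free quiz
final_exam = [a + random.gauss(0, 0.7) for a in ability]     # Calculus 3 final

print(round(correlation(arithmetic, final_exam), 2))  # roughly 0.6 to 0.7

# The tests correlate because both measure the same underlying ability,
# not because weak arithmetic skill caused the weak final-exam score.
# A quartile comparison like Wilson's can't tell those stories apart.
```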

Collective Effervescence Is the Cost of Personalized Learning

Cross-posted from the Desmos blog. I’m happy enough with this post to re-broadcast it here. Also, the Desmos blog doesn’t have comments, which makes this a better forum for you to tell me if I’m wrong.


We’re proud to debut our free Classroom Conversation Toolset, which has been the labor of our last three months. You can pause your students’ work. You can anonymize your students’ names. You can restrict the pace of your students through the activity. We believe there are productive and counterproductive ways to use these tools, so let us explain why we built them.

First, the edtech community is extremely excited about personalized learning – students learning at their own pace, uninhibited by their teacher or classmates. Our Activity Builder shares some of that enthusiasm but not all. Until last week, students could click through an activity from the first screen to the last, inhibited by nothing and nobody.

But the cost of personalized learning is often a silent classroom. In the worst-case scenario, you’ll walk into a classroom and see students wearing headphones, plugged into computers, watching videos or clicking multiple choice questions with just enough interest to keep their eyes open. But even when the activities are more interesting and cognitively demanding than video-watching and multiple choice question-clicking, there is still an important cost. You lose collective effervescence.

Collective effervescence is a term that calls to mind the bubbles in fizzy liquid. It’s a term from Émile Durkheim used to describe a particular force that knits social groups together. Collective effervescence explains why you still attend church even though the sermons are online, why you still attend sporting events even though the broadcast is higher quality and the seat in your living room far more comfortable. Collective effervescence explains why we still go to movie theaters; laughing, crying, or screaming in a room full of people is more satisfying than laughing, crying, or screaming alone.

An illustrative anecdote. We were testing these features in classes last week. We watched a teacher – Lieva Whitbeck in San Francisco – elicit a manic cheer from a class of ninth-graders simply by revealing the graph of a line. She brought her class together and asked them to predict what they’d see when she turned on the graph. They buzzed for a moment together, predicted a line, and then she gave the crowd what they came for.

She brought them together. She brought back the kids who were a bit ahead and she brought forward the kids who were a bit behind. She de-personalized the learning so she could socialize it. Because arguments are best with other people. Because the negotiation of ideas is most effective when you’re negotiating with somebody. And because collective effervescence is impossible to experience alone.

So these tools could very easily have been called our Classroom Management Toolset. They are useful for managing a class, for pausing the work so you can issue a new prompt or so you can redirect your class. But we didn’t build them for those purposes. We built them to restore what we feel the personalized-learning moment has missed. We built them for conversation and collective effervescence.

Featured Tweets

Why Secondary Teachers Don’t Want a GitHub for Lesson Plans

Chris Lusto calls for a GitHub for lesson plans:

To say that the community repository model has done wonders for open source software is a massive understatement. To what extent that success translates to curriculum I’m obviously unsure, but I have randomly-ordered reasons to suspect it’s appreciable.

I attended EdFoo earlier this year, an education conference at Google’s campus attended by lots of technologists. Speakers posed problems about education in their sessions and the solutions were often techno-utopian, or techno-optimistic at the very least.

One speaker wondered why teachers spend massive amounts of time creating lesson plans that don’t differ all that much from plans developed by another teacher several states away or several doors down the hall. Why don’t they just build it once, share it, and let the community modify it? Why isn’t there a GitHub for lesson plans?

I’m not here to say that’s a bad idea in theory, just to say that the idea very clearly hasn’t caught on in practice.

Exhibit A: BetterLesson, which pivoted from its original community lesson repository model to a lesson repository stocked by master teachers and now to professional development. (Its lesson repository is currently a blink-and-you’ll-miss-it link in the footer of its homepage.) The idea has failed to catch on with secondary educators to such a degree that it’s worth asking them why they don’t seem to want it.

Our room at EdFoo was notably absent of practicing secondary teachers so I went on Twitter to ask a few thousand of them, “Why don’t you use lesson download sites?” (I asked the same question two years ago as well.) Here are helpful responses from actual, really real current and former secondary teachers:

Nancy Mangum:

Using someone else’s lesson plan is like wearing a friend’s underwear. It may do the job but ultimately doesn’t fit quite right.

Jonathan Claydon:

Their wheels aren’t the right size for my car.

Justin Reich:

Linux works because code compiles. Syllabi don’t compile. If I add a block/lesson, I never know who it helps.

Bob Lochel:

I don’t require a script, just decent ideas now and then.

Grace Chen:

I’m not sure they solve for the problems they think they’re trying to solve. It takes time to read / internalize / modify others’ plans.

David Wees:

It’s challenging to sequence, connect, plan, and enact someone else’s lesson.

Mark Pettyjohn:

The plan itself is the least important element. The planning is what’s critical.

2016 Jun 11. Dwight Eisenhower:

In preparing for battle I have always found that plans are useless, but planning is indispensable.

In sum: “Small differences between lesson plans are enormously important, enormously time-consuming to account for and fix, and whatever I already have is probably good enough.” It turns out that even if two lesson plans don’t differ all that much, they already differ too much.

Any lesson sharing site will have to account for that belief before it can offer teachers even a fraction of GitHub’s value to programmers.

2016 Jun 8. Check out Bob Lochel’s tweet above and Julie Reulbach’s tweet below. Both express a particular sentiment that the nuts and bolts of a lesson plan are less important than the chassis. (I don’t know a thing about cars.)

I was chatting with EdSurge’s Betsy Corcoran about that idea at EdFoo and she likened it to “the head” in jazz music. (I don’t know a thing about jazz music.) The head contains crucial information about a piece of music – the key, the tempo, the chord changes. Jazz musicians memorize the head but they’ll build and develop the performance off of it. The same head may result in several different performances. What I want – along with Bob and Julie and many others – is a jazz musician’s fake book – a repository of creative premises I can easily riff off of.

(Of course, it’s worth noting here that many people believe that teachers should be less like jazz musicians and more like player pianos.)

Featured Tweets

Featured Comments

Chris Lusto:

There seems to be a general distrust of “other people’s lessons.” Which I get. But nothing about this model would change the extent to which you do or do not teach other people’s lessons, or the fidelity with which you do it. Again, the whole thing that got me thinking in this vein was the problem of managing, in some kind of coherent way, all the changes that teachers already make as a matter of course. If you’re starting with an existing curriculum, then you’re using other people’s stuff to some extent. And once you alter that extent, it might be nice to track it, for all sorts of reasons. Maybe classroom teachers don’t find that interesting, but somebody in the chain between publisher and implementer certainly does. Not totally sure who the best target audience might be.

Jo:

As an elementary math coach, I don’t want a repository of lesson plans either, but my teachers long for one. However, when given pre-written lesson plans, they’re not happy with them – for all the reasons listed above.
The hardest thing about elementary math is that most elementary teachers go into teaching because they love reading and they want to share that. They rarely feel that way about math. So, they want a guided lesson that will teach the requisite skills. Unfortunately, it doesn’t work for them any better than it works for secondary teachers.

Even in elementary it’s the process of planning that’s important. My brain needs to go through the work of planning–what leads to what, what is going to confuse the kids, what mistakes are they likely to make, what false paths are they likely to follow. The only way to deeply understand the material and how to present it is to plan it. The only way to truly understand the standard is to wrestle with what it really means.

Planning is the work; teaching is just the performance.

Ethan Weker:

I get a lot out of reading other lesson plans/approaches to teaching/ideas, and steal activities fairly regularly, but my actual lesson plans aren’t copies of others’. It’s more like they’re inspired by what other people do. This is where the artistry of teaching comes in.

Brandon Dorman:

I get it – we don’t want a repository of lessons, but what happens once those lessons get downloaded and re-worked? Right now there isn’t a way to see derivatives of those lessons, which could be very important.

Stephanie:

Brandon, I love that idea. Recipe websites do this – what can be substituted for what, how different teachers with different ingredients and different tools in different places can adapt the same thing. These are good parallels for the teaching world.
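
For what it’s worth, here’s a rough sketch of the derivative tracking Brandon and Stephanie are imagining (hypothetical structure and names, not any existing site’s data model): each adaptation remembers what it was forked from and what was substituted, the way a recipe variation lists its swaps.

```python
# Hypothetical sketch of tracking lesson "derivatives," in the spirit of
# Brandon's and Stephanie's comments. Not any real site's data model.
from dataclasses import dataclass, field

@dataclass
class Lesson:
    title: str
    author: str
    forked_from: "Lesson | None" = None
    changes: list = field(default_factory=list)  # what was substituted

    def fork(self, author, changes):
        """Create a derivative that remembers its parent and its edits."""
        return Lesson(self.title, author, forked_from=self, changes=changes)

    def lineage(self):
        """Walk back toward the original, listing every substitution made."""
        node, history = self, []
        while node.forked_from is not None:
            history.append((node.author, node.changes))
            node = node.forked_from
        return history


original = Lesson("Linear functions: stacking cups", author="A. Teacher")
adapted = original.fork("B. Teacher", ["swapped cups for dominoes", "repaced for block schedule"])
print(adapted.lineage())
# [('B. Teacher', ['swapped cups for dominoes', 'repaced for block schedule'])]
```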

2016 Jun 13. Mike Caulfield offers an illustration of the value of planning relative to plans.