Big Online Courses Have a Problem. Here’s How We Tried to Fix It.

The Problem

Here is some personal prejudice: I don’t love online courses.

I love learning in community, even in online communities, but online courses rarely feel like community.

To be clear, by online courses I mean the kind that have been around almost since the start of the internet, the kind that were amplified into the “Future of Education™” in the form of MOOCs, and which continue today in a structure that would be easily recognized by someone defrosted after three decades in cold storage.

These courses are divided into modules. Each module has a resource like a video or a conversation prompt. Students are then told to respond to the resource or prompt in threaded comments. You’re often told to make sure you respond to a couple of other people’s responses. This is community in online courses.

The reality is that your comment falls quickly down a long list as other people comment, a problem that grows in proportion to the number of students in the course. The more people who enroll, the less attention your ideas receive, and consequently the less interested you are in contributing them: a vicious cycle that offers some insight into the question, “Why doesn’t anybody finish these online courses?”

I don’t love online courses, but maybe that’s just me. Two years ago, the ShadowCon organizers and I created four online courses to extend the community and ideas around four 10-minute talks from the NCTM annual conference. We hosted the courses using some of the most popular online course software.

The talks were really good. The assignments were really good. There’s always room for improvement, but the facilitators would have had to quit their day jobs to increase the quality by even 10%.

And still retention was terrible. Only 3% of the participants who finished the first week’s assignment also finished the fourth week’s.

[Chart: Low retention from Week 1 to Week 4 in the course.]

The organizers and I had two hypotheses:

  • The size of the course enrollment inhibited community formation and consequently retention.
  • Teachers had to remember another login and website in order to participate in the course, creating friction that decreased retention.

Our Solution

For the following year’s online conference extensions, we wanted smaller groups and we wanted to go to the people, to whatever software they were already using, rather than make the people come to us.

So we used technology that’s even older than online course software, technology that is woven tightly into every teacher’s daily routine: email.

Teachers signed up for the courses. They signed up in affinity groups – coaches, K-5 teachers, or 6-12 teachers.

The assignments and resources they would have received in a forum posting, they received in an email CC’d to two or three other participants, as well as the instructor. They had their conversation in that small group rather than in a massive forum.
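To make the logistics concrete, here is a minimal sketch of how that batching might work. This is hypothetical code, not the organizers’ actual tooling; the function name make_email_groups, the group size of four, and the example addresses are all assumptions for illustration.

```python
# Hypothetical sketch: batch signups into small CC groups within each
# affinity group (coaches, K-5 teachers, 6-12 teachers). Each group becomes
# one email thread, with the instructor added to every thread.
import random
from collections import defaultdict

def make_email_groups(signups, group_size=4, seed=0):
    """signups: list of (email, affinity) tuples, e.g. ("pat@example.com", "K-5").
    Returns a list of dicts, one per small group / email thread."""
    rng = random.Random(seed)
    by_affinity = defaultdict(list)
    for email, affinity in signups:
        by_affinity[affinity].append(email)

    groups = []
    for affinity, members in by_affinity.items():
        rng.shuffle(members)
        for i in range(0, len(members), group_size):
            groups.append({"affinity": affinity, "cc": members[i:i + group_size]})
    return groups

# Example: three fake signups become one K-5 thread and one 6-12 thread.
signups = [("a@example.com", "K-5"), ("b@example.com", "K-5"), ("c@example.com", "6-12")]
print(make_email_groups(signups))
```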

Of course this meant that participants wouldn’t see all their classmates’ responses in the massive forum, including potentially helpful insights.

So the role of the instructors in this work wasn’t to respond to every email but rather to keep an eye out for interesting questions and helpful insights from participants. Then they’d preface the next email assignment with a digest of interesting responses from course participants.

The Results

To be clear, the two trials featured different content, different instructors, different participants, and different grouping strategies. They took place in different years and different calendar months in those years. Both courses were free and about math, but there are plenty of variables that confound a direct comparison of the media.

So consider it merely interesting that average course retention was nearly five times higher when the medium was email rather than online course software.

[Chart: Retention was nearly five times greater in the email course than in the LMS course.]

It’s also just interesting, and still not dispositive, that the responses in email were roughly twice as long as the responses in the online course software.
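For a rough sense of scale only (the post reports ratios, not the email course’s raw counts), a fivefold improvement on the earlier 3% figure works out to roughly:

```latex
0.03 \times 5 \approx 0.15
\quad \text{i.e., about 15 of every 100 week-one finishers completing week four.}
```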

[Chart: Double the word count.]

People wrote more and stuck around longer for email than for the online course software. That says nothing about the quality of their responses, just the quantity. It says nothing about the degree to which participants in either medium were building on each other’s ideas rather than simply speaking their own truth into the void.

But it does make me wonder, again, if large online courses are the right medium for creating an accessible community around important ideas in our field, or in any field.

What do you notice about this data? What does it make you wonder?

Featured Comments

Leigh Notaro:

By the way, the Global Math Department has a similar issue with sign-ups versus attendance. Our attendance rate is typically 5%-10% of those who sign up. Of course, we do have the videos and the transcript of the chat. So, we have made it easy for people to participate in their own time. Participating in PD by watching a video, though, is never the same thing as collaborating during a live event – virtually or face-to-face. It’s like learning in a flipped classroom. Sure, you can learn something, but you miss out on the richness of the learning that really can only happen in a face-to-face classroom of collaboration.

William Carey:

At our school now, when we try out new parent-teacher communication methods, we center them in e-mail, not our student information system. It’s more personal and more deeply woven into the teachers’ lives. It affords the opportunity for response and conversation in a way that a form-sent e-mail doesn’t.

Cathy Yenca:

At the risk of sounding cliché or boastful about reaching “that one student”, how does one represent a “data point” like this one within that tiny 3%? For me, it became 100% of the reason and reward for all of the work involved. I know, I know, I’m a sappy teacher :-)

Justin Reich is extremely thoughtful about MOOCs and online education and offered an excellent summary of some recent work.

What Does Fluency Without Understanding Look Like?

In the wake of Barbara Oakley’s op-ed in the New York Times arguing that we overemphasize conceptual understanding in math class, it’s become clear to me that our national conversation about math instruction is missing at least one crucial element: nobody knows what anybody means by “conceptual understanding.”

For example, in a blog comment here, Oakley compares conceptual understanding to knowing the definition of a word in a foreign language. Also, Oakley frequently cites a study by Paul Morgan that attempts to discredit conceptual understanding by linking it to “movement and music” (p. 186) in math class.

These are people publishing their thoughts about math education in national publications and tier-one research journals. Yet you’d struggle to find a single math education researcher who’d agree with either of their characterizations of one of the most important strands of mathematical proficiency.

Here are two useful steps forward.

First, Adding It Up is old enough to vote. It was published by the National Research Council. It’s free. You have no excuse not to read its brief chapter on conceptual understanding. Then critique that definition.

Conceptual understanding refers to an integrated and functional grasp of mathematical ideas. Students with conceptual understanding know more than isolated facts and methods. They understand why a mathematical idea is important and the kinds of contexts in which it is useful. They have organized their knowledge into a coherent whole, which enables them to learn new ideas by connecting those ideas to what they already know. Conceptual understanding also supports retention. Because facts and methods learned with understanding are connected, they are easier to remember and use, and they can be reconstructed when forgotten. (pp. 118-119)

If you’re going to engage with the ideas of a complex field, engage with its best. That’s good practice for all of us and it’s especially good practice for people who are commenting from outside the field like Oakley (trained in engineering) and Morgan (trained in education policy).

Second, math education professionals need to continually articulate a precise and practical definition of “conceptual understanding.” In conversations with people in my field, I find the term tossed around so casually that everyone in the conversation assumes a convergent understanding, when I get the sense we’re all picturing it rather differently.

To that end, I think it would be especially helpful to compile examples of fluency without understanding. Here are three and I’d love to add more from your contributions on Twitter and in the comments.

A student who has procedural fluency but lacks conceptual understanding …

  • Can accurately subtract 2018-1999 using a standard algorithm, but doesn’t recognize that counting up would be more efficient.
  • Can accurately compute the area of a triangle, but doesn’t recognize how its formula was derived or how it can be extended to other shapes (e.g., trapezoids, parallelograms).
  • Can accurately calculate the discriminant of y = x² + 2 to determine that it doesn’t have any real roots, but couldn’t draw a quick sketch of the parabola to figure that out more efficiently.
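For readers who want the mathematics behind those three bullets spelled out, here is a brief sketch of the standard reasoning (my summary, not a quotation from the post or from Adding It Up):

```latex
% Counting up instead of running the algorithm:
2018 - 1999 = (2000 - 1999) + (2018 - 2000) = 1 + 18 = 19

% Where the triangle formula comes from: two copies of a triangle form a
% parallelogram with base b and height h, so
A_{\text{triangle}} = \tfrac{1}{2}bh, \qquad
A_{\text{trapezoid}} = \tfrac{1}{2}(b_1 + b_2)h \text{ by the same doubling argument.}

% The discriminant of x^2 + 2 = 0, with a = 1, b = 0, c = 2:
b^2 - 4ac = 0^2 - 4(1)(2) = -8 < 0,
% while a quick sketch shows a parabola with vertex (0, 2) opening upward,
% never touching the x-axis.
```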

This is what worries the people in one part of this discussion. Not that students wouldn’t experience delirious fun in every minute of math class but that they’d become mathematical zombies, plodding functionally through procedures with no sense of what’s even one degree outside their immediate field of vision.

Please offer other examples in the comments from your area of content expertise and I’ll add them to the post.

BTW. I’m also enormously worried by people who assume that students can’t or shouldn’t engage creatively in the concepts without first developing procedural fluency. Ask students how they’d calculate a difference like 2018-1999 before helping them with an algorithm. Ask students to slice up a parallelogram and rearrange it into a more familiar shape before offering them guidance. Ask students to sketch a parabola with zero, one, or two roots before helping them with the discriminant. This is a view I thought Emma Gargroetzi effectively critiqued in her recent post.

BTW. I’m happy to read a similar post on “conceptual understanding without procedural fluency” on your blog. I’m not writing it because a) I find myself and others much less confused about the definition of procedural fluency than conceptual understanding (oh hi, Adding It Up!) and b) I find it easier to help students develop procedural fluency than conceptual understanding by, like, several orders of magnitude.

2018 Sep 05: The Khan Academy Long-Term Research team saw lots of students who could calculate the area of a kite but wrote variations on “idk” when asked to defend their answer.

2018 Sep 09: Here’s an interesting post on practice from Mark Chubb.

Featured Tweets

Featured Comments

Karen Campe:

Can find zeros of factored quadratic that equals zero, but uses same approach when doesn’t equal zero. E.g. can solve (x-3)(x-2) = 0 but also answers 3 and 2 for (x-3)(x-2) = 6.
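To spell out why the zero-product shortcut fails on the second equation (my working, not part of Karen’s comment):

```latex
(x-3)(x-2) = 0 \;\Rightarrow\; x = 3 \text{ or } x = 2
\quad \text{(a product is zero only when a factor is zero), but}

(x-3)(x-2) = 6 \;\Rightarrow\; x^2 - 5x + 6 = 6
\;\Rightarrow\; x(x-5) = 0 \;\Rightarrow\; x = 0 \text{ or } x = 5.
```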

Ben Orlin:

The big, weird thing about math education is that most pupils have no experience of what mastery looks like. They’ve heard language spoken; they’ve watched basketball; they’ve eaten meals; but they probably haven’t seen creative mathematical problem-solving. This makes it extra important that they have *some* experience of this, as early as possible. Otherwise math education feels like running passing drills when you’ve never seen a game of basketball.

Mike:

Today a student correctly solved -5 = 7 - 4x but then argued that -4x + 7 = -5 was a different equation that had to have a different answer.
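For the record (my working, not Mike’s), the two arrangements are the same equation with the same solution:

```latex
-5 = 7 - 4x \;\Rightarrow\; -12 = -4x \;\Rightarrow\; x = 3,
\qquad
-4x + 7 = -5 \;\Rightarrow\; -4x = -12 \;\Rightarrow\; x = 3.
```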

Michael Pershan:

This has definitely not been my experience, and I don’t think this is consistent with the idea that conceptual and procedural fluency co-develop — an idea rooted in research.

William Carey:

I really like that way of talking about it. The way I think of it is a bit like exploration of an unknown continent. On the one hand, you have to spend time venturing boldly out into the unknown jungle, full of danger and mistakes and discovery. But if you venture too far, you can’t get food, water, and supplies up to the party. Tigers eat you in the night. So you spend time consolidating, building fortified places, roads, wells, &c. Eventually, the territory feels safe, and that prepares you to head into the unknown again.

Jane Taylor:

A student who can calculate slope but has no idea what it means as the rate of change in a real context.

Kim Morrow-Leong:

An example of procedural fluency without conceptual understanding is adding up a series of integers one by one instead of finding additive inverses (no need to even call it an additive inverse – calling it “canceling” would even be ok.) Example: -4 + 5 + -9 + -5 + 4 + 9
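Worked out (my arithmetic, not part of the comment), the pairing collapses the sum immediately:

```latex
-4 + 5 + (-9) + (-5) + 4 + 9 = (-4 + 4) + (5 + (-5)) + (-9 + 9) = 0 + 0 + 0 = 0
```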

Drill-Based Math Instruction Diminishes the Math Teacher as Well

Emma Gargroetzi posts an astounding rebuttal to Barbara Oakley’s New York Times op-ed encouraging drill-based math instruction. Gargroetzi highlights two valid points from Oakley and then takes a blowtorch to the rest of them.

I haven’t been able to stop thinking about her last sentence since I read it yesterday.

Anyone who teaches children that they need to silently comply through painful experiences before they will be allowed to let their brilliance shine has no intention of ever allowing that brilliance to shine, and will not be able to see it when it does.

I’m perhaps more hesitant than Gargroetzi to judge intent. Lots of teachers were, themselves, victimized by drill-based instruction as students and may lack an imagination for anything different. But I’m absolutely convinced that a) we act ourselves into belief rather than believing our way into acting, and b) actions and beliefs will accumulate over a career like rust and either inhibit or enhance our potential as teachers.

A math program that endorses drills and pain as the foundational element of math instruction (rather than a supporting element) and as a prerequisite for creative mathematical thought (rather than a co-requisite) inhibits the student and the teacher both, diminishing the student’s interest in producing that creativity and the teacher’s ability to notice it.

Teachers need to disrupt the harmful messages their students have internalized about mathematics. But we also need to disrupt the harmful messages that teachers have internalized as well.

What experiences can disrupt the harmful messages teachers have internalized about math instruction? Name some in the comments. I’ll add my own suggestions later tomorrow.

2018 Aug 25. I added my own suggestion here.

Featured Comments

Faye calls out the process of learning content and pedagogy simultaneously:

Many mathematics teachers do not have the mathematics content knowledge that they need themselves. The Greater Birmingham Mathematics Partnership has found that teaching teachers mathematics using inquiry-based instruction results in increased content knowledge for the teachers and a change in their beliefs about how and what all children can learn, i.e., acting themselves into changed beliefs.

Chris:

Math Teachers’ Circles (www.mathteacherscircle.org/). They provide the space for math teachers to be mathematicians (in the same way a lot of the arts teachers I know are still practicing artists).

Another Chris echoes:

It wasn’t until I was asked to think about mathematical tasks and ideas for my own understanding that I could ask the same of my students. And then, it was unavoidable…there was no going back.

William Thill elaborates:

But when I can tap into the emotional and intellectual highs that emerge from playing with cherished colleagues, I am more likely to “set the buffet” for my students with more open-ended exploration times.

Martha Mulligan:

… watching yourself teach on video is a great experience to disrupt harmful messages about math instruction, like talking too much as the teacher. I know that many math teachers feel the need to provide the most perfect, refined, rehearsed explanation so that students can see what they are supposed to see in the way they are supposed to see it. I certainly felt (at times still feel?) that way. That practice diminishes the students’ role in making sense of things on their own. But watching a video of myself teaching was one of the most humbling things I’ve done and it changed my practice so much. I also watched the videos with other trusted teachers from whom I learned so much. Having time to stop a video, talk about it, and reflect on it is very powerful. Even seemingly simple things like wait time and teacher movement/positioning can look very different than what we imagine we look like.

Alexandra Martinez calls out the limitation of reading narratives and watching videos of innovative teaching:

I think the most powerful way to disrupt teachers’ own experiences and expectations is new creative experiences with their own students. The evidence and reflection can support teachers in seeing what is possible. If we ask teachers to imagine what is possible through narrative, they won’t always believe it. But when they see their own students speaking and thinking as mathematicians, that evidence disrupts their established belief systems. So I’d say observations, modeling, co-teaching, pushing in, and PLC planning with lesson study can all potentially do this.

Be sure, also, to check out the dissent from Chris Heddles, a/k/a Third Chris:

I’m going to go against the grain and admit that I use drill as a prerequisite (or at least an opening activity) with many of my students.

Learning the Wrong Lessons from Video Games

[This is my contribution to The Virtual Conference on Mathematical Flavors, hosted by Sam Shah.]

In the early 20th century, Karl Groos claimed in The Play of Man that “the joy in being a cause” is fundamental to all forms of play. One hundred years later, Phil Daro would connect Groos’s theory of play to video gaming:

Every time the player acts, the game responds [and] tells the player your action causes the game action: you are the cause.

Most attempts to “gamify” math class learn the wrong lessons from video games. They import leaderboards, badges, customized avatars, timed competitions, points, and many other stylistic elements from video games. But gamified math software has struggled to import this more substantive element:

Every time the player acts, the game responds.

When the math student acts, how does math class respond? And how is that response different in video games?

Watch how a video game responds to your decision to jump off a ledge.

Now watch how math practice software responds to your misinterpretation of “the quotient of 9 and c.”

The video game interprets your action in the world of the game. The math software evaluates your action for correctness. One results in the joy in being the cause, a fundamental feature of play according to Groos. The other results in something much less joyful.

To see the difference, imagine if the game evaluated your decision instead of interpreting it.
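Here is a minimal sketch of that contrast in code. It is illustrative only, not Desmos or any real practice software; the prompt “the quotient of 9 and c,” the answer-key string, and the function names are assumptions for the sake of the example.

```python
# Illustrative sketch: two ways software might respond when a student types
# "c / 9" for the prompt "the quotient of 9 and c" (intended answer: "9 / c").

def evaluate_response(expression: str) -> str:
    """Evaluative feedback: compare against an answer key and issue a verdict."""
    return "Correct!" if expression.replace(" ", "") == "9/c" else "Incorrect. Try again."

def interpret_response(expression: str, c: float) -> str:
    """Interpretive feedback: show what the student's expression actually does."""
    value = eval(expression, {"c": c})  # fine for a demo; never eval untrusted input
    return f"When c = {c}, your expression gives {value}."

print(evaluate_response("c / 9"))      # -> Incorrect. Try again.
print(interpret_response("c / 9", 3))  # -> When c = 3, your expression gives 0.3333333333333333.
```

The first response ends the interaction with a verdict; the second gives the student something to notice and reason about, which is closer to how the game interprets a jump.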

I doubt anyone would argue with the goals of making math class more joyful and playful, but those goals are more easily adapted to a poster or conference slidedeck than to the actual experience of math students and teachers.

So what does a math class look like that responds whenever a student acts mathematically, that interprets rather than evaluates mathematical thought, that offers students joy in being the cause of something more than just evaluative feedback?

“Have students play mathematical or reasoning games,” is certainly a fair response, but bonus points if you have recommendations that apply to core academic content. I will offer a few examples and guidelines of my own in the comments later tomorrow.

Featured Comments

James Cleveland:

I feel like a lot of the best Desmos activities do that, because they can interpret (some of) what the learner inputs. When you do the pool border problem, it doesn’t tell you that your number of bricks is wrong – it just makes the bricks, and you can see if that is too many, too few, or just right.

In general, a reaction like “Well, let’s see what happens if that were true” seems like a good place to start.

Kevin Hall:

My favorite example of this is when Cannon Man’s body suddenly multiplies into two or three bodies if a student draws a graph that fails the vertical line test.

Sarah Caban:

I am so intrigued by the word interpret. “Interpret” is about translating, right? Sometimes when we try to interpret, we (unintentionally) make assumptions based on our own experiences. Recently, I have been pushing myself to linger in observing students as they work, postponing interpretations. I have even picked up a pencil and “tried on” their strategies, particularly ones that are seemingly not getting to a correct solution. I have consistently been joyfully surprised by the math my students were playing with. I’m wondering how this idea of “trying on” student thinking fits with technology. When/how does technology help us try on more student thinking?

Dan Finkel:

I think that many physical games give clear [evaluative] feedback as well, insofar as you test out a strategy, and see if you win or not. Adults can ruin these for children by saying, “are you sure that’s the right move?” rather than simply beating them so they can see what happens when they make that move. The trick there is that some games you improve at simply by losing (I’d put chess in this column, even though more focused study is essential to get really good), where others require more insight to see what you actually need to change.

Orchestrate More Productive Mathematics Discussions with Desmos Snapshots

Let me describe a powerful teaching tool we just released and the company values that compelled us to build it.

First, let’s acknowledge that statements of values are often useless. Values are only useful if they help people make hard decisions. Our company values should (a) help educators decide how we’re different from other math edtech companies, and (b) help us decide how to spend our limited time in the world. So here is one of our values:

We believe that math class should be social and creative – that students should create mathematics in every form and then share those creations with each other and their teachers.

Many other companies disagree with those values, or at least they spend their limited time in the world acting on different ones. For example, many other companies think it’s sufficient for students to create multiple-choice and numerical responses to express their mathematical thinking and to share those responses with a grading algorithm alone.

Our values conflict, and the result is that other companies spend their time optimizing adaptive grading algorithms while we spend our time thinking about ways to provoke mathematical creativity that algorithms can’t grade at all. We may both work in “math edtech” but we are on very different paths, and our path recently led us to a very thorny question:

What should teachers do with all these expressions of mathematical creativity that algorithms can’t grade?

Let’s say we ask students an interesting question about mathematics or we ask them to define a relationship and sketch its graph. That’s good math, but the teacher now has dozens of written answers and sketches that their computers can’t grade.

Other math edtech software offers teachers scarce insight into the ways students think mathematically. We offer teachers abundant insight which is a different kind of problem, and just as serious. We’ve spent months building a solution to this problem of abundance and we likely would have spent years if not for one book:

Mary Kay Stein and Margaret Smith’s 5 Practices for Orchestrating Productive Mathematics Discussions.

Smith and Stein describe five teaching practices that promote student learning through summary discussions. Teachers should (1) anticipate ideas students will produce during a task or activity and then (2) monitor student work during class for those ideas and others that weren’t anticipated. Then the teacher should (3) select a subset of those interesting student ideas, (4) sequence the order of their presentation, and then help students (5) connect them.

In our classroom observations of our activities, we noticed teachers struggling to select student ideas because there were so many of them streaming from the students’ heads into the teacher’s dashboard. Sometimes teachers would make a note about an idea they wanted to select later, but when “later” came around, the student had already developed the idea further. So then we saw teachers take screenshots of that idea and paste them into slide software for sequencing. Smith and Stein’s recommendations are already ambitious and our software was not making it easier for teachers to enact them.

So we built “Snapshots.”

If you see an interesting idea at any time during an activity, press the camera icon next to it.

Then go to the “Snapshots” tab.

Sequence the ideas by dragging them into a collection.

Add a comment or a question to help students connect their classmates’ ideas to the main ideas of the lesson.

Then press “Present.”

We tested the tool ourselves during a summer school session in Berkeley, CA, and also with teachers around the country. What we’ve noticed is that students pay much more attention to discussions when the discussion isn’t about a page from the textbook or a worked example from the teacher but about ideas from the students themselves.

It’s the difference between “Let me tell you about a really useful strategy for multiplying two-digit numbers” and “Let me show you some useful strategies from around the class for multiplying two-digit numbers. They’re all correct. Decide which seems like less work to you.”

Here are some of our other favorite uses from the last month of testing.

Match the diagram to the expression.

Which of these answers are equivalent? How do you know?

Values help us all decide how to spend our limited time in the world, and nobody feels those limits quite like classroom teachers. Teachers frequently, and with good cause, evaluate new ideas and innovations by asking, “Does my class have time for this? What will we have to skip if we do this?”

Your decision to spend your limited class time talking about your ideas, your textbook’s ideas, or your students’ ideas is a loud expression of your values. Students hear it. We hope your students hear how much you value their mathematical creativity, explicitly in your words and implicitly in how you spend your time. You bring those values. We’ll keep working on tools to help you live them out in your classroom every day.