
tl;dr — Khan Academy claims alignment with the Common Core State Standards (CCSS), but an analysis of their eighth-grade year indicates that alignment is loose. 40% of Khan Academy’s exercises assessed the acts of calculating and solving, whereas the Smarter Balanced Assessment Consortium’s assessment of the CCSS emphasized those acts in only 25% of its released items. 74% of Khan Academy’s exercises resulted in the production of either a number or a multiple-choice response, whereas those outputs accounted for only 25% of the SBAC assessment.

Introduction

My dissertation will examine the opportunities students have to learn math online. In order to say something about the current state of the art, I decided to complete Khan Academy’s eighth-grade year and ask myself two specific questions about every exercise:

  • What am I asked to do? What are my verbs? Am I asked to solve, evaluate, calculate, analyze, or something else?
  • What do I produce? What is the end result of my work? Is my work summarized by a number, a multiple-choice response, a graph that I create, or something else?

I examined Khan Academy for several reasons. First, they’re well-capitalized and they employ some of the best computer engineers in the world. They have the human resources to create some novel opportunities for students to learn math online. If they struggle, it is likely that other companies with equal or lesser human resources struggle also. I also examined Khan Academy because their exercise sets are publicly available online, without a login. This will energize our discussion here and make it easier for you to spot-check my analysis.

My data collection took me three days and spanned 88 practice sets. You’re welcome to examine my data and critique my coding. In general, Khan Academy practice sets ask that you complete a certain number of exercises in a row before you’re allowed to move on. (Five, in most cases.) These exercises are randomly selected from a pool of item types. Different item types ask for different student work. Some item types ask for multiple kinds of student work. All of this is to say, you might conduct this exact same analysis and walk away with slightly different findings. I’ll present only the findings that I suspect will generalize.

After completing my analysis of Khan Academy’s exercises, I performed the same analysis on a set of 24 released questions from the Smarter Balanced Assessment Consortium’s test that will be administered this school year in 17 states.

Findings & Discussion

Khan Academy’s Verbs

[Chart: verbs across Khan Academy’s eighth-grade exercises]

The largest casualty is argumentation. Out of the 402 exercises I completed, I could code only three prompts as “argue.” (You can find all of them in “Pythagorean Theorem Proofs.”) This is far out of alignment with the Common Core State Standards, which prioritize constructing and critiquing arguments as one of the eight practice standards that cross all of K-12 mathematics.


Notably, 40% of Khan Academy’s eighth-grade exercises ask students to “calculate” or “solve.” These are important mathematical actions, certainly. But as with “argumentation,” I’ll demonstrate later that this emphasis is out of alignment with current national expectations for student math learning.

The most technologically advanced items were the 20% of Khan Academy’s exercises that asked students to “construct” an object. In these items, students were asked to create lines, tables, scatterplots, polygons, angles, and other mathematical structures using novel digital tools. Subjectively, these items were a welcome reprieve from the frequent calculating and solving, nearly all of which I performed with either my computer’s calculator or Wolfram Alpha. (Also subjective: my favorite exercise asked me to construct a line.) These items also appeared frequently in the Geometry strand, where students were asked to transform polygons.


I was interested to find that the most common student action in Khan Academy’s eighth-grade year is “analyze.” Several examples follow.

[Screenshots: example “analyze” exercises from Khan Academy]

Khan Academy’s Productions

These questions of analysis are welcome, but the end result of analysis can take many forms. If you think about instances in your life when you were asked to analyze something, you might recall reports you’ve written or verbal summaries you’ve delivered. In Khan Academy, 92% of the analysis questions ended in a multiple-choice response. These multiple-choice items took different forms. In some cases, you could make only one choice. In others, you could make multiple choices. Regardless, we should ask ourselves whether such structured responses are the most appropriate assessment of a student’s power of analysis.

Broadening our focus from the “analysis” items to the entire set of exercises reveals that 74% of the work students do in the eighth grade of Khan Academy results in either a number or a multiple-choice response. No other pair of outcomes comes close.

[Chart: productions across Khan Academy’s eighth-grade exercises]

Perhaps the biggest loss here is that I constructed an equation exactly three times throughout my eighth-grade year in Khan Academy. Here is one:

[Screenshot: a Khan Academy exercise asking the student to construct an equation]

This is troubling. In the sixth grade, students studying the Common Core State Standards make the transition from “Number and Operations” to “Expressions and Equations.” By ninth grade, the CCSS will ask those students to use equations in earnest, particularly in the Algebra, Functions, and Modeling domains. Students need practice solving equations, of course, but if they haven’t also spent ample time constructing equations, those advanced domains will be inaccessible.

Smarter Balanced Verbs

The Smarter Balanced released items include comparatively fewer “calculate” and “solve” prompts (they’re the least common verbs, in fact) and comparatively more “construct,” “analyze,” and “argue” prompts.

[Chart: verbs across the Smarter Balanced released items]

This lack of alignment is troubling. If one of Khan Academy’s goals is to prepare students for success in Common Core mathematics, they’re emphasizing the wrong set of skills.

Smarter Balanced Productions

Multiple-choice responses are also common in the Smarter Balanced assessment, but the distribution of item types is broader. Students are asked to produce lots of different mathematical outputs, including number lines, non-linear function graphs, probability spinners, corrections of student work, and other productions students won’t have seen in their work in Khan Academy.

[Chart: productions across the Smarter Balanced released items]

SBAC also allows for the production of free-response text, while Khan Academy doesn’t. When SBAC asks students to “argue,” students in a majority of cases express their answer simply by writing an argument.

[Screenshot: a Smarter Balanced “argue” item with a free-response text field]

This is quite unlike Khan Academy’s three “argue” prompts, which produced either a) a multiple-choice response or b) the rearrangement of statements and reasons in a pre-filled two-column proof.

Limitations & Future Directions & Conclusion

This brief analysis has revealed that Khan Academy students are doing two primary kinds of work (analysis and calculating) and they’re expressing that work in two primary ways (as multiple-choice responses and as numbers). Meanwhile, the SBAC assessment of the CCSS emphasizes a different set of work and asks for more diverse expression of that work.

This is an important finding, if somewhat blunt. A much more comprehensive item analysis would be necessary to determine the nuanced and important differences between two problems that this analysis codes identically. Two separate “solving” problems that result in “a number,” for example, might be of very different value to a student depending on the equations being solved and whether or not a context was involved. This analysis is blind to those differences.

We should wonder why Khan Academy emphasizes this particular work. I have no inside knowledge of Khan Academy’s operations or vision. It’s possible this kind of work is a perfect realization of their vision for math education. Perhaps they are doing exactly what they set out to do.

I find it more likely that Khan Academy’s exercise set draws an accurate map of the strengths and weaknesses of education technology in 2014. Khan Academy asks students to solve and calculate so frequently, not because those are the mathematical actions mathematicians and math teachers value most, but because those problems are easy to assign with a computer in 2014. Khan Academy asks students to submit their work as a number or a multiple-choice response, not because those are the mathematical outputs mathematicians and math teachers value most, but because numbers and multiple-choice responses are easy for computers to grade in 2014.

This makes the limitations of Khan Academy’s exercises understandable but not excusable. Khan Academy is falling short of the goal of preparing students for success on assessments of the CCSS, but that’s setting the bar low. There are arguably other, more important goals than success on a standardized test. We’d like students to enjoy math class, to become flexible thinkers and capable future workers, to develop healthy conceptions of themselves as learners, and to look ahead to their next year of math class with something other than dread. Will instruction composed principally of selecting multiple-choice responses and filling numbers into blanks achieve those goals? If your answer is no, as mine is, and if that narrative sounds exceedingly grim to you also, it is up to you and me to pose a compelling counter-narrative for online math education, and then re-pose it over and over again.

Blanton & Kaput modify the Christmas carol The Twelve Days of Christmas for an algebraic reasoning task befitting the season:

How many gifts did your true love receive on each day? If the song was titled “The Twenty-Five Days of Christmas,” how many gifts would your true love receive on the twenty-fifth day? How many total gifts did she or he receive on the first two days? The first three days? The first four days? How many gifts did she or he receive on all twelve days?

“The X Days of Christmas.” I like it.
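
For what it’s worth, here is a minimal sketch of the arithmetic behind the task (my own summary, not Blanton & Kaput’s), assuming the usual reading of the song in which day n brings 1 + 2 + … + n new gifts:

    # Gifts received on day n: 1 + 2 + ... + n, the nth triangular number.
    def gifts_on_day(n):
        return n * (n + 1) // 2

    # Running total of gifts across the first d days.
    def gifts_through_day(d):
        return sum(gifts_on_day(n) for n in range(1, d + 1))

    print(gifts_on_day(12))       # 78 gifts on the twelfth day
    print(gifts_on_day(25))       # 325 gifts on the twenty-fifth day
    print(gifts_through_day(2))   # 4 gifts over the first two days
    print(gifts_through_day(12))  # 364 gifts across all twelve days

If you want the closed forms, they work out to n(n + 1)/2 gifts on day n and d(d + 1)(d + 2)/6 gifts across the first d days.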

November Remainders

Hi again. It was a busy November. I spoke at the three NCTM regional conferences, keynoting two of them. That, plus the Thanksgiving holiday, some family fun, some preschool volunteer work, and some forward progress on my dissertation, has left blogging somewhere around eleventh place on the to-do list.

All of that makes your blogging more useful to me than ever. Please keep posting your interesting classroom anecdotes.

Here are all the blogs I subscribed to during November 2014:

Great Classroom Action


Coral Connor’s students created 3D chalk charts to demonstrate their understanding of trig functions:

As a showcase entry we spent several lessons developing the Maths of perspective drawings of representations of comparisons between Australia and the mission countries- income, death rates, life expectancy etc, and finished by creating chalk drawings around the school for all to see.

Malke Rosenfeld assigned the Hundred-Face Challenge – make a face using Cuisenaire Rods that add up to 100 – and you should really click through to her gallery of student work:

Some kids just made awesome faces. Me: “Hmmm…that looks like it’s more than 100. What are you going to do?” Kid: “I guess we’ll take off the hair.”

One of my favorite aspects of Bob Lochel’s statistics blogging is how cannily he turns his students into interesting data sets for their own analysis:

Both classes gave me strange looks. But with instructions to answer as best they could, the students played along and provided data. Did you note the subtle differences between the two question sets?

Jonathan Claydon shows us how to cobble together a document camera using nothing more than a top-of-the-line Mac and iPad.

Malcolm Swan, on how to begin a lesson:

Every lesson should begin by getting [students] to articulate something about what they already understand or know about something or their initial ideas. So you try and uncover where they’re starting from and make that explicit. And then when they start working on an activity, you try to confront them with things that really make them stop.

And it might be that you can do this by sitting kids together if they’ve got opposing points of view. So you get conflict between students as well as within. So you get the conflict which comes within, when you say, “I believe this, but I get that and they don’t agree.” Or you get conflict between students when they just have fundamental disagreements, when there’s a really nice mathematical argument going on. And they really do want to know and have it resolved. And the teacher’s role is to try to build a bit of tension, if you like, to try and get them to reason their way through it.

And I find the more students reason and engage like that then they can get quite emotional. But when they get through it, they remember the stuff really well. So it’s worth it.
