Month: January 2013

Better Best Squares & The Future Of Math Textbooks

a/k/a Dave Major Rides Again

It turned out to be productive and fun arguing over who among the four contestants in this video did the best job drawing a square. Video served us well. It gave us something to look at, argue about, and abstract. But video is still a static medium in many ways. The pictures are moving but it doesn’t edit well. It doesn’t personalize. It doesn’t reflect the learner in any way.

So Dave Major and I partnered up again to kick around an idea of what this task would look like in code, in a web browser, and came up with Better Best Squares.

He’s written a post describing some of his technical innovations. I’m going to use this space to point out our pedagogical innovations.

  1. The most obvious difference here is that instead of watching four people attempt to draw a square, you get to attempt to draw a square yourself.
  2. That quadrilateral then follows you throughout the text. Rather than using a generic example to illustrate a mathematical concept, we use the example you created. We talk about its perimeter. We talk about its area. The diagrams in the margins change. The text in the textbook changes.
  3. You see your classmates’ quadrilaterals and make an intuitive ranking of their square-ness. When we formalize the concept of square-ness later, we’ll refer back to our initial rankings. Ideally, the mathematics will validate the student’s intuition and vice versa.
  4. You can revise and skip most questions. We’re deviating here from our last experiment, where each question had to be completed before you could move on. In a print textbook, you can always flip forward and see what’s next or move on to a new task if you don’t want to complete the current one. So you can leave an answer blank. You can go back and revise your answers. The textbook doesn’t judge you. It doesn’t say, “You’re wrong.” It reports your response (or non-response) to your teacher and lets your teacher make the pedagogical judgment there.
  5. The teacher’s edition is so useful. I asked Dave to let me see all responses disaggregated a) by student and b) by question. I want to click on Mike’s name and see all his progress throughout this unit — everything he drew, everything he wrote. Then I want to click on each question and see every response. Dave went above and beyond here. You see every student response, but you also see the revision history on those responses. You can trace the student’s thinking. You can also flag student responses to show the class. I’m such a fan of Dave’s work here. (A rough sketch of how that data might be organized follows this list.)
  6. Don’t like our definition of “best square” as being the ratio of areas? Submit your own. The system will accept your formula, send it to the teacher, and then use it to rank the entire class’s quadrilaterals. (One formula that fits the “ratio of areas” description is sketched below.)
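
For the curious, here is a rough sketch of how a teacher’s-edition view like the one in item 5 might organize its data. This is not Dave’s actual code; the student name, question key, and class structure are invented, purely to illustrate the by-student / by-question slicing, the revision history, and the flagging.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Response:
    """One student's answer to one question, with its full revision history."""
    revisions: list = field(default_factory=list)  # every draft, oldest first
    flagged: bool = False                           # marked to show the class

    def revise(self, answer):
        self.revisions.append(answer)

    @property
    def current(self):
        return self.revisions[-1] if self.revisions else None  # None = skipped

# responses[student][question] -> Response, so the teacher can slice either way
responses = defaultdict(lambda: defaultdict(Response))

responses["Mike"]["q4_perimeter"].revise("36 units")
responses["Mike"]["q4_perimeter"].revise("38 units")  # Mike revises his answer
responses["Mike"]["q4_perimeter"].flagged = True      # worth showing the class

# By student: everything Mike has written so far, drafts included.
print({q: r.revisions for q, r in responses["Mike"].items()})
# By question: every student's latest answer to one question.
print({s: qs["q4_perimeter"].current for s, qs in responses.items()})
```

The revision list is the “trace the student’s thinking” part: the teacher sees the whole path from “36 units” to “38 units,” not just the final answer.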

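And to make item 6’s “ratio of areas” concrete, here is one formula that fits that description, though not necessarily the one the app uses: compare the quadrilateral’s area (via the shoelace formula) to the area of a square with the same perimeter. A true square scores exactly 1; any other simple quadrilateral scores less. The coordinates below are invented examples.

```python
import math

def perimeter(pts):
    """Sum of side lengths of a polygon given as a list of (x, y) vertices."""
    return sum(math.dist(pts[i], pts[(i + 1) % len(pts)]) for i in range(len(pts)))

def area(pts):
    """Shoelace formula for the area of a simple polygon."""
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]))) / 2

def squareness(pts):
    """Area of the quadrilateral divided by the area of a square
    with the same perimeter: 1.0 for a true square, less for anything else."""
    return area(pts) / (perimeter(pts) / 4) ** 2

perfect = [(0, 0), (10, 0), (10, 10), (0, 10)]
drawn   = [(0, 0), (10, 1), (11, 10), (-1, 9)]   # a student's freehand attempt
print(squareness(perfect))   # 1.0
print(squareness(drawn))     # a little under 1.0

# Ranking the whole class is then just a sort:
# sorted(class_quads, key=squareness, reverse=True)
```

Swapping in a student-submitted formula would then just mean swapping out squareness() before the sort.
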
Dave and I both agreed this problem is a little too obscure and weird to justify all the effort we put into it. But critique the digital pedagogies rather than the task itself. These pedagogies can transfer to other, better tasks. Critique this definition of personalized learning.

Previously: Dave Major Shows You The Future Of Math Textbooks.

2013 Mar 27. A UK teenager codes the algorithm for judging the best circle. Be sure to stick around for the part where the cat judges you.

Is This Press Release From 2012 or 1972?

Here are five quotes, some of which are from edtech startups in 2012 while others are from an advertorial for “Individually Prescribed Instruction” published in ASCD in 1972. Can you tell them apart?

#1

Educators and parents across the country seem to agree that a system of individualized instruction is much needed in our schools today. This has been evident to any parent who has raised more than one child and to every teacher who has stood in front of a class.

#2

[This product] allows the teacher to monitor the child’s progress but more important it allows each child to monitor his own behavior in a particular subject.

#3

The objectives of the system are to permit student mastery of instructional content at individual learning rates and ensure active student involvement in the learning process.

#4

This is a step towards the superior classroom, because the system includes material that can be used independently, allowing each child to learn at his own rate and realize success.

#5

The technology, training program, and management technique give the teacher tools for assessment, mastery measurement, and specified management techniques.

Okay, they’re all from 1972, from a piece called “Do Schools Need IPI? Yes!” [pdf]. But really the only line that’s obviously out of the past is:

The aide’s most important function is the scoring, recording, and filing of students’ test and skill sheets.

Computers now handle that scoring, recording, and filing. But in every other way, you could have ripped that text from a TechCrunch article or a New Schools prospectus.

I’m not merely snarking that what we think is new and great isn’t so new. I’m also saying it still isn’t great. Stanley Erlwanger wrote an incredible piece in 1973 illustrating how easy it was for a student named Benny to appear successful in IPI while actually knowing very little. Both in 1972 and in 2012, these systems ask questions that are trivial enough to be gamed. The only difference is that instead of writing questions to accommodate the limitations of a human-scorer, we’re now writing questions to accommodate the limitations of a machine-scorer.

If you’re in this industry, read those papers closely enough that you can tell yourself, “I understand why IPI failed. This is how we’re different.” Basically, IPI is a free failure for you and your company. I hope you won’t pass it up.

BTW. Justin Reich points me to the opposing piece from the same ASCD issue:

While some persons see the IPI program as aimed in the direction of “humanness and openness,” I consider its implementation a step in the opposite direction for many schools. For more than 50 years, many recognized leaders in education have worked to move learning opportunities provided in our schools from “rigid, passive, rote, and narrow” to “open and humane.”

2013 Jan 12. Mike Caulfield again points out that personalized learning may have an isolating effect on students who really need to have their assumptions tested by their peers:

Benny, the student the study is about, has some odd ideas about mathematics, induced by peculiarities of the testing system. But he’ll never know they are odd because the individualized instruction makes discussion with peers impossible.

2013 Jan 13. Mary had a positive experience with IPI and highlights the efforts her teacher took to keep the program from isolating students with their misconceptions:

I was educated using IPI from K-4. IPI allowed me to work at my own pace, which tended to be faster than average in math and about average in reading. When I moved to a district that did not use it, I was devastated. I hated the non-IPI system and was bored and annoyed with math for the next three years. Since this was so devastating to me, I clung to my IPI materials and I still have some all these years later. I use them and my experiences to balance the discussion we have in my graduate class when we discuss the Benny paper.

You see, to me, IPI was not a failure; the way Benny’s teacher implemented it was. Teachers still had to teach when using IPI or of course it would be a failure. My experience with IPI was different in key ways than the Benny paper describes… the teachers would set up table groups each week based upon what book we were working on. Along with working independently through the workbook and tests, the students were required to discuss a question provided by the teacher, and s/he would ask each group to stop and discuss it at a particular time so s/he could be there to listen in. In addition to this, after each unit test, we had a brief one-on-one meeting with the teacher to discuss the content, where, according to my old handwriting, I was asked targeted questions and needed to explain my reasoning. In other words, my teachers did their own assessments and did not rely on bubble sheets.

True, the initial presentation of the material came through the workbook, and it’s true such a system would not engage all students all the time, but that’s where teachers come in. Teachers need to know their students. Teachers need flexibility day by day, student by student, to use or not use these tools. Allowing students to move through material at their own pace is still a good goal. Giving teachers tools to help them manage that is a good goal. Devising tools that remove teachers from the process is where we go wrong.

Rocketship’s Learning Labs & The Cost Of Personalization, Ctd.

Rocketship CEO John Danner went on record with EdSurge. The Learning Labs aren’t leaving.

Online learning is integral to our model…The Learning Lab is not going away, rather we are working to integrate its key components directly into our classrooms under the guidance of our incredible teachers and staff…I think Merrow probably just happened to focus on an isolated incident and wanted to bring it up as it is always a valid concern with online learning. We continue to work on the data integration piece and this pilot doesn’t change the importance of that. Our teachers continue to get more robust data from the Learning Lab and are eager for us to work towards a fully integrated and real-time system.

Jason Dyer notes that this doesn’t really address NewsHour’s criticism:

Is the complaint from the PBS interview really about “teacher interface” or even “data”?

Meanwhile, on his blog, Danner writes a post called “Kids learn when they are solving problems,” in which he laments the state of online learning and basically outs himself as a radical constructivist.

When you are in a school, I think it becomes very clear when learning happens. Students who are working on a problem that they can solve learn by trying to solve the problem and receiving prompts and insights from peers or the teacher when they make mistakes. This eventually helps them get over the hump and be able to solve similar problems with a lot less mental effort. That’s learning. This happens thousands of times a day in well run classrooms. For whatever reason, we have really lost this truth in online learning.

All of this makes Danner, and Rocketship, really hard to pin down. But there’s a lot to like here and even more that’s interesting.

[3ACTS] Toothpicks

I enjoy tasks that exhaust a finite supply of things in order to make some kind of interesting structure. Here, a finite supply of toothpicks (250 of them) is exhausted to make a pyramid. (Or consider the finite fencing around Pixel Pattern.)

At some point I’d like to test out the hypothesis that removing the finiteness would make the video a lot less perplexing on the whole. In other words, we wouldn’t be as perplexed by a guy plugging away at a pyramid with an inexhaustible supply of toothpicks. We’re perplexed because we know, at the end of the video, that he’s done and we want to know what the pyramid looks like.

Here’s the task page.

Here are several interesting questions that popped up at 101questions:

  • Alison: How many rows in the end?
  • Hope Gerson: How many toothpicks does it take to make the next sized equilateral triangle?
  • Douglas Moore: How many triangles?
  • Matthew Clark: What’s the perimeter of the final triangle?
  • Scott Westwell: How many small (3 toothpick) triangles can be made?
  • Jeff de Verona: How many “total” triangles will be created (any size)?
  • Gregory Taylor: How long did it actually take to go through the entire stack?
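
As a back-of-the-envelope check on Alison’s rows question: under one common model (not necessarily how the pyramid in the video actually grows), a triangle built from n rows of unit triangles uses 3n(n+1)/2 toothpicks. A few lines of arithmetic then tell you how far 250 toothpicks go.

```python
def toothpicks(rows):
    """Toothpicks needed for a triangle of `rows` rows of unit triangles:
    3, 9, 18, 30, ... i.e. 3 * rows * (rows + 1) / 2."""
    return 3 * rows * (rows + 1) // 2

supply = 250
rows = 0
while toothpicks(rows + 1) <= supply:
    rows += 1

print(rows, toothpicks(rows), supply - toothpicks(rows))
# -> 12 234 16: twelve complete rows use 234 toothpicks, with 16 left over,
#    assuming the model above matches the pyramid in the video.
```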

Rocketship’s Learning Labs & The Cost Of Personalization

One of the most fascinating pieces to come out of the winter break was this segment from PBS NewsHour’s John Merrow on the Rocketship charter network.

The video distills into ten minutes all the most interesting angles on Rocketship: its high parent involvement, its high teacher salaries and professional development, and its morning “launches.” The segment pays special attention to Rocketship’s “Learning Labs,” which Merrow describes as “lots of computers and kids, no teachers.” (Watch that part of the segment.)

This aspect of a lot of charter and for-profit schools should make us all very uneasy. Rocketship can afford to pay its teachers more because, for one hour each day, the students are plugged into computers, boxed into cubicles, and tutored intermittently by low-skill, hourly-wage workers. Rocketship spruces up its lab with lots of primary colors but it can’t shake comparisons to a call center.

This is “differentiation,” says John Merrow, and it’s true that the students are working on different tasks, but at what cost? The students don’t interact with their peers or their teachers. The math program, ST Math, isn’t bad but computers constrain the universe of math questions you can ask down to those which can be answered with a click and graded by a computer. The promise of personalization, of perfectly differentiated education, has forced Rocketship to make dramatic concessions on the quality of that education. It’s a buffet line where everyone chooses their own flavor of the same gruel.

Merrow’s documentary team wasn’t persuaded of the Learning Lab’s merits:

The Learning Lab saves schools lots of money but there’s just one problem: they’re not really working. A problem we saw is that some students in the lab do not appear to be engaged. They sit at their computers for long periods of time, seemingly just guessing.

What’s remarkable is that the Rocketship staff is also unpersuaded of the Lab’s merits. One principal says, “If I had to guess, I’d say you come back in a year, you won’t see a Learning Lab.” Another says, “Next year we’re thinking of bringing the computers back to the classroom.”

This isn’t any kind of small pivot, something Rocketship can gloss over with a sunny press release. Throughout Merrow’s segment, the teachers, the principals, and the charter CEO all spoke of their commitment to innovation. We should commend them for innovating away from technology when it’s ineffective, especially given their particular location (Silicon Valley) and time (2013). That just isn’t easy.

BTW: Mike Caulfield suggests that personalization is hostile to the kind of whole-class conversation we know to be valuable:

Indeed, structured classroom discussion has one of the highest effect sizes in Hattie, much higher than mastery learning. But it’s really difficult to have a classroom discussion (or group activities that foster student discussion) without some level of shared experience and knowledge. I’m curious if this fact might lie behind much of the surprising failure of computerized adaptive learning systems.

2013 Jan 09. EdSurge got Rocketship CEO John Danner on record. The Learning Labs are staying:

Online learning is integral to our model…The Learning Lab is not going away, rather we are working to integrate its key components directly into our classrooms under the guidance of our incredible teachers and staff…I think Merrow probably just happened to focus on an isolated incident and wanted to bring it up as it is always a valid concern with online learning. We continue to work on the data integration piece and this pilot doesn’t change the importance of that. Our teachers continue to get more robust data from the Learning Lab and are eager for us to work towards a fully integrated and real-time system.

2013 Jan 25. MindShift reports that Rocketship is, indeed, moving the computers back to the classrooms.

Featured Comments

Clyde Boyer:

I read something from the http://edtechnow.net/ blog recently that really struck a nerve – a quote from William Cory, Assistant Master of Eton, who wrote in 1861:

“You go to school at the age of twelve or thirteen and for the next four or five years you are engaged not so much in acquiring knowledge as in making mental efforts under criticism.”

It’s that ‘mental efforts under criticism’ piece, that structured classroom discussion where your thoughts are challenged, where higher-order learning takes place.

William:

It’s important to assess things like this not only in terms of how effectively they teach math, but also in terms of what they teach children *about* math. The learning lab teaches children that math is a solitary activity, wherein one clicks at things on a computer until the computer approves.

Jenny:

Not only should we be concerned about what students are learning about math based on this experience but what they are learning about computers as well. I’m sure the majority of schools are not doing a much better job of offering elementary students the opportunity to use computers as more powerful tools rather than skill-practice machines, but most don’t have kids doing so quite this much. If we want students who will explore, innovate, challenge ideas, we have to help them see more possibilities than simply answering questions and being told right or wrong.

Jane Kise:

One simple filter, Jungian type, tells us that over half of all children aren’t going to be energized by an hour at a computer screen. Extraversion and Introversion in personality type terms involve how we are energized. All of us can do both, but one is preferred and the other is draining. Further, even if the Introverts like the computer lab, they still need the stimulation of discussion, learning to express their ideas and question those of others. Since a good portion of school is still set up for more Introverted activities, adding interventions that require more Introversion makes it a very, very long day for the Extraverts–and they just might start talking and moving when you least want them to.

Michael Paul Goldenberg:

“The learning lab teaches children that math is a solitary activity, wherein one clicks at things on a computer until the computer approves.”

Perhaps not TERRIBLY different from the way many math classes operate, if you simply substitute “teacher” for computer in the second instance so that we have, “Math classes teach children that math is a solitary activity wherein one writes or says the answers to computations until the teacher approves.”

William:

Out of character, writing this sort of stuff is *hard*. It’s hard for actual live human beings to understand how students are modeling the math in their head and respond accordingly. Poor Jennifer [DreamBox’s computerized teacher-avatar – dm] just repeats her instructions. If I were a student who didn’t understand place value, I might walk away from this unsure about my own multiplication facts, which were good.

Jennifer might help me more if she knew about some common errors (and maybe that sort of thing is going on in the background, invisible to the student?). Like Dan, I don’t want to be a Luddite, and if the computer is better than people, we should go for it. But computers have a long way to go.

Much of teaching is empathy – being able to see the world through the eyes of a person who doesn’t know the things you know. It’s being able to communicate with someone who sees the world differently than you do. There are a thousand ways that live, in-person communication can cultivate and encourage that empathy in teachers. For programmers who are at arm’s length, cultivating that empathy is doubly difficult and doubly important.

Jennifer just asked me if I’d like to continue working, ’cause it took me a while to write this. I think my answer would be “no”?

Tim Hudson:

So just as you imagined this hypothetical student in a DreamBox lesson, I think it’s valuable to imagine this same student entering a classroom without the support of a technology like DreamBox:

The multiplication standard algorithm is a fifth grade Common Core standard, so let’s assume the student is a fifth grader who doesn’t understand place value. This student transfers into a new school and math class on the day after the teacher introduced the algorithm. Does the teacher know the student doesn’t understand place value? If not, how will that information be acquired? Once it’s known that the student lacks place value understanding, should the teacher continue teaching the algorithm lesson even though the student is clearly not ready for it? If not, what does the student do during math class?

Too often, the student is taught the algorithm right then because there are simply too many logistical and resource constraints that limit what even the best teacher is able to do in that situation. It’s no certainty that the student will meet grade level standards by the end of the year, and the inherent challenges of this reality end up being a huge strain on both teacher and student. I’m empathetic to both of them. And the tens of thousands of others in the same situation. These are the teachers and students we’re trying to help.