
How does the tech-by-any-means-necessary educator rationalize these findings?
I have my guess tucked into a sealed envelope, scheduled for posting later this evening. Until then, speak your mind.
[This li’l storm cloud via Jon Becker via an e-mail from Scott McLeod.]
20 Comments
Benjamin Baxter
March 27, 2008 - 4:38 pm
Tech-by-any-means has since been replaced by tech-by-good-means. The rationalization from here on out should be:
“Obviously, the good teachers aren’t using technology. If they did, the kids’ scores would only improve.”
or
“These results are skewed by the small number of teachers using tech frequently. These teachers tend to be poor — our argument has been that we have to convince the good teachers to join in, and they haven’t, yet.”
or
“These scores do not reflect valid, holistic measurements of competency or tech skills. I ignore them.”
I think that’ll get us started.
Chris Lehmann
March 27, 2008 - 4:47 pm
I'm not a tech-by-any-means-necessary teacher, but I'll go ahead and explain it this way:
The NAEP tests an old way of teaching and learning math that teachers who are using technology are not teaching. Therefore, there are skills that the NAEP prioritizes that are not prioritized in a tech-infused math class.
The question then becomes this — what mathematical understandings and skills do students in a tech-infused math class have that kids in a traditional math class do not?
Dina
March 27, 2008 - 4:51 pm
Chris, I gotta tell you, just in case you haven't had enough bright spots in your day today: you speak volumes to me as principal of SLA, characterizing yourself first and foremost as "teacher." Bravo bellissimo.
Ben Wildeboer
March 27, 2008 - 4:56 pm
Two things popped into my mind when I saw the graph.
First:
Students using computers for math on a nearly daily basis have teachers who simply roll out Number Munchers for 30 minutes a day and count it as math instruction. Echoing Benjamin Baxter’s sentiment: using a computer doesn’t automatically make it good instruction.
Second:
Does the test measure higher order thinking skills? Is it a valid assessment of the learning taking place on the computer? Perhaps there’s a disconnect between the 21st Century skills being learned and what the students are being tested on.
In addition, I’d like to see a breakdown of how many students were in each category. It’d be interesting to see the sample sizes for each group.
Should we be concerned that even the highest group only earned 48.2% of the highest possible score?
JackieB
March 27, 2008 - 5:58 pm
I have a few questions.
First, why is it being assumed that higher order thinking skills are being assessed/taught on the computer? That they’re being taught in the classroom? Why are we assuming anything? Was all of the classroom instruction the same? Was the computer instruction? Was there a control group? How is this test normed? What’s the median score? What’s the sample size here? The link to the post by Jon Becker gives percentages taught, but no n.
*sigh* Off to figure out which test will tell us if there’s really a difference…(chi squared? anyone? my stats is rusty)
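For what it's worth, the test JackieB is reaching for here is a chi-squared test of independence. A minimal sketch in Python, assuming we could recover raw counts of students per usage category and achievement level; every number below is invented for illustration, since the NAEP post reports percentages rather than n:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = computer-use frequency, cols = achievement level.
# These numbers are made up; NAEP publishes percentages, not raw counts.
observed = np.array([
    [120, 340, 210],   # never or hardly ever: below basic, basic, proficient+
    [150, 330, 180],   # once or twice a month
    [200, 310, 120],   # once or twice a week
    [260, 280,  80],   # every day or almost
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
# A small p-value only says frequency and achievement are associated;
# it says nothing about which way any causation runs.
```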
JackieB
March 27, 2008 - 5:59 pm
Argh… apparently so is my use of grammar. *darn spring break*
Ben Wildeboer
March 27, 2008 - 6:32 pm
@JackieB I just assumed you were on a spring-break, pirate-themed booze cruise at the time of your first post.
It was a common thing back in the day to hear a pirate exclaim, “Arrgghh, me stats is rusty!”
Okay. Today is the last day before my spring break begins. Clearly I need a break.
Jen
March 27, 2008 - 6:35 pm
http://nationsreportcard.gov/reading_math_2005/s0031.asp?printver=
That has a couple of sample questions, and if you hunt around you can find some more. Advantages to the NAEP are that it's been around a long time and that it's national. It's what they often use for seeing if NCLB test score increases are actually reflected on other tests. (Like in NYC, where scores on the state test are going up but related NAEP scores are stagnant or falling.)
JackieB
March 27, 2008 - 6:48 pm
Okay, me, my rusty stats and my parrot found that for National Public Grade 4 mathematics the Average Scaled Score was 240 with a standard deviation of 29. I think we need more data to determine if there really is a difference (still working on that part. It takes time in
JackieB
March 27, 2008 - 6:50 pm
What the heck? *sigh*
… okay, I’m done for the night
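Picking up where JackieB's comment cut off: with the published mean (240) and SD (29), the quick check is a two-sample z-test plus an effect size. A rough sketch, where the per-group means and sample sizes are hypothetical because the post doesn't report them:

```python
# Back-of-the-envelope z-test. The overall NAEP mean (240) and SD (29) come
# from JackieB's comment; the group means and sample sizes are hypothetical.
import math

sd = 29.0
daily = {"mean": 236.0, "n": 400}      # "every day or almost" (hypothetical)
monthly = {"mean": 243.0, "n": 1200}   # "once or twice a month" (hypothetical)

se = sd * math.sqrt(1 / daily["n"] + 1 / monthly["n"])
z = (monthly["mean"] - daily["mean"]) / se
d = (monthly["mean"] - daily["mean"]) / sd  # Cohen's d: gap in SD units

print(f"z = {z:.2f}  (|z| > 1.96 would be significant at the 5% level)")
print(f"d = {d:.2f}  (a gap of about a quarter of a standard deviation)")
```

With sample sizes in the hundreds, even a modest gap clears statistical significance, which is exactly why the effect size matters as much as the p-value here.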
Mr. K
March 27, 2008 - 6:53 pm
I poked around with the NAEP data explorer myself for a while, because I wanted data on the 8th grade results.
The breakdown of computer usage is a little bit more detailed, with categories like “Use computer for math games in math class”, “Use program to drill on math facts”, or “Using graphing program for charts for math class”.
I picked the data by achievement level, and then by percentiles.
In some of the categories, there appeared to be little or no effect. In others, there was a strong positive correlation between frequency of computer use and the percentage testing at below basic, and a negative correlation between frequency and the percentage testing at proficient or above. If anything, the results were more pronounced than those in this post.
I can’t buy the “testing the old way of learning” argument – math is math, and if the kids are getting a deeper understanding, then they should be able to handle both the profound and the simple aspects of it.
I’m more inclined to think (based entirely on personal observation with very small n) that the technology acts as a crutch for the poor teacher, or a distraction from the actual content.
Maybe I should hit my principal with this.
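A sketch of the kind of check Mr. K describes, again in Python; the four usage categories and all of the percentages are invented stand-ins, since his actual numbers aren't in the thread:

```python
# Hypothetical 8th-grade breakdown: % of students at each achievement level,
# by frequency of computer use in math class (never -> daily). All numbers
# are invented to mirror Mr. K's description, not taken from the NAEP explorer.
from scipy.stats import spearmanr

freq_rank = [0, 1, 2, 3]            # never, monthly, weekly, daily
pct_below_basic = [22, 26, 25, 38]
pct_proficient = [34, 29, 31, 17]

rho_bb, _ = spearmanr(freq_rank, pct_below_basic)
rho_pr, _ = spearmanr(freq_rank, pct_proficient)
print(f"below basic vs. frequency: rho = {rho_bb:+.2f}")
print(f"proficient+ vs. frequency: rho = {rho_pr:+.2f}")
# With only four ordered categories, Spearman's rho is crude, but it captures
# the "more use, more below-basic" pattern Mr. K reports.
```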
H.
March 27, 2008 - 9:04 pm
Good night, Jackie, we all still love you :)
Mr. K
March 27, 2008 - 9:19 pm
The first of the two things you need to remember from statistics, btw, is that correlation does not imply causation.
It seems eminently reasonable to me that low-performing kids have more technology thrown at them as an attempted fix. Even if it is marginally successful in improving individual kids' scores, these results could make such use look negative, because they don't represent deltas for the students; they only report on the success of the groups.
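Mr. K's selection story is easy to demonstrate with a toy simulation; everything below is made up, and the intervention is given a genuinely positive effect:

```python
# Toy simulation: technology is assigned preferentially to weaker students and
# genuinely helps (+3 points each), yet the tech group's average looks worse.
# All numbers are invented; 240/29 echo the scale JackieB quoted above.
import numpy as np

rng = np.random.default_rng(0)
ability = rng.normal(240, 29, size=10_000)

# Weaker students are more likely to be put on the computer every day.
p_tech = 1 / (1 + np.exp((ability - 240) / 10))
gets_tech = rng.random(10_000) < p_tech

score = ability + 3 * gets_tech  # the intervention really does help

print(f"daily-tech group mean: {score[gets_tech].mean():.1f}")
print(f"no-tech group mean:    {score[~gets_tech].mean():.1f}")
# Group means hide per-student deltas: every treated kid gained 3 points,
# but the treated group still averages lower than the untreated group.
```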
JackieB
March 27, 2008 - 9:43 pm
Thanks, H. :)
I bet Dan just loves this thread.
TMAO
March 27, 2008 - 10:12 pm
Ben wrote: "Does the test measure higher order thinking skills? Is it a valid assessment of the learning taking place on the computer? Perhaps there's a disconnect between the 21st Century skills being learned and what the students are being tested on."
Hi Ben,
Let's assume these exams, and all exams like them, do NOT test Higher Order Thinking Skills (HOTS), but instead test gross basic skills. If a teacher has rightly decided that kids need HOTS, I'm assuming they've assessed in some way and determined that the kids already have a firm grasp of the basics. As such, the kids should still perform at a high level, even on a test that doesn't assess HOTS. I'm in a Master's program, where presumably my intellect and ability to do those HOTS are being assessed, but if my Master's program dropped a comprehension test on the last reading we were assigned, I could pass the multiple-choice portion.
If the kids are getting that vaunted 21st century curriculum, but can’t first do foundational stuff, what the hell are we doing here, anyway?
That said, HOTS can exist in and around and through ANY learning objective. There's a fair amount of misuse of the term. I can utilize HOTS in the service of learning the difference between an action verb and a being verb, and it doesn't matter that this particular objective is considered basic, or in fact, remedial.
J.D. Williams
March 28, 2008 - 6:42 am
I would just like to see the definition of using technology every day. I have a SmartBoard and we use it every day. Does that mean my class would be in the "Every day or Almost" category?
So is it the teacher/whole class using tech everyday, or individual students using tech every day? The answer to my question is probably in there somewhere, but I have to go to the dreaded Friday staff breakfast.
Jeff Wasserman
March 28, 2008 - 9:08 am
This is where Mr. K's assessment of the situation works really well with what I see here: our lowest-performing math students, the ones who are in high school yet doing middle school math, are in the computer labs every day for some district-mandated online remediation. Our AP Stats and Calculus kids, on the other hand, tend to be in regular classrooms.
Granted, this is a high school setting, not an elementary school, but I think the philosophy is the same. Give the low-performing kids the shiny computer stuff so they’ll want to do the work, then test them and watch them fail anyway.
Benjamin Baxter
March 28, 2008 - 11:34 am
TMAO:
You're right, of course. I threw those out there because that's going to be the general line of attack.
His later blog posts seem to affirm this.
Laelia
March 28, 2008 - 3:48 pm
From one of the above posts: "…with categories like 'Use computer for math games in math class', 'Use program to drill on math facts', or 'Using graphing program for charts for math class'." To me this equates to using worksheets in a classroom: junk in = junk out. Games and facts do not increase scores. In fact, I would like an actual accounting of what was considered "computer usage" and for how long: once a week for 10 minutes a pop, or once a week for an hour at a time?
BTW…Laelia is aka Nancy…LOL…but I’m sure you remember that.
Tracy W
April 2, 2008 - 7:58 am
"Perhaps there's a disconnect between the 21st Century skills being learned and what the students are being tested on."
What are 21st century skills, and how do they differ from any-other-century skills?