Design for Educators: Greg Farr’s Dashboards

I wrote about Greg Farr's dashboards a while back, his weekly airing out of the campus's dirty laundry: non-attendance, discipline, drop-outs. "There are no secrets at Shannon," Greg says. If I were ever to step into administration, implementing that kind of accountability would head my list of Things To Do Before I Ever Sat Down.

Here's a sample dashboard, lifted from the school's website.

This particular accountability measure also freaks me out because it demands focused graphic design, which my longtime subscribers will recall is an incessant fixation of mine.

Unfortunately, Greg and his team have here what designers call a “low signal-to-noise ratio.” The information he’s trying to convey pulses faintly from the screen (low signal) while other design elements blare static around it (high noise).

In trying to determine a piece’s signal (and this applies to your conference slide deck, classroom PowerPoint slides, handouts, writing, public speaking, anything) ask yourself what is essential to the point and then separate the rest. (Or, as Queen Gertrude advised Polonius, “More matter, with less art.”)

In this case, the lifeblood of Greg's design, the thing without which the dashboard would be nothing, is very small.

These tiny arrows constitute a faint signal. They force the viewer to work harder to determine meaning. A faint signal isn’t fatal, by any means, but a little obscurity compounds quickly over time.

The noisier elements of Greg's design, I imagine, are obvious to both him and me. This is a fiendishly complicated project for a lot of reasons (listed below) and the solutions (listed even farther below) are not obvious.

  1. He’s dealing with totally different scales. Attendance is a continuous percentage. Discipline is measured by discrete incidents.
  2. Inconsistent indicators. High attendance is good but high discipline is bad. The same indicator for each carries with it opposing significance. Greg has attempted a workaround by reversing the attendance scale, which wasn’t a bad move.
  3. Each colored section (green, yellow, and red) measures a different range. The green zone for referrals is 3 units long; for non-completers and safety it's 2 units long. This is almost certainly because Greg and his team think that three referrals is less a cause for concern (threat level green) than three non-completers (threat level yellow). That ethical differentiation is pretty cool, though it's also throwing up some unwanted noise.
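To make the scaling mismatch concrete, here is a minimal sketch in Python. The zone boundaries below are invented for illustration; Greg's actual cut-offs are assumptions on my part, not numbers from the real dashboard.

```python
# Illustrative sketch of the scaling problem: each metric needs its own zone
# boundaries before a shared green/yellow/red readout makes sense.
# All thresholds here are assumptions, not Greg's published numbers.

ZONES = {
    # metric: (green_max, yellow_max); anything above yellow_max is red
    "referrals":      (3, 6),  # green zone is 3 units long
    "non_completers": (2, 4),  # green zone is only 2 units long
    "safety":         (2, 4),
    "absence_pct":    (4, 8),  # attendance reversed into absence, a percentage
}

def threat_level(metric, value):
    """Map a raw measurement to a color zone using that metric's own scale."""
    green_max, yellow_max = ZONES[metric]
    if value <= green_max:
        return "green"
    if value <= yellow_max:
        return "yellow"
    return "red"
```

Note that a raw count of three lands in different zones depending on the metric, which is exactly the ethical differentiation (and the noise) described in #3.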

So as we attempt a (totally unauthorized) redesign of Greg's dashboards, we try to amplify what matters and dampen what doesn't. If this were a contract job, I'd be in constant contact with Greg and his team, asking them what they thought mattered. As is, this is totally presumptuous.

Here’s my draft:

Specific revisions:

  1. “Attendance” from the original had to become “absence.” Imagine two side-by-side pie charts comparing two candidates in an election. Imagine that one pie chart showed how many people did vote for Candidate 1 and the other chart showed how many people did not vote for Candidate 2 and you have an idea of how confusing this can get over the long haul. So everything is now defined as a negative. Low bars are good and high bars are bad all the way across the board.
  2. There is an exact count for each measurement to counterbalance the noisy scaling from the original. Each absence bar measures two percent (2%) and each of the other scales measures one (1) referral / withdrawn student / safety incident. It doesn’t really matter if the viewer knows that or not, however.

    The strength of the colorful scale in both the original and the revision is its visual signal. (Red is bad!) However, Greg asked his visual signal to carry information on its back, whereas I pushed it off to the side, letting them both do their separate jobs.

  3. Gradients. In a few years gradients are gonna fall out of design favor, at which point I’ll look back on this time of my life with a small cringe. For now, gradients modernize the original ever so slightly.
  4. A design that reflects the school’s brand. I’m probably the only Californian who knows that Greg’s district’s web server was down this last weekend. Once it came up, I pulled a color swatch from its website. Depending on who sees the dashboard, it’s a nice opportunity to enhance the Shannon Learning Center brand.
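Revision #2 above (a fixed unit per bar, with low bars good everywhere) can be sketched like this. The unit sizes come from the post, but the ten-bar row length and the metric names are illustrative assumptions:

```python
# Sketch of revision #2: every indicator is drawn as a number of filled bars,
# with a fixed unit per bar (2% per absence bar, 1 incident/student per other
# bar). The 10-bar row length and metric names are assumptions for
# illustration, not measurements from the actual redesign.

UNITS_PER_BAR = {
    "absence_pct": 2,  # each absence bar covers two percentage points
    "referrals":   1,  # each remaining bar covers one incident or student
    "withdrawn":   1,
    "safety":      1,
}

def filled_bars(metric, value, total_bars=10):
    """How many bars light up for a raw value; low is good across the board."""
    bars = round(value / UNITS_PER_BAR[metric])
    return max(0, min(total_bars, bars))
```

The point of the fixed unit is that every row reads the same way at a glance, even though (as noted) the viewer never needs to know the exact conversion.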

If I were Greg, I'd be wondering right now whether this redesign is as easy to update as his original, where he just moves arrows side-to-side on the scale. A: Not quite, but close.

You start from a fully loaded original and delete the bars you don’t need.

If you want to animate the thing for extra credit, you create five slides. (Greg’s original looks like a PowerPoint file; mine’s out of Keynote.) The first has empty indicators; the second has an accurate first indicator; the third has an accurate second indicator, and so on.

Then you set the transition between each slide to a two-second left-to-right “wipe” and the result is something like a rising power meter. Keynote will export to QuickTime so you can play it on your school’s closed circuit tv network, if you’ve got one of those.
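The five-slide reveal can be sketched as a cumulative sequence. The indicator names below are placeholders, and the actual build happens in Keynote slides rather than code:

```python
# Sketch of the animation above: frame k shows accurate values for the first
# k indicators, starting from an empty dashboard. Indicator names are
# placeholders, not the dashboard's real labels.

INDICATORS = ["absence", "referrals", "withdrawn", "safety"]

def reveal_frames(values):
    """One 'slide' per step: empty first, then cumulative reveals."""
    return [dict(zip(INDICATORS[:k], values[:k]))
            for k in range(len(INDICATORS) + 1)]
```

With four indicators this yields the five slides described above; the two-second wipe between each pair supplies the rising-power-meter effect.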

Easy, good-looking, and high in signal-to-noise ratio. Here are the resources:

  1. Keynote [pretty!]
  2. PowerPoint [sigh!]
  3. QuickTime [the full effect!]

Final remarks:

  1. Thanks in advance, Greg, for being a good sport on this one. You’ve got a great thing running there.
  2. If you’re an administrator (or know one) who’d like to put Greg’s program into effect at your school (or district) I’d like to provide the design work for $free. Pass it on: dan at mrmeyer dot com.
I'm Dan and this is my blog. I'm a former high school math teacher and current head of teaching at Desmos. He / him.


  1. Wow!!

    There is no team, just me…but if there WAS a team I know they’d join me in yelling, “THANK YOU!”

    I think I owe you waaayyyyy more than that Adult Beverage you once offered!

    I'm not familiar with Keynote. I'll have to take a look. In the meantime, get some rest cause we have work to do, my friend and colleague!


  2. Okay, I *love* the redesign. But, from long ago days in mental health and setting up a few behavioral plans…can you switch it so that it’s positive?!

    That is, that you are aiming to fill up all the bars, so that you have a nice pretty green for go 95-100% bar and then as your bar slips downward into the bad zones it goes to yellow and then red?

    Just a quibble, for sure.

  3. Back in Greg’s original, he had attendance working fine as a positive measure of absence. The problem comes with all the rest. Like, the positive measure of “discipline referrals” is, er, “unreferred students,” and it’ll always ride pretty high. It would be nice if we could work it out, but that’ll remain one of the confounding issues that made this such a tough nut to crack.

  4. Outstanding work, Dan. I’m thinking I’m going to poach Greg’s “Dashboard” idea and your design ideas when I start at my new school in the fall.

    I echo Jen’s sentiment regarding shooting to *fill up* bars, but I can see where that would be tough to do in terms of how this stuff is measured.

  5. Working in a collectivist (Japanese) rather than individualist (USA) culture (with all that that implies), and while accepting that the focus of your blog entry is the design of the chart rather than bigger issues, can I ask what may at first seem like a dumb question?

    Why is it important to put this kind of info out there? How does this enhance or nurture accountability? In this (collectivist) culture, it would almost certainly have a negative effect, and anyway no-one would be caught dead hanging their dirty laundry in public like this. I’m not criticizing, just asking for information.

  6. If it helps, here are the reasons Greg cites on his page:

    The dashboard serves TWO CRITICAL FUNCTIONS:

    I. Identifies Areas That Need Improvement

    As principal, I am particularly tuned into the dashboard. More than just daily postings, I watch for trends that might indicate areas in which some type of intervention may be necessary. If any indicator begins declining and stays low, immediate steps are taken to evaluate and locate reasons. Anything from student focus groups to site-based meetings may be called to evaluate and recommend strategies to address the areas of concern.

    II. Communicates Campus Status to All Stakeholders

    There are no secrets at Shannon. Our operations and data are open and available to all stakeholders — parents, staff, students, and community. We make every effort to let everyone know how we are doing. Communication is encouraged as much as possible. There is a reason my home number is on my webpage!

  7. Larry Dallas (June 12, 2007 - 5:07 am):

    I have to agree with Marco Polo. Sometimes I think I was born on the wrong planet, because it seems like everyone thinks stuff like posting "accountability" (will that word please go away!!! Can't we use "responsibility" instead?) is a great idea.

    If I were a student, I really would not care. The students that care about that kind of stuff are the school "leaders" anyway: those with great grades, on sports teams, popular with other kids and staff at school. I was one of those kids that wanted to tear that stuff down and throw it in the trash.

    It is now 15 years since I graduated high school. I am no different about those kinds of things now. The only difference is I know more about the harm it can cause. I see similar things posted at the school I work at, and boy do I wish I had the gonads to tear it down. I have two kids at home that I would like to clothe, feed, and keep under a roof, so I don't.

  8. Oh wow…

    I love the honesty of Larry’s comment!

    If Larry were to say this to me aloud in a faculty meeting, my staff would smile, look at each other knowingly, and get comfy in their chairs because they know two things about my leadership style:

    1) every teacher is empowered and encouraged to ask / challenge me exactly like this; and

    2) everyone knows a healthy debate is about to break out…

    This tossing down of the gauntlet is just too good to ignore, Larry… So bring it on… Meet me on my blog at high noon:

  9. Larry, if we assume that this information is useful to a principal — absences, discipline, and their trends over time — then it isn’t hard to assume it’d be useful to other stakeholders also. Assistant principals can use it to determine if their disciplinary interventions or attendance incentives are effective. And from there it isn’t tough to see that teachers would find it sobering and encouraging also. Students too, and not just the hyper-interested dorks you reference.

    There are exceptionally rare cases when more information is undesirable. All you’ve offered here is your distaste for this buzzword “accountability” (which I admit I share somewhat). You haven’t actually made a case for anything but your own annoyance.

    So: how would reflecting student behavior back to them (like Greg has done) harm a school’s environment or its kids?

    Marco, what’s weird to me here is that Greg’s public display of ugly data is a very collectivist approach to a process that is typically individualistic.

  10. Why not just use numbers? There are but four categories, they are not closely related (how many referrals is one absence worth?), and there is not really a common scale.

    That being said, your design is a clear improvement over the original. But, yeah, we report and are interested in referrals (negative) and attendance rate (positive). Maybe one gradient can run right to left, with its label opposite the others?

  11. Dan, really, really nice work here. I would be the last to call myself a design guru but I do try to work with my students to help them understand that good display facilitates information acquisition.

    Is the animation just eye candy? The quick snapshot of the data is what we need. Can’t we just get it all at once rather than waiting for it to be displayed one bit at a time? Maybe this is a personal quirk of mine; I’m usually not a big fan of using incremental bullets on PowerPoint slides either. Unless there’s a specific reason why some of the information on the screen needs to pop up later, I tend to say just put it all up there. [note: this is not a denigration of your work; I am simply lending my own limited design perspective]

  12. Jonathan, I got a power meter stuck in my head and it wouldn’t leave me alone. A strict text indicator might do the job as well (only with less noise) but I would need to work a mock-up to be certain. However, speculatively, I think I would miss knowing when the indicators hit critical levels. In a text-only interface, I don’t think it would be clear to me that four referrals in a week is a threat-level-yellow situation.

    Scott, if we were using an arbitrary checkerboard animation to move between two different slides on experimental psychology (e.g.), I’d make the same case for superfluous eye candy. But consider the scenario, if you haven’t, of someone discussing the data, say, at a staff meeting. It strikes me as a value-added proposition there to let her reveal the data sequentially, while discussing their significance.

  13. Lori Jablonski (June 14, 2007 - 6:38 pm):

    Since I have no idea what anyone is talking about on the School 2.0 thread, I thought I might weigh in here.

    I would actually love something like this at my school site, both on a newly-designed website (ours is a dismal joke) and as some sort of static display in our main hallway, and am considering proposing it for next year, but I'm going to weigh in on the side of the original. The green, yellow, and red with the sliding arrows (you can't really consider the arrows in isolation from the colors and the bars) took me less than a second to read and digest, with no explanation necessary other than the info on the dashboard. And that's about how long anyone should take looking at this; it is, after all, intended to serve as a weekly snapshot. I can't figure out why anyone needs to spend much time on gradients or scaling or trying to positively state each dashboard indicator on something as straightforward as this. It actually made the redesign much more difficult to quickly figure out. I had to stop and work at actually seeing what was before me; I think it was the empty boxes that had me momentarily flummoxed.

    My only suggestion: make the arrows bigger and play with the fonts and text and background colors. So much for sophisticated design principles. In this case, keep it simple, simple, simple.

    By the way, Dan, I’ve really loved reading your stuff this year. What a find. Only an hour or so at work tomorrow and vacation begins! Hope the summer is sweet to you.

  14. Always room for disagreement on design. At least now, if you raise this with your school planning team, you’ve got a couple of formats to offer. (Someone oughtta ask Greg to upload his PowerPoint file.)

    What’s always interesting to me is how much the little things matter — like the extra split second it took you to process my design and the extra one it took me to process Greg’s. It was initially tempting for me to say, aw, what’s it matter? I got the point. So what if it took a few extra blinks?

    But I have to remind and re-remind myself that those extra blinks, for some of my students, particularly for those who don’t easily construct meaning from words, constitute the difference between buying-in and checking-out. So all the gradients, scalings, positive re-framing, and tiny tiny redesigns, the stuff that seems too picky by half, are all part of what makes this teaching thing so special to me.

    Which isn’t to say I’m sad to see it go for a few months. Enjoy your break, too, Lori.