October 30th, 2015
Don’t Fall Victim to MisNAEPery

Posted under Accountability & Edublogging & Grades and Standards & Innovation and Reform & Journalism & Research & Testing

It’s NAEP season, my friends. The 2015 National Assessment of Educational Progress results were released this week to a barrage of spin, rhetoric, and general “misNAEPery.” I’ve mostly seen this misNAEPery pop up in the form of certain folks using the data to show that education reform efforts aren’t working. (For now, we’ll ignore the crushing irony of using test scores to prove that testing isn’t valuable.) That’s a bummer, so let’s spend a few minutes today talking about what this year’s results do and do not mean.

First, let’s talk briefly about the results themselves. Chalkbeat ran a pretty good piece on Colorado’s 2015 NAEP scores that included some nifty graphs. Nifty or not, however, I take some issue with the graphs’ reliance on percentages of kids scoring proficient or better rather than scale scores. Not that I blame Chalkbeat for going this way; graphs showing what appears to be actual change are a lot more exciting than what you get when you look just at scale scores over the past ten years. Those graphs look like this:

You’ll note that the line is basically flat. Not as exciting, is it? I think it’s safe to say that the rhetoric surrounding this year’s slight drop has been somewhat inflated.
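Why do percent-proficient charts look so much livelier than scale-score charts? Because a small shift in the average can push a lot of students across the proficiency cutoff. Here's a quick sketch using a hypothetical normal score distribution (the standard deviation of 35 points and the 5-point drop are made up for illustration; 249 is NAEP's published 4th-grade math Proficient cut score):

```python
import math

def pct_above(mean, sd, cutoff):
    """Share of a normal(mean, sd) score distribution at or above cutoff."""
    z = (cutoff - mean) / sd
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical distribution: sd of ~35 scale points, Proficient cut at 249.
before = pct_above(240, 35, 249)  # average of 240...
after = pct_above(235, 35, 249)   # ...then the average drops 5 points
print(round(100 * before, 1), round(100 * after, 1))  # 39.9 34.5
```

A 5-point dip in the average, roughly two percent of the scale, shows up as a 5.4-percentage-point drop in "percent proficient." Same data, much more dramatic-looking chart.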

Even so, Colorado did see some declines this year. Reading scores dropped by three points in both 4th and 8th grades, though neither decline was statistically significant. Math scores also dropped by five points in 4th grade and four points in 8th grade, both of which were statistically significant changes.
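For the curious, "statistically significant" here follows NAEP's usual convention: each average scale score is an estimate with a standard error, and a change counts as significant when the z-statistic for the difference exceeds 1.96 in absolute value. A rough sketch with made-up numbers (these are not Colorado's actual scores or standard errors):

```python
import math

def significant_change(old, se_old, new, se_new, z_crit=1.96):
    """z-test for a change between two NAEP average scale scores.

    Each average is an estimate with a standard error; a difference is
    flagged as statistically significant when |z| exceeds 1.96.
    """
    diff = new - old
    z = diff / math.sqrt(se_old ** 2 + se_new ** 2)
    return diff, z, abs(z) > z_crit

# Made-up numbers: a 3-point drop with state-level standard errors of 1.3.
diff, z, sig = significant_change(227, 1.3, 224, 1.3)
print(diff, round(z, 2), sig)  # -3 -1.63 False (not significant)
```

That's why a three-point reading drop can be statistically indistinguishable from noise while a similar-sized math drop, measured with a bit more precision, clears the bar.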

The statistically significant dips in math scores are admittedly not what we’d like to see, but CDE was quick to point out in a press release that Colorado’s longer-term math trend still reflects statistically significant positive changes in both grade levels. That makes some sense when you think about the national results we’ve seen on NAEP’s own long-term assessment (which notably differs from the main NAEP results that were just released), though I’m not sure I agree with masking potentially important point-in-time shifts with overall trend data.

My thoughts on the results aside, I strongly encourage you to dig around in CDE’s materials on the 2015 NAEP results. While you’re at it, spend some time with the National Center for Education Statistics’ Nation’s Report Card site. You’ll find all sorts of cool stuff that you can fling around at your next dinner party.

But before you start flinging, let’s pause to remember something important: Correlation is not the same as causation. Remember this simple statistical truism the next time you see reform opponents tout this year’s downward tick as “proof” that education reform—or rather, whichever part of education reform they are most inclined to pick on—doesn’t work.

Can we definitively say that reform efforts haven’t played a role in the downturn? No. It’s entirely possible that general instability in the system as a result of the testing and accountability wars has played a part. It’s also very possible that we’re seeing an “implementation dip” related to changing standards, or a misalignment of those standards with what NAEP is measuring. Or maybe ten other things. But just as we can’t definitively thank individual reform efforts for generally rising scores (as we are often reminded by anti-reform folks), we can’t definitively blame those efforts for small downturns.

Doing that would be kind of like blaming Nicolas Cage movies for the number of people who drown in swimming pools. Or pointing to U.S. spending on science, space, and technology as a leading driver of certain types of suicides. Yes, my friends, these spurious correlations are real. See for yourself (courtesy of tylervigen.com, which is one of my favorite websites of all time):
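Spurious correlations like these are easy to manufacture: any two series that share a trend will correlate, causation or no. A toy sketch in Python, using invented yearly counts (not the real tylervigen.com data):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up yearly counts: both series merely drift upward over time,
# yet they end up correlated almost perfectly.
cage_films = [1, 2, 2, 3, 3, 4, 4, 5]
pool_drownings = [95, 100, 98, 109, 112, 120, 118, 130]
print(round(pearson(cage_films, pool_drownings), 2))  # 0.99
```

A correlation of 0.99 between two series that have nothing to do with each other, courtesy of nothing but a shared trend. Swap in NAEP scores and your least favorite reform policy and you get the same illusion.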

None of this is to say that we should rest on our laurels or reflexively blow off the downward turn as an aberration. Any downturn, small or large, temporary or permanent, probably holds some useful lessons for us. But let’s be sure we’re drawing informed lessons from the data rather than force-fitting them into the anti-reform box.


4 Responses to “Don’t Fall Victim to MisNAEPery”

  1. Ed is Watching » New PARCC Scores Are Ugly, but the Real Question Is Why on 13 Nov 2015 at 2:03 pm #

    [...] with them. Some of you may remember that this was also a leading explanation for this year’s NAEP score dip. The case here is often made by pointing to Kentucky, which in 2010 became the first state to adopt [...]

  2. Re-Assess Common Core and Consider Reversing Direction on 25 Dec 2015 at 7:50 pm #

    [...] not beyond presenting visually misleading data to prop their claim that nothing has changed (in these charts, for example, the author draws NAEP with a 50 points/grid, where 10-12 points equal a grade level; in other [...]

  3. Ed is Watching » Little Eddie the Liar? on 29 Dec 2015 at 12:10 pm #

    [...] returned from Christmas break yesterday to find a trackback on a post I wrote back in October about what this year’s NAEP results do and do not mean. In that post, I chided anti-reform [...]

  4. Ed is Watching » New PISA Results Bring the Same Old Disappointing News on 08 Dec 2016 at 4:18 pm #

    [...] What’s holding us back? The short answer is that it’s hard to say. As is the case with NAEP scores (warning: this post contains the infamous graphs over which I had a big, nerdy spat a year ago), it [...]
