How many posts about the former schools chief of a city that’s 1,100 miles from here can one blog suffer? I don’t know, but we’re going to find out.
It seems I was a few days premature in writing about the controversies that continue to dog erstwhile D.C. chancellor and bee-eater Michelle Rhee. Her reforms are the subject of two separate, critical national studies, themselves the subject of a new 3,200-word dissection in a high-profile education policy journal.
The first study, undertaken by Alan Ginsburg, a former director of Policy and Program Studies in the U.S. Department of Education, found that contrary to claims by Rhee supporters, D.C. students performed worse during her tenure than they did under her predecessors. See “The Rhee D.C. Record: Math and Reading Gains No Better than Her Predecessors Vance and Janey.”
The second, by a committee of the highly respected National Academy of Sciences’ National Research Council (NRC), found that gains in student test scores in D.C. between 2007 and 2009 were no better than in 10 other districts for which comparable data was available. You can read a pre-publication version of “A Plan for Evaluating the District of Columbia’s Public Schools: From Impressions to Evidence.”
The article in Education Next, by Executive Editor Paul E. Peterson, the director of Harvard University’s Program on Education Policy and Governance, asserts that both reports made factual and analytical errors in an attempt to discredit Rhee. His conclusion: Rhee wasn’t in office long enough for anyone to draw firm conclusions about her impact on student performance.
If you are hooked on Rhee-parsing as contact sport, you can call up a footnoted version of the piece.
(Full disclosure: Two years ago I accepted a freelance assignment from Education Next, with which I have no ongoing relationship. I know its scholarly editorial team is widely regarded as conservative. I was not pushed to take any particular slant on my wholly apolitical topic — teacher-run cooperatives — just encouraged to be skeptical.)
I am not going to run any numbers for you, because I have been down this road before and know beyond any doubt it’s above my pay grade. Which is kind of my point: Confronted with the kinds of statistics that are used to compare one set of students to another in terms of achievement, most of us are forced to decide whether we trust the person doing the math enough to swallow what they say the take-away should be.
All three reviews build their cases not on D.C.’s equivalent of the MCAs, but on the more credible, apples-to-apples National Assessment of Educational Progress test results. Often referred to as the nation’s report card, the NAEP is given to a representative sample of students.
I wrote about Florida’s NAEP test results in this space two months ago after Star Tribune columnist Katherine Kersten claimed that given that state’s numbers, Minnesota ought to emulate former Gov. Jeb Bush’s reforms. Kersten did in fact supply a set of eye-popping results, but I poked around a little and found that not only had she omitted some case-puncturing ones, but there were caveats big enough to drive a fleet of big yellow buses through.
After that post, I was contacted by a local education policy advocate and self-described statistics geek who spent the better part of two work days slicing NAEP data with me to demonstrate that what the tests really show is that Minnesota kids outpace Florida’s on all but a couple of markers.
You never saw that post for a couple of reasons, chief among them that by the time all of the asterisks and qualifiers had been appended it was every bit as long as Peterson’s piece. Worse, unlike him I could not actually understand the analysis I was contemplating publishing, much less defend it.
(I confess to you in total honesty that I took to my fainting couch, rendered weak by the memory of a humiliating episode in which I, a graduate student working in the Los Angeles Times D.C. bureau, regurgitated a federal Labor Department release crediting former President George H.W. Bush with bringing unemployment to a historic low. I did not realize he did so by deciding hundreds of thousands of people were never going back to work and should simply be erased from the rolls.)
So, no numerical arguments from me. I do, however, want to highlight one interesting passage from the EdNext story concerning the National Academy of Sciences panel’s report. The committee — which was made up of people who parse data for a living — concluded that test scores had risen modestly during Rhee’s time and continued to do so. But the scientists would not say whether this happened because of her reforms; in effect, they said, the data showed a correlation but could not establish causation.
You know what that sounds like to me? Standardized tests don’t necessarily tell us much more than how a particular student, on a given day, does answering a series of questions that are actually quite limited in what they measure. They do not reveal whether a botched equation is the result of a missed breakfast, years’ worth of missing skills or a union-protected dullard of a teacher.
There is one number I think I understand. The National Academy of Sciences raised and spent $650,000 compiling its report, which it describes not as an evaluation of Rhee’s reforms but as “guidance on how to structure” said evaluation. I make that to be about 76 times the average amount of state funding Minnesota provides for each public school student.
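(For anyone checking my arithmetic: $650,000 divided by 76 works out to an implied per-student figure of roughly $8,550.)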