Community Voices features opinion pieces from a wide variety of authors and perspectives.

Rules of engagement for Minnesota’s education wars

Kent Pekel

Although in typical Minnesota fashion we have kept the skirmishes civil, education wars are raging across our state. Fierce debates about what and how we teach our children are underway between districts, unions, advocacy groups and politicians across the political spectrum. Those battles will be waged with even more intensity as races for governor, the Legislature and school board heat up in the months ahead.

The weapons of choice in Minnesota’s education wars include facts and figures that the combatants use to support their proposals and to puncture opposing arguments.  While it is great to see data being used to inform our public debates about education, in recent months I have grown increasingly concerned about the way the numbers are being explained and spun in Minnesota today.

Source: Minnesota Department of Education
The graphic from MDE’s press conference

For example, when the Minnesota Department of Education announced in February that the state’s graduation rate had increased from 75.49 percent to 79.48 percent, the press conference and the press release featured a graph with a vertical axis that began at 70 percent and ended at 80 percent. That made Minnesota’s promising but still limited gain look much bigger than it would have on a standard graph with an axis that starts at 0 percent and extends to 100 percent. 

The same data, but on a 0-100 scale.
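The distortion is easy to quantify. As a rough sketch (my own illustration, not from the article): using the graduation rates cited above, this computes how much taller the "new" bar looks than the "old" bar under each axis choice.

```python
def visual_ratio(old, new, axis_min):
    """Ratio of the two bar heights as drawn, when the axis starts at axis_min."""
    return (new - axis_min) / (old - axis_min)

old_rate, new_rate = 75.49, 79.48  # rates cited in the article

full_scale = visual_ratio(old_rate, new_rate, 0)    # standard 0-100 axis
truncated = visual_ratio(old_rate, new_rate, 70)    # 70-80 axis, as in the MDE graph

print(f"0-100 axis: new bar is {full_scale:.2f}x the old one")
print(f"70-80 axis: new bar is {truncated:.2f}x the old one")
```

On the full scale the new bar is only about 5 percent taller; with the axis starting at 70, it appears roughly 73 percent taller, even though the underlying gain is the same four points.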

Two simple rules

All of the parties to Minnesota’s education conflicts should be able to agree that Minnesotans deserve accurate and unvarnished data to inform the decisions they make about education at the ballot box and elsewhere. Toward that end, I propose that all of us who speak and write publicly about schools adopt two simple but powerful rules of engagement as we advance our ideas. 

1.  Emphasize trends. It is standard practice to compare one year’s test scores, survey results and graduation rates to the outcomes achieved by students in the same grade the previous year. While that is a relevant comparison, it’s important to remember that it contrasts the performance of two entirely different groups of young people. As anyone who has been a teacher can tell you, the demographic composition and character of one cohort of students can differ significantly from another. That is one reason scores often go up one year and down the next.

Rather than focusing public attention on those year-to-year shifts, leaders should emphasize trends that have persisted for three years or more. It’s only when we have at least three such data points that we should start to get excited about improvements or concerned about declines. 

2.  Offer evidence when we claim that something works. When the improved graduation-rate data were released last month, policymakers on one side of the aisle cited them as evidence of their strategy's success, while policymakers on the other side argued that the gains occurred because the state eliminated the graduation exams that students previously had to pass to earn a diploma. Neither group cited evidence to back up their opinions in the news stories that carried their comments.

Figuring out what causes a change in educational outcomes is extraordinarily difficult, because so many factors influence the way young people perform in school. For instance, yet another contributor to the recent rise in Minnesota’s graduation rate could have been the decline in jobs open to high school students that occurred during the Great Recession and its aftermath. That’s a pattern that has been observed during previous economic downturns in the United States, as more students decided to stay in school when there were fewer employment options outside of it. 

Claiming credit, casting blame

Given the complexity of connecting causes and effects in education, leaders should be cautious about claiming credit and casting blame when a change occurs. At those times when we do feel confident enough to make a claim of causation, we should back up our assertions with data and research. And if the available data and research don’t directly address the issue at hand (which unfortunately happens a lot in education), then we should at least explain in clear, logical terms what we think produced the change and why. In other words, when we link an input to an output, we should clearly connect the dots between the two.  

Given the fierceness with which Minnesotans are waging their battles over education these days, it is unlikely that the warring sides will eagerly embrace rules of engagement like the ones I have suggested here. If, however, reporters and the public begin to look for trends and ask for evidence, then a change in our education debate is not only possible but likely.

That won’t end Minnesota’s education wars, but it will make them a lot more productive.

Kent Pekel, Ed.D., is the president and CEO of the Search Institute. Before joining the institute in 2012, he served as the founding executive director of the University of Minnesota's College Readiness Consortium and as the executive director of research and development in the St. Paul Public Schools.



Comments (6)

  1. Submitted by Raj Maddali on 04/08/2014 - 07:09 am.

A URL link

    Should be to the study you are referencing. Not to the MDE website.

  2. Submitted by Paul Brandon on 04/08/2014 - 09:43 am.

    Variability and trends

I agree that a full-scale baseline would be appropriate in analyzing educational (and other) trends, although a knowledgeable reader can usually extrapolate one. The sort of truncated vertical axis (and horizontal/time axis) you show is typical of the business pages.

    Variability is also important.
    If the standard deviation for graduation rates is 1.0, then the results shown approach significance; if it is 5.0, then it is noise.

Of course the fact that the direction of the trend is maintained over 5 years is significant, and strengthens the conclusion that -something- other than random variation is at work, although it doesn’t tell us -what-.

  3. Submitted by Ilya Gutman on 04/08/2014 - 08:19 pm.


These rules should apply to any debate, not only to educational ones. Plus one more: if any statistic is cited, a causal connection should be proven, and not by just a few isolated anecdotes. Maybe then we will be able to get rid of all our myths, including the one about discrimination.

  4. Submitted by Joe Nathan on 04/09/2014 - 12:51 am.

    One thing omitted from the MDE hs grad report

    Good suggestions, Kent.

Perhaps it’s worth noting that until the issue was raised at the press conference, MDE had not mentioned that one reason for the increase in high school grad rates was that as of May 2013, students no longer had to pass reading or writing tests in order to graduate.

At least three districts (MPS, SPPS and Osseo) have acknowledged they had students who would not have graduated under the old requirements.


  5. Submitted by Dane Smith on 04/09/2014 - 08:23 am.

    Yes to broader perspectives and responsible use of charts

Kent Pekel has one of the biggest minds in Minnesota on education policy, and his demand for broader perspectives when presenting data is dead-on. One of my favorite bottom-line charts, neglected by almost everybody, shows the percent of total personal income invested in public K-12 education in Minnesota declining by 20 percent over the last 20 years, from 5 percent of income to 4 percent. Much of what we see from media and some interest groups falsely represents substantial increases in school funding.

  6. Submitted by Kent Pekel on 04/09/2014 - 05:17 pm.

    Quick Comment

Thanks to all of you who raised good points in the comments section. A quick observation in response: the use of data for decision making is growing in almost every field, and establishing causation is no easier in health care or criminal justice than it is in education. That said, I think that the accurate and appropriate use of data in education faces a couple of particular challenges. First, at every level the system is directly governed by elected boards that are generally composed of non-experts. That means that questionable and even outright wrong ideas can gain a lot of traction in influential settings like school boards and legislatures, where experts sometimes have a hard time getting heard. Second, because everyone went to school at some point in their lives, many in the public feel (to some extent correctly) that they know a lot about what should happen in education. Data is therefore interpreted quite immediately and sometimes emotionally through the lens of personal experience. As a result, people are quick to embrace data that confirms their perceptions and reject data that doesn’t. That’s why presenting the data carefully and in context is particularly important in our field.
