MinnPost's education reporting is made possible by a grant from the Bush Foundation.

Why some educators have serious problems with Minnesota’s school accountability system

Four years after it was launched, the Multiple Measurements Ratings system doesn’t accurately and fairly reflect all students’ academic growth, critics say.

Four years after its launch, Minnesota’s homegrown school accountability system is under increasing fire from a growing number of educators and policymakers for failing to accurately and fairly reflect all students’ academic growth.

“It’s wildly incomprehensible and arbitrarily designed,” says Kent Pekel, executive director of the Search Institute. Pekel, who has written a detailed critique of the MMR, was on the state advisory committee that agreed to what he says was supposed to be a temporary set of measurements. “It’s just a bad system and it’s time to move beyond it.” 

Released last week, the latest round of Multiple Measurements Ratings (MMR) sparked a flurry of headlines touting state officials’ claims that the numbers showed two-thirds of Minnesota schools are on track to cut academic achievement gaps in half by 2017. Education Commissioner Brenda Cassellius chastised the Minneapolis and St. Paul districts for failing to participate in a state program she credits for progress in the on-track schools.

Yet both assertions are misleading, according to critics of the system — a number of whom are frustrated enough this year to break their silence on the issue.


Among the concerns is that the numbers are being used in political ways. “When you say two-thirds of schools are on track to close the gap, that suggests kids everywhere are making progress toward closing the gap,” says Pekel, the founder of the University of Minnesota’s College Readiness Consortium and an education official in the Clinton administration. “But that’s counting schools, not the number of kids. There are many more kids in the third of schools that are not making progress.”

Released several weeks earlier, the Minnesota Comprehensive Assessments were essentially flat statewide. That those scores are used to calculate the MMR, which is touted as showing the gap closing, is confounding to many.

“The Minnesota Department of Education has consistently reported the outcomes of the MMR in ways that support the policies of the department and the Dayton Administration,” Pekel adds, “rather than as the objective agency that could serve as a convening force in Minnesota’s divisive educational debates.”

Specifically, many of the schools topping the rankings have worked with the department’s Regional Centers of Excellence because they are in small districts or in Greater Minnesota. The centers were designed to provide the kind of technical assistance that Minneapolis, St. Paul and other urban districts already had in house.

Yet a Sept. 1 Star Tribune story about this year’s MMR release described Minneapolis and St. Paul as “resistant” to asking the state for help. 

“If the state really wants to meet its goals, we are going to have to see Minneapolis and St. Paul also improving,” Cassellius told the Strib. “We are ready to go all-in, but schools are locally controlled. But we are ready to do all hands on deck.”

One of Pekel’s top objections to the MMRs is that the results were supposed to be used to identify successful strategies so more schools could adopt them. It’s not clear this has happened, he says, and if the Regional Centers of Excellence have pinpointed great practices, “a much more aggressive state would be mandating them.”

“I appreciate the attempt to not just look at sheer proficiency rates,” says Mauri Melander, principal at Minneapolis’ Lucy Laney, a school that has consistently struggled to see its gains reflected in state measurements. “I just think every time you try a new formula you’re going to find flaws. You want two and two to always equal four, but it doesn’t.

“Before you know it, you’re not looking at children as children anymore,” she adds. “You’re just trying to fill pockets and that’s a sad place to be.”

Too complicated? 

Melander’s colleague at Minneapolis’ Green Central, Matthew Arnold, also has numerous classrooms showing double-digit growth, yet is contending with a failing designation. The coaching and grade-level teamwork he is confident are driving that progress are the same strategies beginning to move the needle at Lucy Laney.


But because of the phenomenally complicated way the MMRs are calculated, neither school’s progress is acknowledged.

Department officials say they addressed a number of concerns about the methodology in 2014. But there are some factors that are non-negotiable, they counter, given that the system was designed to meet very specific federal requirements. 

“Some people don’t realize we’ve been through two iterations of the MMR,” says Cassellius. “I think a lot of educators don’t know we tweaked it to get the variability out of it.”

“Its first weakness is its incomprehensibility,” says Pekel. “You can say what you want about the U.S. News [and World Report] rankings, but if you look in the back you can see how it is calculated.”

The MMR system compiles scores in three “domains,” plus graduation rates in the case of high schools, to come up with a number on a 0-100 index. How schools show progress toward the three — proficiency, growth and progress toward reducing the gap by half — varies. 

Schools near the top earn “Reward” and “Celebration-eligible” designations, while those near the bottom are singled out for interventions. Because schools compete against one another on some measures rather than against a fixed goal, it’s hard for those at the bottom to move up.
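
As a rough illustration of how a composite 0-100 index of this kind can be assembled, the sketch below (in Python) simply averages a set of domain scores. The equal weighting, the averaging rule and the sample numbers are simplifying assumptions for illustration, not the state’s actual MMR calculation.

    # Rough, hypothetical sketch of a composite 0-100 school index.
    # Equal weights and simple averaging are illustrative assumptions,
    # not the Minnesota Department of Education's actual MMR formula.
    def composite_index(domain_points, graduation_points=None):
        """Average domain scores already expressed on a 0-100 scale."""
        points = list(domain_points)        # e.g. proficiency, growth, gap reduction
        if graduation_points is not None:   # graduation rate counts only for high schools
            points.append(graduation_points)
        return sum(points) / len(points)

    # An elementary school and a high school with identical domain scores:
    print(composite_index([62.0, 48.5, 55.0]))                          # about 55.2
    print(composite_index([62.0, 48.5, 55.0], graduation_points=90.0))  # 63.875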

Tougher for schools with diverse populations

MinnPost called Pekel after hearing from several school- and district-level assessment coordinators who spent time trying to figure out why they landed near the bottom — despite what they felt was strong growth in the classroom.

Particularly in urban areas, leaders of schools with large populations of students of color have long groused privately that the MMRs do not reflect their year-over-year growth.

The methodology, they insist, is weighted against schools with concentrations of impoverished children of color. Schools with large numbers of poor white students can earn a quality designation by making progress on just one indicator, while those serving more diverse populations must meet a dozen targets. 

Several of the schools touted in last year’s state press releases, for example, are located in Greater Minnesota; they have large numbers of low-income white students and not enough children of color, special education students and English-language-learners (ELL) to count. (A subgroup must have 20 students in order to be taken into account by the MMR.) 

By contrast, urban schools with more diverse populations have to hit many more targets for academic performance to earn the same points. More baffling, students in a particular racial or ethnic group (Latinos, say) must also meet separate — and often higher — targets based on poverty, special education and ELL status.
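
The practical effect of that counting rule is easier to see in a simplified sketch. The 20-student threshold comes from the rule described above; the schools, group labels, enrollment figures and the idea that each counted group generates its own set of targets are illustrative assumptions.

    # Simplified sketch of the subgroup-counting rule described above.
    # The 20-student minimum is from the article; the schools, group labels
    # and enrollment figures are made up for illustration.
    MIN_SUBGROUP_SIZE = 20

    def counted_subgroups(enrollment_by_group):
        """Return the subgroups large enough to generate their own targets."""
        return [group for group, n in enrollment_by_group.items() if n >= MIN_SUBGROUP_SIZE]

    rural_school = {"white": 310, "free/reduced-price lunch": 140, "Latino": 12, "ELL": 8}
    urban_school = {"white": 90, "Black": 260, "Latino": 180, "special education": 65,
                    "ELL": 150, "free/reduced-price lunch": 430}

    print(counted_subgroups(rural_school))  # two groups, two sets of targets
    print(counted_subgroups(urban_school))  # six groups, six sets of targets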

A complicated history

Some of this is due to federal law, Cassellius counters. “Every single student in every single group needs to count,” she says. “We need them to increase.” 

Cassellius says it’s not realistic to look at changing the calculations behind the MMR while a rewrite of No Child Left Behind — the law that mandated student progress be reported in the first place — is before Congress. (Over the summer both the U.S. House of Representatives and the Senate passed revisions that would let states take control of accountability, but it’s unclear whether lawmakers will be able to bridge the chasms between the two bills. A conference committee headed by Lakeville Republican Rep. John Kline has made little progress so far.)

The impasse in replacing the 2001 No Child Left Behind (NCLB) is part of the reason the MMR was created. After years of gridlock, in 2011 U.S. Secretary of Education Arne Duncan announced that states willing to develop better accountability systems could win waivers from NCLB.

At the time, it was unclear whether Duncan would eventually grant most states waivers. NCLB’s unfair and punitive consequences were beginning to be felt throughout the state, and officials feared they might have only a short window to win relief.

Pekel was on the panel that advised the state on its waiver application. The panel agreed that the goal — cutting the state’s yawning achievement gaps by half by 2017 — was both ambitious and reasonable. “We really didn’t know whether that was a window that was just going to close,” he says. “The feds did require a ratings system to add up to a number.”

All in all, the MMR was judged much better than NCLB’s assessment system, yet still far from perfect, he says. It was understood that a more sound system would be developed after the waiver was secured. Indeed, some of Pekel’s objections are reflected in the feds’ initial response to a draft of Minnesota’s winning waiver application.

State officials say they “tweaked” the system in 2014 when Minnesota sought a one-year extension of the waiver. Earlier this year, the state was granted a second waiver.

Depending on what happens in Congress, Cassellius says she envisions the World’s Best Workforce legislation, passed several years ago, as the spine of a better state model, one in which districts would have more local control over accountability.

“People are reacting to the judgment,” says Eric Moore, Minneapolis Public Schools research and evaluation chief. “Obviously the more groups you have, your work is cut out for you more than a school that has maybe one subgroup.”

Adding to the frustration: two Minneapolis high schools, Edison and Patrick Henry, would have qualified for recognition had fewer students opted out of the 2015 tests.