MinnPost's education reporting is made possible by a grant from the Bush Foundation.

Minnesota test-results takeaway: ‘Our kids did not get dumber overnight’

While the number of students scoring proficient in reading plummeted, that result reflects a change in the definition of proficiency, not a decrease in students’ knowledge.


Today your local newspaper doubtless carries the hotly anticipated results of the 2013 Minnesota Comprehensive Assessments (MCAs), the standardized tests used to determine the number of students who are proficient at math, reading and science.

If you’re like most people, you crack it open, start tripping over the acronyms by which the sundry exams are identified, begin sinking into a quagmire of terms like “cut scores” and “weighted norms” and grab frantically for the false security of the one thing that seems easy: the percentage of students passing in your corner of Lake Wobegon.

I have a better idea. Roll that paper up and use it as a prod to herd your friendly neighborhood psychometricians — you know, the nerds who create the tests — into a pen where we’ll keep them for a few months to determine whether we as a state have finally arrived at a workable, meaningful testing regime.

If the next set of results bears out this promise, we can let them out and start paying attention again. That next set is Minnesota’s Multiple Measurements Ratings (MMR), which combine proficiency, yearly growth and the rate at which a school or classroom is closing the achievement gap.


Due out around Oct. 1, the MMRs reveal which schools are making accelerated gains with students who start out behind, which are lagging or succeeding on both proficiency and growth, and which have student bodies that start out highly proficient yet don’t make big strides. A homegrown system, they provide a much sounder basis for making judgments about a school’s strengths and weaknesses.

Definition of proficiency changed

What’s that? You already looked at today’s less nuanced numbers and, frighteningly — and incorrectly, let’s just get that right up front — it appears reading proficiency statewide has plummeted? Worse, you fear you’ve heard the explanation — that it’s because a new, much tougher test was brought online this year — before? Sounds like some context is in order.

First, the takeaway: Math scores are flat statewide compared with last year, at 61 percent proficiency, but up five points over 2011. They are up three percentage points for the second year in a row in Minneapolis, and up three points, to 44 percent, in St. Paul. Neither district made significant strides in narrowing the yawning gaps between affluent learners and poor ones.


The raw percentage of students scoring proficient in reading plummeted, from 76 percent to 57 percent statewide. But that reflects a change in the definition of proficiency, not a decrease in students’ knowledge.

“Proficiency is not dropping,” state Education Commissioner Brenda Cassellius said Monday. “Our kids did not get dumber overnight.” Rather, the tests have gotten harder over the last decade-plus.

Designed to comply with NCLB

Administered in grades 3-8 and then once again in high school, the MCAs are the tests that were designed to comply with No Child Left Behind (NCLB), the 2001 federal education-reform mandate. In their first iteration, they were supposed to reveal, by race, disability status and English-language facility, whether a school was failing some or all of its students.

The data confirmed, of course, that lots of schools were in fact failing large numbers of poor and minority students. But beyond that, many of the tests were lousy. They did not measure individual students’ performance from one year to the next or show where their gaps in knowledge lay. So they were useless to teachers and school administrators, many of whom then administered more tests in an effort to glean useful information.

On a policy level, NCLB did not set a single standard for proficiency. And because there were penalties associated with the failure to increase performance, some states — Minnesota was not among them — reacted by lowering their bars.

A consortium of state education leaders got together and decided it was time for tests that not only set a single, high benchmark for performance but also measured the things higher education and employers were looking for, such as problem-solving skills, creativity and critical thinking. Minnesota’s teachers were among those who helped design and test the Common Core State Standards, which were to be adopted voluntarily.


Minnesota was already preparing to roll out an even tougher math exam, so it adopted only the new Common Core reading standard. And it began working on a test that, much like the ones many districts were already using to glean actionable data, would measure individual students’ fall-to-spring progress and reveal specific knowledge gaps.

In 2011, the new math test was administered for the first time. Proficiency levels fell.

In 2013, the new reading test was administered for the first time. In addition to requiring higher-level thinking, it sets a higher standard for proficiency. And so while passage rates in math fell about 10 points in 2011, the drop in reading this year was more like 20 to 30 percent.

Which is not to say that Minnesota’s students became less proficient. The test wizards at the state Department of Education have figured out how to control for differences in the new and old reading tests. So when the MMRs are released in a few weeks, it should be possible to identify trends.   

“It’s really a nightmare for testing people to try to explain,” said Dave Heistad, Bloomington Public Schools’ director of research, evaluation and assessment and one of the region’s best brains on exams.

“For a parent looking at an individual child’s results, proficiency” — as in, is the student on track — “is important,” he explained. “But if you’re going to look at a school, you really need to look at growth.

Growth is key

“So the state and schools need to do a good job explaining the MMR,” Heistad continued. “If proficiency goes down [with different or more rigorous tests] but growth is strong, a school is OK.”

The complexity of understanding what a test does and doesn’t measure is one reason policymakers here and around the country often advocate assigning schools stars or letter grades or some other designation that is supposed to convey simplicity. Unfortunately, these under-nuanced approaches haven’t worked much better.

Heistad thinks he might be approaching a more useful system. Because the newest generation of tests is tied to college-readiness benchmarks, it’s possible to create a graphic that shows where a student is relative to the average score of someone headed to a four-year college, to a specific school such as the University of Minnesota or even an elite one like Carleton College.


“You can now tell by fifth grade whether a kid is on track for college, and you can do something about it if they’re not,” he said.

His predecessor in the job, Jim Angermeyr, is more skeptical. He fears that years of changing standards and the corresponding dips in proficiency invite the general public to simply start ignoring the data.

Educators, meanwhile, are now primed to swing into action when a student is not proficient, Angermeyr added. Students’ schedules, class assignments and other things often are changed in response, potentially putting them on a lower track.

“The discussion about the higher standards is good, the idea of focusing on preparing kids for college is good,” he said. “I think schools need to be very careful about identifying kids as non-proficient based on these Common Core Standards.”

Worse, he fears, any drop in perceived proficiency, whether real or not, could become a policy cudgel to be swung by anyone who has a vested interest in continuing to label some schools as failing: “Some of us who are still cynical wonder about the political motivation behind this.”

Cassellius, meanwhile, has a fairly simple message: It has taken well over a decade for Minnesota to come up with a set of standards that are high enough and that drive toward the right results.

Teachers can use results quickly

The new exams come with practice tests teachers can administer in the fall so that they can use the results, now available very quickly online, to individualize instruction throughout the year, she said Monday.

And they align beautifully with the end game of college- and career-readiness, Cassellius added. The kinds of thinking the Common Core tests require for proficiency are the ultimate goal.

“This is what we ask our kids to do in college,” she said. “The whole goal is not to have to take remedial coursework in college.”

“Now that we’ve done this” — replace inadequate tests with better ones — “let’s stop changing the goalpost on our teachers,” said Cassellius. “Let’s respect them and give them the time to build their toolkits.”


“This is not a walk in the park,” she said. “But I think it’s a much more honest and transparent picture.”