
Minnesota is really good at collecting student data, but not the best at using it

Earlier this month, the state office of the legislative auditor released a report detailing the administration of standardized tests and the use of these test scores. In brief, it found that Minnesota schools invest “significant time and resources” in having students take the computerized tests, but concluded that “their usefulness is limited.”

These standardized tests include the Minnesota Comprehensive Assessments (MCAs), which all students take in reading, math and science to measure their proficiency and growth in these areas. Students take both the math and reading MCAs in grades 3-8, then the reading test again in 10th grade and the math in 11th. The science MCA captures how they're doing in grades 5 and 8, plus one year of high school. English Language Learners take an additional set of standardized tests called the ACCESS tests to measure their English proficiency, from grades K-12, in listening, speaking, reading and writing.

Administering these tests is no small feat. Schools must shuffle schedules so students have access to computers, the report stated. And education specialists who work with special education students or English Language Learners are often pulled from their normal duties to help administer the tests — which, in 2016, spanned anywhere from three to five weeks.

Schools bear costs of their own: some reported hiring additional staff or purchasing computing equipment to administer the tests. On top of that, the Minnesota Department of Education (MDE) spent $19.2 million on standardized tests in 2016, with federal sources covering only about a third of the expense.

The big question

That’s a lot of resources being poured into standardized testing, which raises an important question: How useful are these test results?

According to the audit, of those surveyed, the majority of principals and teachers said standardized test scores are helpful in identifying achievement gaps between groups of students. But more than half of respondents indicated they felt “unprepared to interpret key test score data.”

Theodore Christ

This disconnect between student data collection and the capacity of educators to utilize it isn't unique to the MCAs. According to a lesser-publicized report published last year by the University of Minnesota's Center for Applied Research and Educational Improvement, Minnesota educators believe in the value of education data but struggle to interpret and use it.

The director of the center, Theodore Christ, says educators are in dire need of quality professional development and technical assistance. “I mean, they’re just drowning in [data],” he said. “It’s all over the place. And if they don’t have the capacity to use it, they just turn away from it.”

When it comes to addressing persistent achievement gaps between students of color and their white peers, he adds, “That task is all but impossible unless we know how to use data to inform our decisions.”

Interpreting MCA scores

Minnesota’s standardized tests are given to meet federal accountability standards. They’re designed to serve as indicators of school quality. However, they’re often misunderstood to be a source of diagnostic information for individual students, says Michael Rodriguez, an expert on the interpretation and use of testing data at the University of Minnesota.

And while the school-level scores are valuable in identifying achievement gaps across schools and districts, he says many educators are missing out on the more valuable information that can be gleaned from these tests: how variable kids are within a given school.

Michael Rodriguez

“Schools that get useful information from those MCAs are the ones that do the deeper dives,” he said. “They look at the variability. They look at group differences. They look at: How are students with these kinds of experiences doing versus students who don’t have those experiences, and which kinds of experiences are we giving a kid that helps them perform better? And that requires someone who can go in and break down those numbers and do some analysis. Not many schools have staff that can do that.”

Listing a few exceptions, he says educators at Como Park Elementary in St. Paul have a data wall in their staff room to drive meaningful conversations across teachers about how individual students are doing. And both Twin Cities districts, along with the Anoka-Hennepin district and the Bloomington Public Schools district, have robust research, evaluation and assessment offices to help guide data collection and analysis.

More commonly, however, schools are relying on their district assessment coordinator — generally a staff person, a teacher or a support person who’s been assigned to help with the technical aspects of administering assessments — as their default data expert. To help these point persons move beyond things like how to ensure testing data is secure, Rodriguez says he and a few of his colleagues have been contributing to the Minnesota Assessment Group (MAG) to help members develop their skills around data literacy. That work includes teaching them how to move beyond looking at average school or classroom scores and better understand how to analyze the variability within a group of students. 

A similar shift in how staff at the state Department of Education support assessment coordinators across the state is also taking place. Jennifer Dugan, director of assessment for the department, says she and her team had long been scrambling to keep assessment coordinators trained on the technical aspects of test administration as the standards continued to change.

“With the revision cycle put out a bit … [it] has allowed us to take a breath and say, ‘We can shift our focus now to connecting those dots for educators,’” Dugan said.

With the help of a grant, the department hired Holly Brunson in June to promote data literacy across the state. So far, she’s been busy holding professional development trainings in partnership with assessment coordinators across the state, surveying educators to create a more teacher-friendly interface for all of the data available through MDE, and even bringing educators back to square one — reviewing the test development life cycle — so they have a better understanding of what standardized tests are designed to measure and how educators can weigh in. But there’s only so much outreach Brunson can do on her own.

“How do we have a renewed emphasis on data interpretation and data usage? We’ve been shifting about two years,” Dugan said. “But, especially with only one dedicated staff, it’s going to take a long time to see the fruits of that labor.”

A need for more data literacy tools

Educators may be swamped with student data, but that doesn’t deter them from administering their own assessments to measure the progress of individual students, so they can develop interventions and better target instruction. In fact, many teachers and principals prefer these tests over state standardized tests, the audit states, because they provide more immediate information on students, as opposed to the MCA scores, which don’t arrive from the testing vendor until the end of the school year or later.

This should come as no surprise to those who train prospective teachers in teacher preparatory programs. At the University of Minnesota, for example, the use of standardized testing data is not really a core part of the assessment curriculum, says Mistilina Sato, an associate professor specializing in teacher development at the university. While candidates learn about “testing language” and why the achievement gap data that comes from statewide tests is valuable, they’re much more likely to use their own assessments on a day-to-day basis, she explained.

When it comes to preparing teacher candidates, at least at the University of Minnesota, data literacy tools are currently interspersed throughout a program that is designed to focus more on things like classroom management strategies and curriculum development. Candidates may get some practice using student data during their student teaching experience, but it's not really a priority when they're first starting out.

“Every school seems to have its own assessment culture,” Sato said. “Once you enter into the school, you have to first learn about how that school is using [data].”

Jennifer Dugan and Holly Brunson
MinnPost photo by Erin Hinrichs
Jennifer Dugan, left, is the director of assessment at the state Department of Education and Holly Brunson is the outreach and training specialist for the department.

Rodriguez says he’s been working with Sato on the development of a new course for all students in the teacher prep program that will provide them with a deeper dive into assessment literacy — and help close the gap that researchers like Christ have been sounding the alarm about.

“The vast majority of educators out there are saying that they need support for analysis, reporting, for interpretation and use,” Christ said. “And they need more professional development.”

According to the report, which was based on input gathered from about 800 individual educators and 13 professional education organizations, 69 percent of educators surveyed indicated there’s a need for additional training and support to use data. Even superintendents, principals and other education leaders at the local level indicate their comfort with interpreting and using data is fairly limited: 70 percent of survey respondents rated their capacity for training and supporting high-quality data interpretation as either fair or poor.

In response to these findings, the center is advocating for a new piece of legislation (H.F. 1492) that would fund the establishment of data experts at each of the Minnesota Service Cooperatives located throughout the state to assist districts with things like data literacy professional development offerings and data collection. The proposed budget is about $1 million a year for four years.

“We need to make a decision: Are we going to be a state who simply has decided data is not important? And then let’s stop collecting it, because we’re spending tens of millions of dollars collecting it, but we don’t know how to use it,” Christ said. “Or are we going to be a state who values data and research? And [then] we’re both going to collect that data and support the use of it.”

