New HealthPartners predictor algorithm identifies members at elevated risk for suicide

Their goal is to engage members and work with them to get the treatment they need.


A group of researchers from seven health systems in the United States and Canada, including Bloomington-based HealthPartners, has developed an algorithm that can predict which individuals are at higher risk of attempting suicide.

To identify members at elevated suicide risk, explained Rebecca Rossom, a HealthPartners psychiatrist and senior investigator, researchers from the participating health systems, a group known as the Mental Health Research Network, created a machine-learning model to identify a set of health predictors that place individuals at greater risk of attempting suicide. The model then selected members who had experienced a pre-identified number of those health predictors.

The set of health predictors for suicide risk was, Rossom said, “in a lot of ways what you’d expect. Things like, ‘Have they had a previous suicide attempt?’ ‘Have they had a depression diagnosis?’ ‘A substance-use disorder?’ ‘An alcohol-use disorder?’ We have access to that information and we use that in the models.”

Other suicide-attempt predictors were perhaps less obvious, Rossom added, including an eating disorder diagnosis or a history of benzodiazepine prescriptions: “It is a whole host of things. It’s not always obvious, but it is based on research. We use every element we can gather. It ends up being over 200 data elements.”


Having such a large set of data elements is important to create a truly accurate prediction model, Rossom said: “For different patients it can be a different combination of things. It is not an easy thing to say for a whole population of people, ‘Here are the top 15 things that raise suicide risk.’” She added that, as long as you’ve got good predictors, “translating that down to the individual is where it gets more challenging; the more predictive elements that go into a model, the better.”
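Rossom’s description lends itself to a rough illustration. The sketch below, in Python, shows one hypothetical way the kinds of predictors she names (a prior attempt, a depression or substance-use diagnosis, a benzodiazepine prescription) could be encoded as a numeric feature vector for a risk model. The field names, the member record and the short predictor list are invented for this example; the actual model draws on more than 200 data elements from health-plan and medical-record data.

```python
# Illustrative sketch only: a hypothetical encoding of the kinds of clinical
# predictors Rossom describes into a numeric feature vector for a risk model.
# Field names and the member record are invented; the real model uses more
# than 200 data elements drawn from health-plan and medical-record data.

# A hypothetical member record assembled from claims and medical-record data.
member = {
    "age": 34,
    "prior_suicide_attempt": True,
    "depression_diagnosis": True,
    "substance_use_disorder": False,
    "alcohol_use_disorder": False,
    "eating_disorder_diagnosis": False,
    "benzodiazepine_prescription": True,
}

# The order of predictors defines the layout of the feature vector.
PREDICTORS = [
    "prior_suicide_attempt",
    "depression_diagnosis",
    "substance_use_disorder",
    "alcohol_use_disorder",
    "eating_disorder_diagnosis",
    "benzodiazepine_prescription",
]

def to_features(record):
    """Encode one member record as a list of numeric features."""
    features = [float(record["age"])]
    features += [1.0 if record.get(name) else 0.0 for name in PREDICTORS]
    return features

print(to_features(member))
# -> [34.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0]
```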

Between August 2019 and August 2020, staff from HealthPartners’ behavioral health team used this algorithm to identify 500 members who they believed to be at elevated risk of attempting suicide.

Quanah Walker
Quanah Walker
Quanah Walker is director of behavioral health at HealthPartners Health Plan. He oversees the plan’s team of case managers. He said he and his team appreciated having access to information that applied specifically to HealthPartners members: “We took the research done by Dr. Rossom and others in the Mental Health Research Network and we asked our team at the health plan to translate it into a registry for us using health-plan data.”

By replicating what Rossom and her colleagues from the other health plans had developed, Walker’s team hoped to create a pilot project with the population that they serve in Behavioral Health Case Management.

“It took a good eight months for Health Informatics to take that research and apply it to our HealthPartners data and then apply it to this registry,” Walker said. “We started using [the registry] in August of 2019.” The algorithm identified 500 health plan members who were at higher risk of suicide.

A program created through collaboration

Rossom, who is part of HealthPartners Institute, the health system’s research and education wing, explained that the goal of the Mental Health Research Network is to build collaboration and research capacity among a larger network of health systems. By pooling their data, the group hopes to build more accurate ways to predict potential problems that could easily be missed in traditional office visits.

The new suicide-predictor model is a good example of the power of collaboration, Rossom said. In the study, the results of which were published in the American Journal of Psychiatry, researchers analyzed health data collected from 20 million visits by 2,960,929 members across seven health systems from 2009 to 2015.

“The idea is to have a large enough data sample from all these patients across all these care systems so we have enough data to accurately predict suicide attempts,” Rossom said.


The collected data was fed into a machine-learning process to develop a predictor model that builds a list of members deemed to be at higher risk of suicide.

The machine-learning model, Rossom explained, “takes all of the data we can give it about patients — their age, their gender, their race and any comorbid conditions — and it spits out a model that says, ‘These are the people we think are most highly at risk for suicide.’”

To determine the accuracy of the list produced, Rossom said, “You develop and train the model in 70 percent of the population. Then you test it in the remaining 30 percent. In our case you end up with a prediction curve. You look at the area under the curve and then the closer you get to one, the better the model.”
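For readers who want to see what that evaluation looks like in practice, here is a minimal sketch in Python using scikit-learn and entirely synthetic data. It follows the general recipe Rossom describes: develop the model in 70 percent of the population, test it in the remaining 30 percent, and score it by the area under the curve, where values closer to one indicate a better model. The logistic-regression model and the synthetic data are assumptions made for illustration; this is not the Mental Health Research Network’s actual model or data.

```python
# Minimal sketch of the evaluation approach Rossom describes: train on 70%
# of the population, test on the held-out 30%, and judge the model by the
# area under its ROC curve (closer to 1.0 is better). The data is synthetic
# and the model choice (logistic regression) is an assumption; this is not
# the Mental Health Research Network's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 10,000 "members", 20 numeric predictors, rare outcome.
X = rng.normal(size=(10_000, 20))
risk = X[:, 0] * 1.5 + X[:, 1] - 3.5          # a few predictors drive risk
y = (rng.random(10_000) < 1 / (1 + np.exp(-risk))).astype(int)

# 70/30 split: develop the model in 70% of the population, test in the rest.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Area under the ROC curve on the held-out 30% of members.
scores = model.predict_proba(X_test)[:, 1]
print(f"AUC on held-out members: {roc_auc_score(y_test, scores):.3f}")
```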

Next step? Outreach

At HealthPartners, members of the Behavioral Health Case Management Program brought their focus to the 500 members identified at higher risk of attempting suicide. Members of Walker’s team connect with plan members via letters and telephone outreach to talk about mental health issues and help them make connections to providers and other specialists.

“In a year, we usually work with around 6,000 or so members in our program,” Walker explained. “When we get someone on the phone, we talk to them about how we can help them.”

Their goal is to engage members and work with them to get the treatment they need. “We do assessments of their situation to figure out opportunities, gaps in care, goals,” Walker said. “We find out if they need providers, community resources, etc. We incorporate this into our usual work and progress.”

In their outreach to individuals determined to be at higher risk of a suicide attempt, Walker said, case managers took a low-pressure approach.

“We don’t call someone and say, ‘You got flagged for high risk of suicide. Let’s do a screen on you.’ Instead, we just try to build a natural connection with them, but we also do a Columbia Suicide Screening.” The Columbia Screening, or the Columbia Suicide Severity Rating Scale (C-SSRS), is a standardized, evidence-supported questionnaire used to assess suicide risk. Walker’s team uses the C-SSRS as a way to further understand the individual’s suicide risk category, so they can tailor their interventions.


“If they are high risk, we want to do more up-front work with them,” Walker said. In those cases, he continued, “We could do what a mental health professional would do, including figuring out their intentions, finding out if they have a specific suicide plan. In those cases, we can do things like get a mobile crisis unit involved or have a safety check done. In the majority of cases so far we haven’t had to do that.”

When Walker’s team determines that an individual on the list is at moderate or low risk for suicide, he said they take a slightly different approach: “We have pathways for that as well which involve safety planning, connecting them with a provider or providing interventions that help members be safe.”

Not every member appreciates the idea that their health plan is feeding their health data into a machine-learning model to make a determination about the state of their mental health, Walker said. “There are people that decline to answer our questions. They don’t want to do the [C-SSRS] questionnaire. We’re respectful of that. It is up to them if they don’t want to participate.”

Normalization is the goal

Though she can understand why some people could feel unsettled by the idea of their health plan coming up with this kind of list, Rossom said she believes that some of that unsettled feeling may have its source in deeper feelings of shame that still cloud many discussions about mental illness.

“There could be a little bit of creepiness factor around this, and yet most people wouldn’t feel that way if their doctor told them they’ve run their cardiovascular risk score,” Rossom said. “Maybe part of it is there is usually nothing delicate about talking about cardiovascular health. But there often is something delicate when you start talking about mental health and the risk of a suicide event.”

Rossom wants to encourage people to view their mental health in the same light as they view their physical health.

“These risk models we’ve created are similar to cardiovascular risk models,” she said. With cardiovascular risk models, she added, “We can tell you that your risk of having a heart attack in the next two years is 30 percent.” Suicide-risk models carry a similar message: “If you are in that top five percent of risk, your risk of suicide is 20 times higher than the average person.”

One important method of reducing suicide deaths, Rossom believes, is to work to reduce discrimination around mental illness. The responsibility lies not just with patients and their families, but also with providers.


“We’re trying really hard to get clinicians to talk about this matter-of-factly, the way they talk about cardiovascular risk. The stigma around this is part of what’s killing people. They feel shame. They don’t want to reach out for help. The more we can make checking in about your mental health no different than checking in about your diabetes, this is going to go a long way.”

Rossom said she feels confident that the model she and her colleagues developed at the Mental Health Research Network is strong, though she’s willing to admit there may be some individuals flagged as high-risk who may actually not be at elevated risk for a suicide attempt.

“We can tell you that the model is grouping people the right way,” Rossom said. “It doesn’t mean that everyone in the group is going to go on to attempt suicide. It may be accurately predicting risk, but it doesn’t mean that most people in that group are going to go on to have that event.”

That said, Rossom believes that early interventions, like the outreach calls from members of Walker’s team, can be truly helpful in reducing suicide deaths.

“We’ve done research with people who’ve survived suicide events,” she said. “These attempts can be very impulsive acts.” Many of the individuals interviewed had denied having any thoughts of suicide during earlier medical encounters. “Half of them said, ‘I answered those questions truthfully, but then it just came over me.’ The other half said they knew they wanted to kill themselves when they were interviewed but they said, ‘I didn’t want to say anything so you would intervene.’”

In these cases, Rossom said, any type of outreach could potentially save a life.

“When you are seeing a population of people who have all these risk factors, it is hard to know from the outside who is going to be most at risk from this group.” Predictor lists, she said, may be another way to identify individuals who are creeping closer to a suicide attempt: “It is good to have a tool that gives you an indication of that so you have another opportunity to do what you can to help.”

Algorithm in action

Since they began reaching out to individuals on the registry of members at elevated suicide risk, Walker said, he and his colleagues have encountered a handful of situations where they feel they were able to help avert a potential suicide attempt.

One example, Walker said, was a female HealthPartners member.

“She showed up on the registry as high risk for a suicide attempt. We reached out to her. In our initial conversations she wasn’t identifying any mental health concerns.” In a follow-up call, the member’s behavioral health case manager was able to take the conversation deeper, Walker said: “We were able to complete a suicide attempt screening questionnaire and determine that she actually was having suicidal thoughts.”

The woman told her case manager that she had been thinking about killing herself with a gun, Walker said. “If we didn’t have this screening information, we wouldn’t have talked to her about this, but because we had flagged her as having the risk, we were able to do risk screening and safety planning with her.”

The woman told her case manager that she had guns in her house, and the case manager went to work to make sure that the guns could be stored in a place that the member did not have access to. “We made a plan with one of her family members to make sure that the guns would be placed in a locked box and that the member did not have access to the key,” Walker said. The case manager then helped the member secure appointments with mental health providers so she could get the extra care she needed.

While he believes that an algorithm will never take the place of human connection in treating mental illness, Walker said he does see a clear, life-saving benefit from having access to this list.

“What this process allowed us to do was know that there was a concern for this member and then initiate a conversation and address it,” he said. “I really don’t think that would have happened if we did not have that process in place.”