Experts have long known that early detection of serious mental illness can lead to more effective treatment. With this in mind, a group of researchers at the University of Minnesota Medical School was recently awarded a grant from the Defense Advanced Research Projects Agency’s Neural Evidence Aggregation Tool program to develop a way to better identify early symptoms of depression, psychosis and suicidality. This tool, which is still in the early development phases, could detect subconscious brain impulses that signal early development of these serious mental illnesses.
Alik Widge, MD, PhD, project lead and associate professor at the University of Minnesota Medical School, said that the project, known as Fast, Reliable, Electronic, Unconscious Detection, or FREUD, would use measurements of brain activity to detect early, subconscious signals of hesitation or pause over specific images that may signal the early stages of these conditions.
The project’s name gives a clear clue to its intention.
“Technology projects should have catchy acronyms,” Widge said. The purpose of this project, hinted by its nod to the famed psychoanalyst, is, he explained, “to identify signals that tell us when something is in someone’s unconscious or when they react to an image that represents something they are avoiding or feel conflicted about. We want to know what is in someone’s unconscious mind and how that impacts their actions.”
When I spoke with Widge last week he was excited to talk about his team’s plans for the ambitious project. This interview has been edited for length and clarity.
MinnPost: What can you tell me about this project and how it could be used in the real world?
Alik Widge: We are focusing on detecting some of the most challenging mental health symptoms: suicidality and severe emotional distress.
The idea is that feelings of conflict often occur when someone has thoughts of ending their own life or harming themselves and they are sitting in front of a physician and they get asked, “Have you had thoughts of hurting yourself?” There is this moment when the person thinks, “Should I? Shouldn’t I?”
They might know that things are getting worse and they want help, but they are scared to ask. Maybe they are scared because they think, “What if I end up in the hospital, take a med I don’t want to take or have to take time off of work? What if I am not as OK as I am trying to convince myself I am?” That can lead them to push that out of their mind and think, “I’m OK. I wasn’t really just staring at a bottle of pills last night.”
We’re looking for a way to isolate that moment of internal conflict, to create a kind of test that could put a pin in that moment of conflict. People in the media have been saying we are building a suicide detector. It is not a suicide detector. It is an enhancement to the way we do risk assessment.
MP: It seems that many people have these kinds of internal conflicts. Why do you and your colleagues believe that identifying those moments would be helpful?
AW: This moment of conflict often presents itself in patients. It is particularly common if a person is meeting with a physician they don’t know well, which happens often in mental health screenings.
The idea we thought about with this project is, What if you can detect that moment, that flash of, “Should I or shouldn’t I?” When that happens, what if we could somehow clue in a physician to think, “Maybe you want to ask that question one more time or come back to it. Don’t take no for an answer. Dig a little deeper.”
MP: So you’re talking about that moment in one of those mental health screenings you get at an annual checkup that asks something like, “Have you ever thought about harming yourself?”
AW: Yes. A lot of patients at first say no when they are asked this question. In my own personal experience, a patient might say no, then I’ll come back 15 minutes later and ask, “You’re really sure this has never gotten bad enough that you’ve thought, ‘I’m better off dead?’” Then there’s this moment, a sigh, a shoulder slump. They say, “I don’t want to talk about it.” I say, “We’ve gotta talk about it.”
MP: Who is the intended audience for this test?
AW: It could be given to anybody, but there are some groups it would be particularly helpful for. One of those populations is members of the military, a group that has a high rate of suicide. This is why it is a Department of Defense project. Suicidality and the avoidance of talking about it comes up often with service members. They know that if these kinds of thoughts come up, they will get pulled off the front line and sent off to recuperate. They feel guilty: They’re abandoning their post. They will tell themselves, “I’m fine,” when they really aren’t fine.
From personal experience I know that this also happens frequently with doctors. Most doctors who kill themselves go to work that same day. And most doctors are so bad at admitting they need help.
MP: Do you think a test like this could eventually be given to everybody when they go in for their annual checkup?
AW: Of course. If you get that little bit of clue, that extra edge, if you get that information during a physical, you could potentially save someone’s life.
MP: This is a four-year award. How will you stage your research?
AW: The first two years are experimental. We will be working with people whose mental health history we know and whose answers to certain questions we can anticipate. They will be with a physician they trust.
To measure their reactions, we will put a big, old EEG cap on them. There will be lots of machines focused on them: fancy cameras, a whole room of equipment trying to detect that little brain signal. We’ll try to insert that moment of conflict among phrases that might seem neutral, like, “The Vikings game last night was great,” vs. “I wish I was dead.” We’ll be watching what their brain does in reaction to that. Through that complex equipment we will identify those signals. In those first two years we will figure out what signals can give you that information and where in the brain they can most likely be seen.
MP: What you’re describing sounds like a lot of hassle. I’m imagining that patients would find it strange to be all wired up, with a big EEG cap.
AW: We’d eventually reduce the size of the machine used to detect these impulses. Maybe it would be something that’s the size of a large sunhat that you can just put on someone’s head.
MP: What happens during the next two years of the project?
AW: The first two years are to figure out the science, to figure out where we want to look and what we want to look for. Then, the next two years, we’d try to figure out a version of the product that we can actually put in doctors’ offices, something we can replicate with a turnkey scalable system and move into clinical trials.
MP: How do you think patients would react to taking a test like that, even wearing the smaller, sunhat-sized cap?
AW: It’s a good question. I think we’re going to have to see. One thing that’s been tossed around in the psychiatric literature for a while is the question of whether people will be more honest with a computer than they would be with a person. People don’t feel judgment when they are answering questions from a computer vs. from a human.
Will they be willing to wear a sunhat-type thing? Maybe — especially if they see six other people sitting around with a sunhat-type thing on. It may be an early adopter thing. Over time, if it works, it may become more normalized.
MP: What could this test be like?
AW: Imagine a person sitting in front of a screen and it’s putting up images, sentences, standardized prompts meant to trigger feelings of conflict. Maybe you are asking a person with every picture, “Is this something that relates to you or not?” Some are death-related themes. Some are not, like neutral landscapes.
What we are looking for in those moments is when you see that little electrical hiccup in the brain in reaction to an image or phrase that signals, “I’m not sure how I feel about this.” We are looking for that sign of that moment of conflict.
MP: Could you give me an example of what some of those images could look like?
AW: We’ll figure out what images are evocative for people with known suicidality. It might be a picture of a coffin or a funeral or a bottle of pills with a hand next to it. We don’t know what’s going to be best. That may need to be customized to a person. We may need to develop adaptive algorithms that narrow these themes to a specific person.
MP: Would the test be different for identifying early symptoms of psychosis?
AW: It wouldn’t be that different. With psychosis, people are hearing voices that other people can’t hear and seeing things other people can’t see. When this happens, a person’s first thought is often, “I can’t tell people. They’ll think I’m crazy.” They are scared of taking medication, and yet if they took it, we know early intervention bends the curve and prevents people from having major episodes and hospitalization.
MP: I’m imagining that an experienced psychiatrist like yourself might not need a test like this, that you have seen so many people with depression, psychosis or suicidality that you could detect these signs as easily as this test could. Would you really need a test like this?
AW: The awful truth is that clinicians are only marginally better than chance at guessing who is or isn’t going to hurt themselves. I am no better than chance at doing this. If I see 20 people in my clinic, I can’t accurately tell you who is going to commit suicide in the next week.
Because of this, clinicians need an extra edge. Even I, as an alleged expert, understand that there are too many times I walk out of an interview and I’m really hoping I made the right call. Should I have said, “You are not leaving this office. You are going into the hospital,” vs. “Go home. And please, for the love of God, show up next week?” There are many times I go home on a Friday and worry about someone all weekend. If I could easily detect their potential suicidality when they were in the office, that would be great.