"Our kids are suffering," is how one Cincinnati professional summed up the mental health landscape that teens and younger children are navigating today. The markers of a healthy emotional life were already trending the wrong way before the pandemic, but the isolation and trauma of that period drove a national surge in depression, anxiety, suicide, and mental health emergencies. In 2021, three leading U.S. medical associations declared a national emergency in youth mental health.
It's an urgent problem, and in Greater Cincinnati, leaders from the medical, social services, education, and philanthropic communities are collaborating on the issue. Parents, medical providers, caregivers, mental health professionals, and youths themselves will be part of the solution. This story is the sixth in the Soapbox Cincinnati series, Amplifying Youth Voices, which raises awareness of the problem and looks at possible community-based answers to it.
Of all the statistics that frame the crisis in youth mental health, none are more tragic than the numbers on suicide.
The rate at which young people end their own lives grew steadily until, in 2021, suicide became the second leading cause of death for kids ages 10 to 14. But what if school counselors, psychologists, or doctors could discover when young people might be at risk of taking their own lives? With that knowledge, they could alert parents and steer a child into care that might avert a lifetime of mental anguish or the heartbreak of suicide.
That’s the promise of a long-term project underway at Cincinnati Children’s Hospital Medical Center that combines artificial intelligence and supercomputing with information gleaned from millions of patient interactions, as well as data about patients’ communities, families, medical histories, and experiences. All of it is fed into a high-powered computer program that can, with more than 80% accuracy according to Children’s researchers, assess the likelihood that a young person will develop a serious mental illness.
“So often, young people’s mental health issues go for years undetected and untreated,” says Dr. Michael Sorter, head of the division of child and adolescent psychiatry at Children’s and one of the project’s lead researchers. “Can we identify the kids at risk much earlier in their lives, and then evaluate them appropriately rather than waiting until they have this whole cascade of negative events in their life?”
The project is called “Trajectories” because it aims to trace the arc of potential mental illness over time, much as pediatricians predict a toddler’s future growth from height and weight measurements. But forecasting emotional problems is far more complicated.
The project’s data crunching began with anxiety, which an estimated 32% of teens have experienced in one way or another, while 8% have suffered bouts so severe that they were impaired by them. Searching Children’s medical records going back to 2009, researchers found 1.2 million diagnoses for anxiety, said John Pestian, who runs the computer science lab at Children’s that is the hub of this work. Those diagnoses were based on 16 million notes entered into the medical record by clinicians.
Pestian’s team has built a database from the volume of clinicians’ notes, information about patients’ families and lives, and publicly available information about their communities. What is the violent crime rate in the neighborhood? What is the family situation? Is there a divorce? Has the child been exposed to violence or some other trauma? What is the average income in the neighborhood? What is the education level?
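For readers curious what "data points" means in practice, here is a minimal sketch of the kind of per-patient record such a database might combine. Every field name here is an illustrative assumption, not Children's actual schema.

```python
# Hypothetical sketch of a per-patient feature record like the one
# described above. All field names are illustrative assumptions,
# not Children's actual database schema.
from dataclasses import dataclass, asdict

@dataclass
class PatientFeatures:
    note_count: int                 # clinician notes in the medical record
    anxiety_diagnoses: int          # prior anxiety-related diagnoses
    exposed_to_violence: bool       # known trauma exposure
    parents_divorced: bool          # family situation
    neighborhood_crime_rate: float  # violent crimes per 1,000 residents
    median_income: float            # neighborhood median income, USD
    avg_education_years: float      # neighborhood average years of schooling

record = PatientFeatures(
    note_count=42,
    anxiety_diagnoses=3,
    exposed_to_violence=False,
    parents_divorced=True,
    neighborhood_crime_rate=6.1,
    median_income=48_500.0,
    avg_education_years=13.2,
)

# A model consumes a flat numeric vector, so booleans become 0/1.
vector = [float(v) for v in asdict(record).values()]
print(vector)  # → [42.0, 3.0, 0.0, 1.0, 6.1, 48500.0, 13.2]
```

Multiply a record like this across millions of patients and years of notes, and the scale of the computing problem becomes clear.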
These millions of data points are fed into a supercomputer at the Oak Ridge National Laboratory, the federally funded research center in Tennessee that worked on the Manhattan Project. Children’s has invested $10 million in the computing facilities and experts at Oak Ridge, buying it access to work on today’s version of the Manhattan Project: solving youth mental health. Oak Ridge’s Frontier supercomputer is said to be the world’s fastest, able to perform two quintillion calculations per second. Children’s researchers are training the computer to recognize the possibility of mental illness by feeding it tens of millions of pieces of information it can use to assess patients in the future.
“If I were to do that on my desktop, it would take 10 years just to do one episode,” Pestian says. “If I do it on the Frontier at Oak Ridge, it just takes a couple seconds.”
Pestian leads the Computational Medicine Center at Children’s, which is developing advanced technology for early detection of psychiatric illnesses. His team’s work in AI and machine learning has already been deployed to help treat epilepsy in children. By processing volumes of patient data, including subtle changes in physicians’ written language in the medical record, the program identifies candidates for epilepsy surgery much earlier. The surgery is effective, resulting in long-term freedom from seizures for two-thirds of patients, and identifying candidates earlier can mean more years of seizure-free life.
Using these high-tech tools to detect mental illness earlier, before breakdowns or emergencies occur, could mean calmer childhoods and more productive adult lives.
“Forty percent of adult mental health issues start in childhood,” Pestian says. “With early identification, we can get treatment and work on stopping this from passing on to adulthood.”
Emotional disturbances such as anxiety and depression often go undetected in children. Symptoms are passed off as “going through a phase,” children are advised to just work through the problem, or the signs are ignored altogether.
“A child with anxiety disorder, obsessive-compulsive disorder, or depression may have that condition for literally years and years without anyone ever really addressing it,” Dr. Sorter says. “As a result of that delay, so many complications can happen.” Such children don’t do as well in school. They miss out on opportunities. They begin to construct a negative narrative about themselves. That can lead to isolation, lashing out, difficult relationships with family, even crime and substance abuse.
The most severe cases can end in suicide, and that’s where the work at Children’s began more than 20 years ago. The medical center possesses what is believed to be the largest collection of suicide notes. Pestian and his colleagues annotated and digitized thousands of them, identifying words and emotions that can serve as signals of distress. Some common characteristics they found were not guilt, sorrow, or anger, but instructions: “Don’t forget the oil in the car needs to be changed.” “Please give my Pokemon cards to my brother.” They also found words of hopelessness: “My girlfriend left me and I can’t do this anymore.” Or “They won’t stop bullying me at school.”
They also used audio recordings from patient visits to the emergency department to analyze linguistic, acoustic, and other subtle information.
Mental illness often leaves signs; the trick is to see them. These “thought markers” might identify suicidal ideation much the same way a test for high blood sugar can detect diabetes. When someone is tormented enough to contemplate killing themselves, it causes a psychic disturbance, a “perturbation,” as Pestian calls it. “When that perturbation occurs, it begins to influence the thought process,” he says.
Those potentially deadly thoughts can be detected through changes in how a patient speaks, and Pestian and his team have mapped out a language of suicide. The speech of suicidal children treated in the emergency department is interrupted by longer-than-normal pauses. They use the pronoun “I” more often. Their vowel sounds are shortened, making words less intelligible. They sigh more, laugh less, and express anger but not hope.
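To make the idea concrete, here is a minimal sketch of how two of those markers, pause length and first-person pronoun use, could be pulled from a transcript. The pause annotation format and the function itself are assumptions for illustration, far simpler than the lab's actual pipeline.

```python
import re

# Illustrative only: extract two of the reported speech markers from a
# transcript in which pauses have been annotated in seconds, e.g.
# "<pause 2.5>". The annotation format is an assumption, not the
# lab's real tooling.
def speech_markers(transcript: str) -> dict:
    pauses = [float(p) for p in re.findall(r"<pause ([\d.]+)>", transcript)]
    text = re.sub(r"<pause [\d.]+>", " ", transcript)   # drop annotations
    words = re.findall(r"[A-Za-z']+", text)
    first_person = sum(1 for w in words if w.lower() == "i")
    return {
        "mean_pause_sec": sum(pauses) / len(pauses) if pauses else 0.0,
        "i_rate": first_person / len(words) if words else 0.0,
    }

sample = "I <pause 2.5> I just can't <pause 3.0> do this"
print(speech_markers(sample))
```

A real system would work from audio, not an annotated transcript, and would weigh dozens of such features together rather than any one in isolation.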
Pestian’s lab used a branch of AI called natural language processing to develop algorithms that detect suicidal thoughts. Nearly 85% of the time, the AI model reached the same conclusions human experts did, making it a potentially useful tool for broadening early mental illness screening to more children.
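At its simplest, the idea behind such algorithms can be shown with a toy lexicon-based scorer that looks for the note patterns described above: hopelessness words and instruction-like phrasing. The word lists and scoring weights here are invented for illustration; the lab's actual models are trained statistically, not hand-written.

```python
# A toy illustration of scoring a note for distress signals, far
# simpler than the natural-language-processing models the lab trains.
# Both word lists and the weights are illustrative assumptions.
HOPELESSNESS = {"can't", "anymore", "hopeless", "goodbye", "never"}
INSTRUCTION_CUES = {"don't forget", "please give", "take care of"}

def risk_score(note: str) -> int:
    text = note.lower()
    score = sum(1 for w in text.split() if w.strip(".,!?") in HOPELESSNESS)
    score += sum(2 for cue in INSTRUCTION_CUES if cue in text)
    return score

print(risk_score("Please give my cards to my brother. I can't do this anymore."))  # → 4
```

The "85% agreement with experts" figure comes from comparing a trained model's judgments against clinicians' judgments on the same cases, not from a fixed word list like this one.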
The vision is a computer program or application that would run in the background during a doctor’s visit or a session with a school counselor and alert the adults to the potential for mental illness. Further screening, care, and treatment, if necessary, could follow.
“Our goal is to be able to work with families and say, ‘You know, this hasn’t happened yet, but here are some of the things that you can do to make sure your teenager doesn’t have depression or anxiety or other issues,’” Children’s CEO Steve Davis told a public gathering recently. The tool could also serve as a sort of red alert, indicating a child needs to be treated immediately.
“Most of these disorders in children are very treatable,” Dr. Sorter says. “If kids get into treatment, most of them get better. The tragedy is about half of the kids who have these disorders don't really get any kind of treatment at all, or minimal. That's why something that can help us detect it is so critically important for the future.”
As high-powered and advanced as these tools are, the experts note that they are not a substitute for the care, intuition, and expertise of a real live human. “These are aids to help humans be better at what they do,” Dr. Sorter says. And only a person can deliver the attention, the care, and the human touch that can be healing to a child in distress.
The Amplifying Youth Voices series is made possible with support from Interact for Health. To learn more about Interact for Health's commitment to mental health and well-being, please visit here.