Introduction

Suicide remains an obstinate public health issue, despite significant advances in the diagnosis and treatment of mental disorders. One area of interest is the development and expansion of suicide screening technologies that access and evaluate social media data. Studies have shown that youth are the group most likely to reveal suicidal thoughts and suicide risk factors online and on social media. Read on to learn how machine learning can help mitigate suicide risk.

How Can Machine Learning Help?

A machine learning (ML) tool can use Electronic Health Record (EHR) data to evaluate suicide attempt risk and help health care professionals decide which patients to screen in nonpsychiatric clinical settings, a study published in JAMA Network Open revealed. Researchers have noted that suicide is on the rise worldwide, claiming around 14 in every 100,000 lives each year.
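To make the idea concrete, here is a minimal, purely illustrative sketch of how an EHR-based tool might turn a few routine record fields into a single risk score. The feature names, weights, and logistic form are assumptions made up for this example; they do not reproduce the actual published model.

```python
import math

# Hypothetical feature weights -- illustrative only, not the real model.
WEIGHTS = {
    "prior_ed_visits": 0.8,   # count of prior emergency department visits
    "psychiatric_dx": 1.2,    # 1 if a psychiatric diagnosis is on record
    "med_count": 0.1,         # number of active medications
}
BIAS = -4.0

def risk_score(record):
    """Logistic-regression-style score in (0, 1) from EHR features.

    Missing features default to 0, so partial records still score.
    """
    z = BIAS + sum(w * record.get(f, 0.0) for f, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

patient = {"prior_ed_visits": 2, "psychiatric_dx": 1, "med_count": 5}
print(round(risk_score(patient), 3))  # prints 0.332
```

A score like this would then feed the downstream step the article describes: deciding which patients to flag for further screening.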

Researchers from Vanderbilt University Medical Center (VUMC) have created an ML algorithm that uses EHR data to predict suicide attempt risk. The model recently underwent a prospective trial at the institution.

Termed the Vanderbilt Suicide Attempt and Ideation Likelihood (VSAIL) model, the algorithm uses routine EHR data to calculate a 30-day risk of return visits for suicide attempt and, by extension, suicidal ideation. Adult patients were categorized into eight tiers based on their risk scores. The researchers found that the top tier alone accounted for more than one-third of all suicide attempts documented in the study and about half of all cases of suicidal ideation.
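The tiering step above can be sketched as a simple equal-count (quantile) bucketing of risk scores. This is an assumption for illustration; the article does not specify how VSAIL draws its tier boundaries, and the function below is hypothetical.

```python
def assign_tiers(scores, n_tiers=8):
    """Rank patients by risk score and split them into equal-size tiers.

    Tier 1 = lowest risk; tier `n_tiers` = highest risk. Ties are broken
    by input order, which is arbitrary but deterministic.
    """
    # Indices of patients, ordered from lowest to highest score.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    tiers = [0] * len(scores)
    for rank, idx in enumerate(order):
        # Map each rank into one of n_tiers equal-count buckets.
        tiers[idx] = rank * n_tiers // len(scores) + 1
    return tiers

scores = [0.02, 0.91, 0.15, 0.40, 0.77, 0.05, 0.63, 0.28]
print(assign_tiers(scores))  # prints [1, 8, 3, 5, 7, 2, 6, 4]
```

With equal-count tiers, the finding that the top tier captured over a third of attempts means the model concentrated risk well beyond a uniform spread across the eight groups.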

As documented in the EHR, one in 23 of these high-risk individuals went on to report suicidal thoughts, and one in 271 attempted suicide. “Today across the Medical Center, each patient cannot be screened for suicide risk in every meeting — nor should we,” states Colin Walsh, MD, MA, assistant professor of Biomedical Informatics, Medicine and Psychiatry.

“But we know some individuals are not screened despite factors that might put them at higher risk. This risk model is a first step toward that screening and may recommend which patients to screen further in settings where suicidality is not often discussed.”

Over the 11-month test, close to 78,000 adult patients were seen in the hospital, emergency room, and surgical clinics at VUMC. The EHR documentation revealed that 395 individuals in this group reported having suicidal thoughts and 85 survived at least one suicide attempt, with 23 surviving repeated attempts.

Dr. Walsh states: “We may be able to ask hundreds or even thousands of individuals about suicidal thinking, but we cannot question the millions of patients who visit our Medical Center every year — and not all patients need to be questioned. Our results suggest that AI can help as one step in directing limited clinical resources to where they are most required.”

Data analytics and AI tools are increasingly being used in the mental health care space. A recent study published in JAMA Psychiatry showed that a universal screening tool could harness predictive analytics algorithms to accurately determine an adolescent’s suicide risk. The algorithm could also provide clinicians with data on patients who may require follow-up interventions.

Conclusion

Unlike existing tools, the EHR model asks adolescents not only about suicidal thoughts but also about various factors that may put them at risk, including trouble concentrating, sleep disturbance, agitation, and issues with family and school connectedness. There are many reasons individuals may not share suicidal thoughts, and a model like this can only assist with, not replace, suicide risk assessment.