
What Role Could Artificial Intelligence Play in Mental Healthcare?

In just over a year, nearly every part of the medical industry has been ushered into a new era of care delivery.

Upended by a crisis, the healthcare sector has had to quickly find new ways of safely providing quality care to patients. For many, the solution was to go digital – typically in the form of telehealth services.

In the mental health field, this has become particularly prevalent. 

“The onset of COVID-19 led to a dramatic increase in the use of telehealth,” said Zac Imel, PhD, professor and director of clinical training in the department of educational psychology at the University of Utah.

“Mental health is one area of healthcare that can be delivered via telehealth without losing its essence. There is something that’s lost, but there’s a lot of it that you can do. And there’s been a significant shift. I’ve seen students in my program go from doing no telehealth to almost 95 percent of them doing telehealth.” 

Recent research has reflected the virtual trend as well: A RAND study found that the significant rise in telehealth use during the height of the pandemic was driven more by people looking for mental health services than care for physical conditions. 

The rise of digital mental healthcare has also spurred the use of a technology that has remained somewhat elusive in the medical space: artificial intelligence.

For years, the industry has seen tools like chatbots and virtual assistants as a viable way of wading into the waters of AI. With the onset of COVID-19 – and all the stressors that came with it – organizations have turned to AI to potentially broaden access to and availability of mental health services. 

At The Trevor Project, a suicide prevention and crisis intervention organization for lesbian, gay, bisexual, transgender, queer & questioning (LGBTQ) young people, leaders recognized the need for wider availability of digital mental health services during the pandemic.

“Our research shows an estimated 1.8 million LGBTQ youth between the ages of 13 and 24 in the US seriously consider suicide each year, and at least one LGBTQ youth between these ages attempts suicide every 45 seconds,” said Kendra Gaunt, data and AI product manager at The Trevor Project.

“So, when we’re thinking about what we can do to meet this need, and how we can equip our highly skilled counselors, it’s through the lens of connecting with every LGBTQ youth that needs our support. We’ve also seen a shift as a result of the pandemic. At times our volume has doubled what it was pre-COVID.” 

Kendra Gaunt, The Trevor Project

To get ahead of this issue, The Trevor Project recently partnered with Google.org to launch the Crisis Contact Simulator, a first-of-its-kind counselor training tool powered by AI.

The model simulates digital conversations with LGBTQ youths in crisis and allows aspiring counselors to experience realistic practice conversations before taking live ones. The platform will enable staff to train even more volunteers and make regular training updates.

“While we’ve definitely noticed that this method of instructor-led role plays is effective, we also saw a really unique opportunity here to leverage AI to increase our number of trained counselors,” Gaunt said. 

“The technology can also improve the flexibility and quality of our training process. About 68 percent of our digital volunteer counselors do shifts on nights and weekends, so now they can be trained on nights and weekends as well.”
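
The article does not detail how the simulator works under the hood, but the workflow Gaunt describes – a model that plays a youth persona in text-based role plays, available whenever trainees are – can be pictured as a simple dialogue loop. Here is a minimal sketch in Python; the persona prompt, the canned-reply stand-in for a generative model, and every name in it are assumptions made for illustration, not The Trevor Project's actual implementation.

```python
# Hypothetical sketch of a role-play training loop in the spirit of
# the Crisis Contact Simulator. The persona prompt, the canned-reply
# stand-in for a generative dialogue model, and all names here are
# illustrative assumptions, not The Trevor Project's implementation.
import random

PERSONA_PROMPT = (
    "You are role-playing a teenager reaching out to a crisis line "
    "by text. Stay in character and respond briefly."
)

# A real system would call a trained generative model; canned lines
# keep this sketch self-contained and runnable.
CANNED_REPLIES = [
    "idk, things have just been really hard lately",
    "my parents don't know and i'm scared to tell them",
    "i guess school has been a lot... i can't keep up",
]

def generate_reply(history: list[str]) -> str:
    """Stand-in for a dialogue-model call conditioned on history."""
    return random.choice(CANNED_REPLIES)

def training_session() -> list[str]:
    """Alternate trainee messages with simulated youth replies."""
    history = [PERSONA_PROMPT]
    print("Simulated contact connected. Type 'end' to finish.")
    while True:
        trainee_msg = input("Trainee: ").strip()
        if trainee_msg.lower() == "end":
            return history  # transcript could be reviewed by an instructor
        history.append(f"Counselor: {trainee_msg}")
        reply = generate_reply(history)
        history.append(f"Youth: {reply}")
        print(f"Simulated youth: {reply}")

if __name__ == "__main__":
    training_session()
```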

In the research sector, organizations are also exploring the use of AI in mental healthcare. A study recently published in JMIR examined the utility of an AI-powered chatbot called Woebot, a digital mental health solution designed to treat substance use disorders. 

Substance use disorders and mental illnesses often go hand-in-hand – and recent circumstances seem to have only exacerbated both conditions. Research from NYU School of Global Public Health showed that people with anxiety and depression are more likely to report an increase in drinking during the pandemic than those without mental health issues.

“There might be the potential for more problematic substance abuse because people are isolated and stressed, whether over health concerns, financial concerns, political concerns, or social concerns,” said Judith Prochaska, PhD, MPH, a licensed clinical psychologist and professor in the department of medicine at Stanford University.

“A lot went on in 2020, and that was compounded with isolation and reduced access to in-person counseling or 12-step programs.”

In the JMIR study, Prochaska and her team demonstrated that Woebot was associated with significant improvements in substance use, confidence, cravings, depression, and anxiety. The findings indicate that chatbots like Woebot could potentially reduce the burden of substance use disorders.

“Woebot will reach out to people sometimes and say ‘hey, let’s connect,’ or people can go in and use it,” Prochaska said.

“There are psycho-education lessons that are cognitively based. There are check-ins where it asks individuals about their mood or anxiety, as well as tools to use in the moment to manage anxiety and cravings.”  
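
Prochaska's description – proactive check-ins, CBT-based lessons, and in-the-moment tools – suggests a straightforward dialogue structure. A minimal sketch follows; the 0-10 scales, the lesson and tool text, and the routing threshold are assumptions for illustration, not Woebot's actual content or logic.

```python
# Illustrative check-in flow echoing Prochaska's description:
# proactive check-ins, CBT psycho-education, in-the-moment tools.
# Scales, text, and thresholds are assumptions, not Woebot's design.
from datetime import date

CBT_LESSON = (
    "Lesson: cravings are time-limited. Notice the urge, name the "
    "thought behind it, and let it pass rather than acting on it."
)
GROUNDING_TOOL = (
    "Tool: try 5-4-3-2-1 grounding. Name 5 things you see, 4 you can "
    "touch, 3 you hear, 2 you smell, and 1 you taste."
)

def check_in(log: dict) -> None:
    """Ask about mood and cravings, then route to a tool or a lesson."""
    mood = int(input("On a scale of 0-10, how is your mood today? "))
    craving = int(input("On a scale of 0-10, how strong are your cravings? "))
    log[date.today().isoformat()] = {"mood": mood, "craving": craving}
    if craving >= 6:
        print(GROUNDING_TOOL)  # in-the-moment support for a strong urge
    else:
        print(CBT_LESSON)      # routine psycho-education
    print("Thanks for checking in. I'll reach out again tomorrow.")

if __name__ == "__main__":
    check_in({})
```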

AI as a means to personalize care, increase access, and provide support

If there is one chief benefit of using AI in clinical care, it’s the technology’s ability to obtain insights from massive amounts of data. 

This advantage still stands when AI is applied to the mental health space, noted John Torous, MD, director of the digital psychiatry division at Beth Israel Deaconess Medical Center.  

“In mental health, we’ve had several innovations,” he said.

“First, we had the genetic revolution where we had all this new genetic data and it unlocked the key to understanding mental health. We’ve also had neuroimaging to help us understand how the brain works, as well as smartphone and sensor data. Intuitively, we know that these tools contain important information that we could use to personalize care for patients, but it’s been very hard to unlock that data for clinical insights on a routine basis.”

John Torous, MD

AI systems could help providers sift through these data resources and identify clinically actionable targets to improve patient care.

“By doing that, we may be able to offer more personalized and preventive care. And hopefully, we can approach these mental illnesses in a more targeted way,” said Torous.

For Prochaska and her team, delivering care that is personalized to the user was top of mind when developing their AI-powered chatbot.

“Woebot is based on cognitive behavioral therapy principles. It has an empathy component that’s tailored to the messages that the individual sends,” said Prochaska. 

“It’s designed to target cravings and urges and to help the individual build self-awareness in terms of their patterns of thinking, mood-related thinking, anxiety, depression, as well as the urge and craving to use.”

Another advantage of AI-driven tools? Increased access. 

“Chatbots can interact with an individual in real time. They’re available 24/7, at no cost, and they reduce stigma in terms of accessing treatment. Whether these tools are used as stand-alone treatment agents or as an adjunct to more traditional counseling, chatbots provide added therapeutic content,” Prochaska said.

Advanced analytics technologies also have the potential to remove access challenges for marginalized groups, Gaunt pointed out.

“AI can increase equity and access to mental health services, especially for LGBTQ youth who exist at an intersection of identities – because we know that LGBTQ youth already face injustices and discrimination in their everyday lives. This is only compounded when you layer on race, ethnicity, socioeconomic status, gender identity, and so on,” she said.

“When I think about how AI can evolve to support people who represent a variety of lived experiences, I think about eliminating barriers to convenience, or access, or privacy. So, making services available 24/7 on demand in different platforms, and also giving people the space to have sensitive conversations – even more so for ones that they might not feel comfortable having out loud.”

While the use of AI comes with several potential benefits, it will be critical to maintain care approaches that are focused on the patient-provider relationship. 

“There will always be a need for human-to-human connection. AI’s role in this space shouldn’t be to replace humans, it should be to support them,” said Gaunt. 

“At Trevor, we are not setting out to design an AI system that will take the place of a counselor, or that will directly interact with a person who might be in crisis. Instead, we’re designing AI to work in partnership with counselors.”

According to Imel, this is the most feasible application of AI in the mental healthcare field.

“The role of AI in mental healthcare that I’m most excited about, and that has the most potential to make an impact now, is supporting human therapists,” he said. 

“Right now, therapists are mostly on their own in the room with the clients and even afterwards. After you’re trained as a therapist, you don’t get supervised much anymore in most places. You’re using your judgment and training to try and make sure you’re doing the right things. There are AI tools now that can listen in on these sessions and give therapists some pretty useful feedback – even things like how much you talked and the types of basic interventions you used.”
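
Imel doesn't name a specific product, but the kind of feedback he mentions – the therapist's share of talk time, counts of basic intervention types – can be computed from a diarized session transcript. A rough sketch follows; the speaker labels and the keyword-based intervention cues are assumptions for illustration, not any vendor's method.

```python
# Rough sketch of post-session feedback from a diarized transcript,
# in the spirit of Imel's examples (talk-time share, basic
# intervention counts). Speaker labels and the keyword tagger are
# illustrative assumptions, not any specific product's method.
from collections import Counter

# Very crude keyword cues for a few basic intervention types.
INTERVENTION_CUES = {
    "open question": ("how ", "what ", "tell me"),
    "reflection": ("it sounds like", "you feel", "you're saying"),
}

def session_feedback(transcript: list[tuple[str, str]]) -> dict:
    """transcript is a list of (speaker, utterance) pairs."""
    words = Counter()
    interventions = Counter()
    for speaker, utterance in transcript:
        words[speaker] += len(utterance.split())
        if speaker == "therapist":
            text = utterance.lower()
            for label, cues in INTERVENTION_CUES.items():
                if any(cue in text for cue in cues):
                    interventions[label] += 1
    total = sum(words.values()) or 1  # guard against an empty transcript
    return {
        "therapist_talk_share": words["therapist"] / total,
        "intervention_counts": dict(interventions),
    }

print(session_feedback([
    ("therapist", "How have things been since last week?"),
    ("client", "Honestly, pretty rough. Work has been overwhelming."),
    ("therapist", "It sounds like you feel stretched thin."),
]))
```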

Ensuring AI helps, not harms

For all of the good AI could do, there are a number of significant barriers to using the technology in mental healthcare. 

With smartphone apps and chatbots, patient engagement is a key factor in determining the success of the technology. 

“One challenge for mobile health applications is making sure people come back and interact with it on a regular basis,” said Prochaska. 

“Typically, the more the individual uses it, the greater the benefit they’ll get from it. Anything you can do to boost engagement should help with your outcomes in terms of accuracy and effectiveness.”

Judith Prochaska, PhD, MPH

Additionally, developers and researchers need to make sure they equip these tools with appropriate protections for high-risk patients.

“Woebot does have some safety features built into it, like language detection and rules for risk management. It does state explicitly that it is not a suicide prevention tool,” said Prochaska.  

“And then for our study, because it was our first evaluation of the tool, we were thoughtful in terms of our inclusion and exclusion criteria. For example, if we felt that somebody was at risk for overdose, we didn’t include them in this first evaluation.”
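
Prochaska mentions “language detection and rules for risk management” without elaborating. One common pattern for that kind of safety layer is a rule-based screen that checks each incoming message for crisis language and short-circuits to a safety response before the normal dialogue flow runs. A minimal sketch, where the patterns and the escalation text are assumptions for illustration, not Woebot's actual rules:

```python
# Minimal sketch of a rule-based risk screen of the kind Prochaska
# alludes to ("language detection and rules for risk management").
# The phrase patterns and escalation text are illustrative
# assumptions, not Woebot's actual rules.
import re

# Crude patterns; a production system would use far more than this.
RISK_PATTERNS = [
    re.compile(r"\b(kill|hurt|harm)\w*\s+(myself|me)\b", re.I),
    re.compile(r"\bsuicid\w*\b", re.I),
    re.compile(r"\bend (it all|my life)\b", re.I),
]

ESCALATION_MESSAGE = (
    "I'm not able to help with a crisis. If you are in danger or "
    "thinking about suicide, please contact a crisis line such as "
    "the 988 Suicide & Crisis Lifeline (call or text 988 in the US)."
)

def screen_message(message: str) -> str | None:
    """Return an escalation message if crisis language is detected."""
    if any(p.search(message) for p in RISK_PATTERNS):
        return ESCALATION_MESSAGE
    return None  # safe to hand off to the normal dialogue flow

assert screen_message("I want to hurt myself") is not None
assert screen_message("cravings were bad today") is None
```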

The data used to train AI models is a crucial aspect of their clinical utility – and data quality can be particularly challenging when it comes to mental healthcare.

“AI is only as good as the data it’s trained on and the people that are using it. Each of those are important considerations,” said Torous.

“In terms of the data it’s trained on, we have to wonder what the gold standard is that we’re using. It’s especially hard to have a gold standard in mental health because we know that the current clinical definitions we use are not ideal. We have to take a very careful approach to how we’re training these algorithms, and it’s not going to be as simple as scoring people’s symptoms.” 

Torous also cited the importance of partnerships between industry and academia to develop AI systems that will effectively address the complexity of mental illnesses.

“If we’re just solving very easy problems, it doesn’t advance care. If we have an algorithm that can tell whether someone has schizophrenia or not, it’s useful, but it’s not really going to advance the field. That’s a very small improvement,” he said. 

“We also need to have diverse samples on which to train these models, because if we only take one region, one clinic, or one population, these algorithms are going to have very limited utility. These tools have to be built from the ground up with a very diverse approach – you have to work with the patients, as well as consider input from clinicians, data scientists, and regulators.”

Imel stressed the significance of collaborative development as well. 

“We have to think about how to build partnerships with patients, stakeholders, and therapists around the data that’s going to be necessary to build these tools,” he stated.

“We have to start thinking about how to get buy-in from patients as well as providers. The chat platform that a patient is using is capturing their data, and the people behind that platform are trying to use that data to improve care. Similarly, providers are often not super excited about being recorded, so how do we get buy-in from them? Because if we don’t, it’s not going to work.”

Where AI and mental health will meet in the future

While the future application of AI in any area of healthcare is still unclear, this reality may be especially true in the mental health space.

The use of AI in therapy depends on several factors, and even if the industry is able to overcome some major challenges, the technology isn’t likely to take over front-facing mental healthcare delivery.

Zac Imel, PhD

“In the future, AI may be helping us in research – sorting through data to find new patterns that may help us understand how mental illnesses develop, how they spread, and how we can prevent them,” Torous said.

In order for AI to take on a more central role, investigators will need to refine research and analysis in this area, Torous added.

“We know that people are interested in chatbots, and we know that patients are willing to try them. But in 2021, we really want to be answering questions like ‘How well does it work?’ Or, ‘Can we conduct higher quality studies?’ We’ve seen the interest, and hopefully that catalyzes more development,” he said.

Prochaska noted that she and her team plan to further test the Woebot tool to evaluate its effectiveness. 

“This is the first study in a series of three that we have planned. We see this as building a program of research and building the evidence base for the program, and then we can make improvements in the program as we go,” she said.

One of the most important things to remember is that the development of these tools should be an ongoing undertaking – and not a one-and-done endeavor, Gaunt stated.

“The application of AI is a process that should evolve as we do,” she concluded. 

“Just because we’re doing one thing today doesn’t mean that’s how it always can and should be done in the future. Considering how the world around us can change – not just by the day or the hour anymore, but in minutes or seconds – that’s important. There’s always going to be a new model, a new technique, or knowledge that we acquire.”
