AI therapist unavailable: Mental health apps for when you need support

Generative artificial intelligence has captivated users with its ability to answer online queries convincingly. But could this emerging technology also aid, or even replace, human therapists in treating mental illness?

It is estimated that one in five adults in the US struggles with mental health issues, and approximately one in 20 has a serious mental illness. However, due to a shortage of mental health professionals, long waiting lists, and high costs, many individuals are unable to receive the care they need.

Some Americans have begun experimenting with ChatGPT as an unofficial therapist. The technology can offer a “listening service” of sorts, and it could feed the growing market for mental health apps.

However, mental health practitioners have warned about the potential dangers of unsupervised use of generative AI. It could inadvertently reinforce delusions or justify low self-esteem, leading to harm rather than healing.

Mental health apps are already attracting significant investment: mental health tech groups have raised nearly $8 billion in capital since the beginning of 2020.

[Charts: fundraising by mental health start-ups, and survey responses on why people do not get treatment for mental health issues]

This category includes popular meditation apps like Calm and Headspace, known for their relaxation and mindfulness tools. While these apps can provide mental health benefits, they should not be seen as a substitute for therapy.

Telehealth companies such as Talkspace and Amwell have tried to bridge the gap between users and therapists by offering online therapy sessions. However, they have been criticized for lacking enough qualified professionals to meet demand, and both have suffered steep declines in market value since going public in 2020.

Many existing mental health apps already incorporate AI to some extent. One example is Woebot, a chatbot that aims to deliver cognitive behavioral therapy through daily conversations. Most of Woebot’s conversations have been pre-written by trained clinicians.

Advocates of generative AI chatbots argue that they could engage users in dynamic conversations indistinguishable from human dialogue. The technology is not yet capable of that level of sophistication, however.

Furthermore, it remains uncertain whether existing mental health apps effectively help users, and unsupervised use of generative AI could do more harm than good. Anyone concerned about their mental stability should avoid ad hoc experimentation.

Investors also have a responsibility to exercise care. They should invest only in apps that are overseen by responsible physicians and that are seeking regulatory approval as healthcare devices. The fundamental medical principle of “do no harm” applies.

Letters in response to this article:

A mental health app is no match for human diagnosis / From Tim Barker, Chief Executive Officer, Kooth Digital Health, London W2, UK

Investing in wellness apps carries responsibilities too / From Hilary Sutcliffe, Director, SocietyInside, London SE21, UK
