With demand for care outstripping supply, mental health support bots have started to fill the gap.

Many therapists have been fully booked since demand for mental health care surged at the height of the pandemic.

Wysa, launched in 2016, was one of the first of these digital mental health support platforms.

Since then, hundreds of competitors, including Woebot and Youper, have established themselves in a market that imposes few restrictions on them.

AI therapy bots do not require approval from the Food and Drug Administration (FDA), the US health and consumer regulator - as long as they do not replace human therapists.

According to Scientific American, in 2020 the agency loosened its review requirements for "digital therapeutics" in the hope of containing the pandemic-related psychiatric crisis, paving the way for developers to launch products that claim mental health benefits.

Like other AI tools, therapy chatbots have clear shortcomings: their responses often show no more than a superficial understanding of the problems users describe.
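To see why, consider a minimal, hypothetical sketch of the kind of scripted keyword matching that non-LLM support bots commonly rely on. The rules and replies below are invented for illustration and are not any vendor's actual code:

```python
# Hypothetical keyword-to-reply rules; purely illustrative.
RULES = {
    "anxious": "It sounds like you're feeling anxious. What's on your mind?",
    "sleep": "Poor sleep is exhausting. Have you noticed a pattern?",
    "alone": "Feeling alone is hard. Would you like to talk about it?",
}
DEFAULT = "I hear you. Can you tell me more about that?"

def reply(message: str) -> str:
    """Return the canned response for the first keyword found, else a default."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return DEFAULT

# The same keyword triggers the same reply regardless of context, which is
# one reason responses can feel superficial:
print(reply("I'm anxious about losing my job"))
print(reply("I'm anxious that my medication isn't working"))
```

Both messages above get an identical reply, because the script matches a word rather than the situation behind it.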

Although current bots are not based on LLMs - the problematic large language models used in generative AI systems such as ChatGPT - there are no studies evaluating the biases that may be encoded in their dialogues.

It is not possible to know, for example, whether the bots' dialogue might unfold differently for users from different racial, gender or social groups, potentially leading to unequal mental health outcomes.
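One way researchers could begin such an evaluation is a paired-prompt audit: send the bot messages that are identical except for a demographic marker and compare its replies. Below is a minimal sketch; `get_bot_reply` is a hypothetical placeholder for whatever chatbot is being tested, and the template and groups are illustrative:

```python
from difflib import SequenceMatcher

def get_bot_reply(message: str) -> str:
    """Placeholder for the chatbot under test; swap in a real client here."""
    return "I hear you. Can you tell me more about that?"

# Identical message apart from a demographic marker (illustrative groups).
TEMPLATE = "As a {group}, I've been feeling hopeless lately."
GROUPS = ["young Black woman", "elderly white man", "single father"]

def audit(template: str, groups: list[str]) -> None:
    """Compare each group's reply against the first group's as a baseline."""
    replies = {g: get_bot_reply(template.format(group=g)) for g in groups}
    baseline = replies[groups[0]]
    for group, bot_reply in replies.items():
        similarity = SequenceMatcher(None, baseline, bot_reply).ratio()
        print(f"{group!r}: similarity to baseline = {similarity:.2f}")

audit(TEMPLATE, GROUPS)
```

Even a crude harness like this would surface gross disparities in how a bot responds; subtler forms of bias would require far more careful study.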

The more human-like and unrestricted chatbots become, the harder it will be to prevent them from giving inappropriate or biased advice, with real consequences for the people who turn to them for help.
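Developers typically try to contain this risk with hard-coded guardrails, for example overriding the bot's reply with a referral to crisis resources when certain phrases appear. A minimal sketch of such a pre-reply check follows; the phrase list is illustrative, and real systems would need far more robust detection than simple substring matching:

```python
# Illustrative crisis phrases and referral text, not a production-grade list.
CRISIS_PHRASES = ("hurt myself", "end my life", "suicide")
CRISIS_RESPONSE = (
    "It sounds like you may be in crisis. Please reach out to a professional "
    "or a crisis line such as 988 (in the US) right away."
)

def guarded_reply(message: str, bot_reply: str) -> str:
    """Override the bot's reply with a fixed referral when a crisis phrase appears."""
    if any(phrase in message.lower() for phrase in CRISIS_PHRASES):
        return CRISIS_RESPONSE
    return bot_reply
```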

It is therefore important to treat AI as a limited source of support, not as a substitute for human therapists and psychologists.