Alexa Jett has suffered some heavy blows in recent years.
Now 28, she was diagnosed with thyroid cancer in 2016. Although she was given the all-clear, in August 2019 another crisis hit when her best friend and former boyfriend died of cancer at the age of 33.
“I completely shut down. I started wondering whether I was next,” Ms Jett says.
She was unable to get out of bed, and household chores piled up, leaving the house in a mess.
In desperation, she sought help online, turning to a mental health chatbot called Vivibot.
“Hey, why don’t we make a goal?” the chatbot texted her on 10 September.
Her first goal was simply to paint her toenails, but this small task, combined with the chatbot’s “funny and friendly personality” and round-the-clock availability, encouraged Ms Jett to take on progressively more.
“She pulled me out of that really dark place and I started functioning again,” says Ms Jett.
Vivibot is offered through GRYT, an app-based social community for people affected by cancer.
Dozens of similar services are available, chatting with users about their mental health and offering mood reports and tips on how to improve their mental and emotional state.
“These chatbots are a great first step for people who may be experiencing sad or depressed mood or anxiety to reclaim their mental health,” says Danielle Ramo, director of research at Hopelab, which designed Vivibot.
She is quick to add that chatbots cannot treat clinical depression or clinical anxiety, and are not designed to replace human interaction of any sort.
However, clinical psychologist Noel Hunter says that some chatbots are not marketed that way and instead present themselves as a solution for mental health problems.
“They’re very careful to not explicitly say that because then they get sued. But people do get that message,” says Dr Hunter.
For Dr Hunter, chatbots reinforce the idea that we are at fault for our own suffering.
“They make you believe that, if you just look on your phone and do a couple of self-help kind of things, that’s going to take the place of the healing nature of a healthy relationship,” she says.
Bots, of course, cannot pick up on the non-verbal cues that reveal so much about how we feel.
“A big chunk of this non-verbal communication, imperative to our overall well-being and to our being fulfilled, is lost in settings that are not human-to-human,” Dr Hunter says.
However, there’s a growing interest in using tech to improve the world’s mental health. The World Health Organisation says one in four of us suffers some kind of mental health problem and other research suggests that people are more honest with robots than with fellow humans.
Even social media giants like Facebook are entering the arena of digital mental health.
In October 2019, Facebook and Messenger launched the Let’s Talk Stories filter and a series of Let’s Talk Messenger stickers – tools prompting people to begin conversations leading to support.
“We found that private messaging can make it easier to talk about emotional or serious subjects. In fact, 80% of people who use messaging apps feel they can be completely honest when messaging versus in person,” says Antigone Davis, head of global safety at Facebook.
Looking ahead, there could come a time when artificial intelligence (AI) might be advanced enough to have a deep understanding of human mental health.
“We might have human-level AI in 2029,” says Peter Diamandis, physician, engineer, and founder and chairman of the XPRIZE foundation.
We are only in the nascent days of AI, particularly in the medical arena, he says.
“The amount of data that’s now collected by medical exams, whether it’s looking at a brain MRI or your genome and the results from various tests, all these things are way beyond the ability of a single human,” Mr Diamandis says.
“It will actually be malpractice not to use AI in diagnosis in the next 20 years, possibly 10 years,” he continues.
Not everyone agrees that AI will advance that fast. And the question remains, will humans relate to AI in the same way as they do to a human therapist?
Ms Jett certainly thinks so. She points out that her generation has grown up with digital technology – to her it is an extension of her existence.
But Dr Hunter sees only a techno-bubble that, once it bursts, will send people back to more traditional ways of healing, whether human-to-human therapy or spirituality.
“Belonging to certain kinds of groups of peers, something involving relationships,” she says.
Mr Diamandis sees a balance, tilted towards a heavy involvement of AI in our lives.
“I would imagine that a human therapist using AI is going to be much more powerful than a human therapist on their own,” he says, adding that in almost every single area where an AI and a human coexist to evaluate and treat patients, success rates are better.
Mr Diamandis urges us to think of the Iron Man movie to understand how AI will transform our mental health.
In the movie, genius superhero Tony Stark (Iron Man) has a digital helper Jarvis, who arranges his meetings, answers the door, and even organises his playlists.
“I think we’re all going to have a version of Jarvis in the next decade,” Mr Diamandis says.
“A Jarvis who will do our administrative tasks, such as read our emails or answer our phone calls; a Jarvis that will sense a depressed mood in our house and reverse it by playing our favourite movie, or a song it knows uplifts our spirits; a Jarvis that will study us 24/7 and learn us in many ways we don’t even know ourselves.”