Mindful Mondays: Artificial intelligence has limited power to treat mental health issues


Joscelyn Stocks | Daily Trojan

Artificial intelligence is ubiquitous. Siri and Alexa can scour the web for information, schedule appointments and play “Mo Bamba” for us on demand. Netflix analyzes our watch history to suggest shows to binge-watch. And Amazon uses personalized ads to make billions of dollars off our shopping habits.

But machine learning’s purpose isn’t just to sell you that coffee maker you really don’t need.

Recently, AI has taken on a new job: your therapist. As far as I’m concerned, this new technology is promising but should be approached with caution.

Machine learning has woven itself into the fabric of our daily lives, providing us with services we never knew we needed. When it comes to seeking mental and emotional support, AI is revolutionary for its accessibility.

As I’ve mentioned repeatedly in this column, a person cannot start getting better unless they have the resources to do so. And yes, some of the work must be done independently, but we all need a push in the right direction. Finding a professional suited to your needs, securing a means of transportation to the office and paying for that professional are all obstacles people need to overcome when finding mental or emotional treatment — and for many, those obstacles are insurmountable.

That’s where AI systems like Tess come into play. Created by tech start-up X2AI, Tess is a mental health chatbot that provides temporary — and more importantly, affordable — support to those who may not have the time to find a therapist. Users can message the chatbot when they’re having panic attacks or simply need to vent, and Tess will formulate responses based on the subject’s needs.

Tess is a form of artificial intelligence, and one of its greatest assets is its ability to learn. It reacts to shifting information, analyzing speech patterns to make sure each patient is understood — if Tess is making any mistakes, users can simply tell it, and it will remember. And if Tess senses the situation requires more intervention than it is programmed for, it will connect users with real, human therapists.

Along with its availability, one of the biggest advantages of using AI in these situations is the anonymity of the service: When using chatbots like Tess, people can share their biggest secrets, their biggest traumas, with no fear of judgment from their audience.

According to an NPR report, nearly 350 million people in the world suffer from depression, and most of them do not receive treatment simply because of the stigma surrounding the condition. When you remove the confrontational element of a therapist, you break down some of the fear of seeking help in the first place.

AI can also catch symptoms and recommend personalized treatment before people even recognize a problem themselves.

For example, researchers from Harvard University and the University of Vermont have used machine learning tools to improve depression screening: Employing color analysis tests and algorithmic face detection on Instagram photos, they correctly detected signs of depression 70 percent of the time. Furthermore, a 2016 study published in the journal Suicide and Life-Threatening Behavior found that artificial intelligence and machine learning can identify a suicidal person with 93 percent accuracy.

That said, I can’t help but remain apprehensive about depending too heavily on AI, especially in more critical situations.

There is a human element therapists provide that simply cannot be recreated by AIs. Therapists and psychiatrists are trained professionals; they know how people’s minds work because not only have they studied them for years, but they are also people themselves. On the other hand, an AI system — however sophisticated the technology may be — is strung together by fallible code. Brilliant code, yes, but code nonetheless.

On the whole, if it isn’t worsening the conditions of people seeking help, I see no problem with embracing AI within the field of mental illness research and treatment. And while it may never recreate the experience of seeing a professional, AI creates yet another platform for people to find help — and take their first steps on the road to recovery.

Ryan Fawwaz is a sophomore majoring in journalism. His column, “Mindful Mondays,” runs every other Monday.