In psychology, empathy is crucial because it fosters understanding, compassion, and strong relationships. It allows individuals to connect emotionally and create a more supportive and harmonious society. Alfred Adler, founder of the School of Individual Psychology, once described empathy as “Seeing with the eyes of another, listening with the ears of another, and feeling with the heart of another.”
Today, artificial intelligence (AI) chatbots are being explored by psychologists to make therapy more accessible for patients, improve interventions, and aid in training new clinicians.
However, despite AI’s potential, there is cause for concern. In tests, chatbots have spread misinformation, professed their inner desires, and even sexually harassed patients, all of which have prompted leaders in tech and science to call for a pause.
The traditional approach
The traditional approach in therapy refers to the established methods and practices used in the field of psychology and mental health treatment. This often involves face-to-face sessions, using psychological theories and techniques to address mental health issues and focus on an individual’s thoughts, feelings, and behaviours.
Traditional therapy demonstrates significant effectiveness in improving mental health. Studies consistently show that psychotherapy, including Cognitive Behavioral Therapy (CBT) and other forms of face-to-face therapy, can be highly effective in treating a wide range of mental health issues.
According to a survey by the American Psychological Association, about 50% of clients showed improved symptoms within eight sessions of face-to-face mental health treatment, and 75% showed improvement within six months.
Ultimately, human connection is the foundation of the relationship between the therapist and patient. When a patient feels connected to their therapist, they are likely to engage in therapy and see improvements in their mental health.
If traditional therapy works, then what is the issue?
Although traditional therapy works, there is a shortage of mental health practitioners around the world. In February 2024, the US Health Resources and Services Administration estimated that 122 million Americans lived in areas with a shortage of mental healthcare providers. It is estimated that the country needs about 6,000 clinicians to cover the gap.
Therapists’ work and patient load have risen in response to increased demand. According to the American Psychological Association, the percentage of therapists reporting that they were working overtime grew from 31% in 2020 to 38% in 2022. Amid this increasing workload, more psychologists fail to meet the treatment demands of their patients.
Across the pond, in 2022, the NHS reported a shortage of 2,000 qualified therapists in the UK. A general practitioner in the UK stated in response to a British Medical Association survey: “Mental healthcare in this country is dysfunctional. It’s broken.”
Introducing AI chatbots in therapy
In response to the shortage of practitioners, many therapists and patients are resorting to AI chatbots for therapy and mental health support.
AI chatbots are large language models (LLMs) that can provide mental health support through automated conversations and therapeutic exercises. Apps like Woebot, Youper, and Character.ai had over a million downloads in 2024. These chatbots have been used to support people dealing with mild depression, loneliness, anxiety, and other mental health issues. When people come to them with a problem, these bots respond in ways a therapist might: they ask questions and suggest coping mechanisms.
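To make that interaction pattern concrete, here is a deliberately simplified sketch of the "reflect the feeling, then suggest a coping exercise" loop. This is a toy rule-based responder, not how Woebot or any real product actually works (those rely on LLMs and clinically reviewed content); every keyword and response below is a hypothetical illustration.

```python
# Toy sketch of a supportive-chatbot response loop (hypothetical example;
# real mental health apps use LLMs and clinically reviewed material).

RESPONSES = {
    "anxious": ("It sounds like you're feeling anxious.",
                "A slow breathing exercise can sometimes help: inhale for "
                "four seconds, exhale for six."),
    "lonely": ("Feeling lonely can be really hard.",
               "Reaching out to one person today, even briefly, is a small "
               "step some people find useful."),
    "sad": ("I'm sorry you're feeling down.",
            "Writing down one thing that went okay today is a common "
            "journaling exercise."),
}

# Fallback: acknowledge the message and ask an open question, as a
# therapist might.
DEFAULT = ("Thank you for sharing that.",
           "Could you tell me a bit more about what's on your mind?")

def respond(message: str) -> str:
    """Reflect the user's stated feeling and suggest a coping exercise."""
    text = message.lower()
    for keyword, (reflection, suggestion) in RESPONSES.items():
        if keyword in text:
            return f"{reflection} {suggestion}"
    reflection, follow_up = DEFAULT
    return f"{reflection} {follow_up}"

print(respond("I've been feeling anxious about work lately"))
```

The point of the sketch is the conversational shape, acknowledgement followed by a suggestion or question, not the matching logic itself, which in real systems is handled by a language model rather than keywords.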
Is AI safe to use?
While AI chatbots might seem like a useful and cost-effective way of addressing mental health issues, there is one hurdle they will likely never overcome: a chatbot will never possess human emotions, no matter how convincingly it mimics them.
Emotions in humans are complex phenomena, deeply intertwined with our sensory and motor systems, influencing our decisions and behaviors. In contrast, AI systems lack an intrinsic emotion module, which fundamentally differentiates AI from human intelligence.
Also, AI is dependent on the context and pre-existing data to which it has access, meaning that any biases existing in that content will manifest in the AI's responses. As a result, integrating AI chatbots raises the possibility of racist, sexist, ageist, and otherwise biased or inappropriate responses finding their way into conversations.
Does AI have a future in therapy?
Ultimately, while AI offers human-like responses, chatbots will never understand and express human emotion. This poses a substantial challenge when applied in psychology and therapy, which are built on human interaction, trust, emotional intelligence, and a sense of mutual understanding. Practitioners and educators will also need to weigh the accuracy and reliability of AI, as well as patient privacy and data security, before bringing it into psychological care.