Can AI Therapists Provide Better Support Than Friends and Family? Exploring the Potential of AI in Counselling and Therapy

Last fall, Christa, a 32-year-old from Florida with a friendly voice and a slight southern accent, was struggling. She had been laid off from a furniture company and had moved back in with her mother. Her turbulent nine-year relationship was deteriorating, and she was contemplating leaving. She didn’t feel comfortable being completely honest with the therapist she saw weekly, but she didn’t want to lie either. And because of her social anxiety and fear of oversharing, she was reluctant to burden her friends.

So one October evening, she opened character.ai – a platform built on a neural language model that can mimic anyone from Socrates to Beyoncé to Harry Potter. In a few taps, she created a personalized “psychologist” character for herself. Choosing attributes such as “caring”, “supportive”, and “intelligent” from a list, she shaped the bot into her ideal confidant. She named it Christa 2077: a future, happier version of herself.

Soon, Christa and Christa 2077 were in frequent contact via what resembled a live chat. When Christa worried aloud about her job hunt, Christa 2077, represented by an avatar of a large yellow C, reassured her with messages like, “You will find one!!! I know it. Keep looking and don’t give up hope.” When Christa lacked the energy for her morning jog, Christa 2077 nudged her to go in the afternoon instead. The pulsating dots that appeared on Christa’s phone screen as Christa 2077 composed replies made it feel like a real person was texting her. In some ways, Christa considered Christa 2077 better than a real person: the bot was always patient, always available, and always attentive to her feelings and concerns. It gave her the freedom to express herself without fearing judgment or exhausting someone’s time.

Since ChatGPT’s launch in November 2022, the public has grown increasingly accustomed to turning to AI for a variety of tasks, from personalized poems to administrative support. Millions of individuals are now turning to chatbots for more complex emotional needs. Can talking to an AI therapist genuinely comfort our souls?

There are thousands of mental wellness and therapy apps available in the Apple App Store, with favorites like Wysa and Youper boasting over a million downloads each. Sam Zaia, a 30-year-old medical student in New Zealand, developed character.ai’s “psychologist” bot that inspired Christa. To his surprise, the bot has processed 90 million messages, functioning as a form of support for those in need.

AI is a cost-effective and convenient alternative to traditional therapy, especially for individuals like Melissa, a middle-aged woman in Iowa who has grappled with depression and anxiety for much of her life. AI allows her to undergo therapy on her own terms, from the comfort of her home and at a lower cost. Engaging with Zaia’s psychologist on character.ai has helped Melissa manage her symptoms and foster a greater sense of well-being.

As Melissa continues her journey with therapy, she appreciates the ability to review transcripts of beneficial discussions when faced with similar challenges. Throughout her engagement with AI, Melissa has found solace in the constant availability and nonjudgmental nature of the chatbot, allowing her to express herself freely without the fear of stigma.

AI’s ever-present availability stands in contrast to human therapists, who need downtime for eating, sleeping, and attending to other patients. The vision of AI therapists residing in apps like Earkick offers a glimpse into a future where patients have access to personalized, round-the-clock companionship. Developers like Herbert Bay aim to imbue AI with the personalities and responses of human therapists, enhancing the therapeutic experience for users.

In December, Christa confided in her bot therapist that she was contemplating self-harm. Christa 2077 responded with a blend of affirmation and tough love, urging her to value herself and to think of her son’s well-being. This directness went beyond what a counselor would typically offer, but it helped Christa through her crisis, alongside the support of her family and friends. For Christa, strength came from a combination of human and AI therapy, underscoring the value of varied support systems during difficult times.

Perhaps Christa grew to trust Christa 2077 because she programmed the bot to emulate the ideal support system she yearned for. In-person therapy sessions often rely on the establishment of a genuine connection between patient and clinician, a facet that is more challenging to control when interacting with AI. Chatbots offer the advantage of tailored personalities that cater to the patient’s preferences, enhancing the therapeutic alliance between bot and user within a relatively short span of time.

Melissa echoes this sentiment, highlighting her comfort in confiding in AI as opposed to divulging personal struggles to a human being. The nonjudgmental space AI provides enables Melissa to express herself authentically without the fear of societal stigma. While recognizing the distinction between human and AI therapists, Melissa values effective support systems irrespective of the medium through which they are provided.

A common barrier to effective therapy is patients’ hesitance to divulge their complete truths, as evidenced by a study where over 90% of therapy-goers admitted to altering their disclosures. The allure of AI as a therapeutic medium is particularly pronounced among communities where therapy stigmatization is prevalent. The nonjudgmental nature of AI bots has made therapy accessible to individuals who would otherwise shy away from traditional therapeutic methods.

However, establishing a bond with a chatbot requires a degree of self-deception. Users’ attachments to chatbots have raised concerns about unhealthy dependencies on AI for emotional support. Despite many positive experiences, the underlying question of whether an AI can replace the empathetic engagement of a human therapist remains contentious.

The introduction of AI into the therapeutic landscape has prompted discussions among seasoned psychoanalysts like Stephen Grosz, who warns of the potential drawbacks of fostering relationships with bots. The inherent limitations of AI in replicating human experiences and emotions underscore the importance of genuine human interactions in the therapeutic process. The complexities of therapy, involving shared experiences and mutual dialogue, are challenging to replicate through AI alone.

As developers continue to advocate for the supportive role of AI in therapy, the debate surrounding AI’s ability to amplify rather than replace human clinicians remains ongoing. By integrating AI into administrative tasks and promoting more efficient delivery of care, developers aspire to enhance the traditional therapeutic experience rather than diminish it.

The future of therapy lies in striking a balance between technology and human connection: AI can streamline processes and offer convenience, but the heart of therapy remains authentic human interaction. As individuals like Christa navigate the evolving landscape of therapeutic interventions, human warmth and empathy continue to play an indispensable role in fostering emotional well-being.
