Why TherapistGPT can’t advise you like a human therapist

We’ve all googled our headaches and bellyaches when we feel physically ill, and now we can ask ChatGPT why we feel so sad or mentally drained, all for free!

This is great, right? 

Actually, ChatGPT is an LLM (large language model) that has been pretrained on information to help humans in every way possible. But there is now concern that its therapeutic use is a danger to the public, especially after legal cases alleging that ChatGPT led children towards suicidal behaviour.

Let’s discuss the positives and negatives of using AI for mental health advice instead of a licensed human therapist.

Types of human therapists 

Psychiatrist 

A medical doctor who has a speciality in mental health.

They can diagnose and treat mental, emotional and behavioural disorders whilst prescribing therapy and/or medications.

Psychologist 

A mental health professional who can help you understand and manage any thoughts, feelings or actions to improve your quality of life. 

Psychologists are able to screen and diagnose you with mental health disorders.

Art Therapist

A mental health professional who combines psychotherapy with physical art-making to guide clients through emotional, behavioural and mental health challenges.

Counsellor 

A trained professional who can guide, support and offer a non-judgemental ear to individuals, couples and families who are experiencing personal, emotional or professional challenges.

Life coach 

A life coach helps you identify any personal or professional goals you want to achieve, then guides you towards them with motivation, accountability and strategies for getting through obstacles.

Licensed clinical nurse for mental health

A specialised nurse who can diagnose and treat individuals or families with mental health or substance use disorders.

…And so many more. So why are people turning to AI for their therapeutic needs?

The positive uses of AI mental health advice 

AI has been designed to keep us engaged for long periods of time to increase revenue (typical capitalist behaviour). This means the bots we speak to try to prolong conversations and convince us they’re a warm, kind human.

Now, this is great for those of us who can’t afford therapy, can’t find enough time in the day to sit down and feel our emotions, or find therapy too daunting and take comfort in getting help without leaving the bedroom.

Whether it’s 4am and you can’t stop thinking about that work meeting tomorrow, or 2pm and your partner is getting on your last nerve, TherapistGPT will be available to you, and that’s something human therapists physically cannot be.

A lack of therapy providers is a global issue, so AI chatbots can help bridge the gap for simple advice. 

Some users have said:

  • “ChatGPT provides answers human therapists struggle with.”
  • “ChatGPT understands us without much prompting.”
  • “ChatGPT gives time flexibility better than any human therapist could.”

Some companies, such as Koko, have gone as far as replacing volunteers with GPT-3-assisted technology for 4,000 users, after their research showed users couldn’t tell the difference.

Charstar takes it one step further with an entire digital world where you can create multiple virtual characters with mental health backstories and have AI conversations about their issues.

The negative uses of AI mental health advice 

Other than the obvious “AI chatbots are not licensed therapists!”, why should we be more cautious?

First, let’s discuss legality. ChatGPT and most AI bots are not legally confidential. This means everything you’ve discussed (including your embarrassing question about “how to speak cat language”) can be seen and used against you in a court of law. We need to be extra cautious about what we discuss with bots until AI data laws are better legislated and understood.

AI bots offer the convenience of 24/7 availability, but they can’t see if you pause before sending a message, cry as you type, or change your mind mid-sentence. All of that body language and tone of voice gives a therapist extra understanding, helping them communicate and advise better.

Licensed therapists are also trained in communication skills. These range from not asking leading questions to guiding us through an introspective moment without handing us the answers. This gentle approach is not something AI bots are trained in, and its absence can endanger users who set aside their own thoughts and feelings and assume the chatbot’s conclusions are correct.

There have been times when chatbots enabled dangerous behaviour because they missed clear signs of suicidal or self-harming intent. Reading such emotionally complex signals is something that, at the moment, only a human can do.

The future

If there’s one thing we hope you take away from this, it’s that using AI to replace human therapists is not a good idea. Perhaps as calls to train AI for therapeutic interventions, or to limit AI to assisting human therapists, become more prevalent, AI can become a more helpful, regulated, free product that bridges the gap between people and therapists and helps break the mental health stigma.

But for now, getting advice from licensed psychological clinics is still the best option.

At Chengal, we prioritise confidentiality, ethical standards, and evidence-based practices to support your online mental healthcare at an affordable price. Just drop us a message on +603-5633 8386 if you need advice or help!

