AI Shouldn’t Be Your Coach, Therapist, or Mentor...
I get it. It’s tempting, isn’t it?
You’re feeling stuck, overwhelmed, or in need of guidance, perhaps telling yourself “it’s not bad enough for professional help” or “I can’t afford the kind of support I’d like”, and there it is: the AI bot asking if you need help with anything, ready to offer advice, 24/7, without judgment.
You might even be fooled into thinking it’s like having a coach, therapist, or mentor in your pocket, always there to help and support you, and I wouldn’t blame you if that’s where your head is right now.
AI in the News
AI is definitely a hot topic at the moment, and I’m used to seeing conversations and debates about the ethics and environmental impact behind it, with headlines like these just a few I’ve seen in recent weeks:
Google reported a 48% increase in greenhouse gas emissions since 2019, attributing the surge to energy consumption by AI data centers.
Training AI models consumes significant water resources. For instance, training GPT-3 in Microsoft's U.S. data centers can evaporate approximately 700,000 liters of clean freshwater.
Over 400 artists, including Elton John and Paul McCartney, have criticized the UK government's proposed changes to copyright laws, which would allow AI companies to use copyrighted material without permission.
Getty Images is pursuing legal action against Stability AI for allegedly using millions of its images without consent to train AI models.
There’s a lot of conversation right now about the negative impact of AI, and don’t even get me started on the whole “dumbing down of society” debate.
But seriously, think about it for a second: Who actually does math in their head anymore? It’s way too easy to reach for your phone, open the calculator, and let it do the work, even though you probably could figure it out yourself.
And that’s the point. If AI can write your emails, craft your social posts, and polish your website copy, it’s not hard to imagine a world where most of what we read has been filtered through a machine, a version of our voice that’s been edited, sanitised, and stripped of its realness. What happens when we stop trusting ourselves to write, to express, to create, without asking a bot to “make it sound better”?
What happens when we hand over our creativity, our communication, and our critical thinking in exchange for convenience? We’re not just outsourcing tasks, we’re slowly outsourcing our voice.
AI as your Therapist?
But that isn’t what this article is about. When I think about AI and how it might affect our lives moving forward, one conversation I never expected to be having was people telling me they’re using AI as their therapist.
And I use the word therapist loosely here, because whether you call it a coach, mentor, guide, or healer, the core intention is the same: You’re looking for support. You’re trying to shift something deep inside that’s been holding you back.
But is it really right, or even safe, to use artificial intelligence to unravel the mindset blocks, emotional patterns, and nervous system responses that have been keeping you stuck? In my mind the answer is so clearly a NO, it is not safe. And then it got me thinking...
Are we really at a place where AI feels like the only option? Where it’s seen as “better than nothing”? Where past experiences with therapy or mentorship left us so disillusioned that a chatbot feels more reliable? (Yes, these are all real things people have said to me recently.)
Maybe some of us have, but I want you to hear this: Relying on AI for personal development or emotional healing can be more harmful than helpful.
Would You Trust Google With a Diagnosis?
Let’s be real…
Would you Google your symptoms and take health advice from a random website on page one of the search results? I really hope the answer is no. Sure, we’ve all done the “Google spiral” once or twice (myself included), but we know better than to take that as gospel. Why? Because we understand that Google is simply scanning the internet for articles and information it thinks might help.
It doesn’t know your medical history. It can’t assess your symptoms in real time. It doesn’t get to look you in the eye, take your pulse, or feel the subtleties of what’s really going on. So what do we do? We book an appointment with a doctor. We speak with a specialist. We go to a pharmacist.
So tell me this… why, when it comes to your mental and emotional health, would you hand it over to AI? I recently heard someone say, “I’m basically getting therapy from ChatGPT,” and my jaw honestly hit the floor.
My question back was this: Do you really trust a bot with your trauma?
This isn’t shade, it’s a wake-up call.
Because here’s the thing: AI is doing exactly what Google does. But instead of giving you a list of sites to explore, it pulls from those sources and serves you a neatly packaged answer it thinks fits your question.
Let me make this clear, it is responding to the question YOU ask it.
AI is only as good as the prompt it’s given. So if you’re stuck in a loop, asking the same questions you’ve been spinning on in your head, how do you expect it to lead you out? At best, it might give you a few mindset hacks or a motivational quote. At worst, it gives you surface-level advice that sounds good but completely misses the root of what’s going on.
That’s not transformation. That’s slapping a fresh bandage over a wound that needs stitches.
The Illusion of Understanding
AI is trained on massive amounts of text. It can sound like it gets you. But it doesn’t know you. It doesn’t feel. It doesn’t intuit. It doesn’t recognise when your energy shifts mid-sentence or when you’re holding back tears. It can’t read your body language or track the nuance in your silence.
But a trained, trauma-informed coach or therapist can. That’s where real healing begins. In the pause. In the subtle. In the space between what you say and what you mean.
Misinformation in Disguise
Let’s talk hallucination. Not the kind you’re thinking of: AI hallucination. This is when ChatGPT makes things up that sound legitimate but aren’t. It can’t always discern fact from fiction, especially when asked for emotional support, mental health guidance, or trauma navigation.
You might ask it how to handle your anxiety… and it gives you surface-level, generic advice pulled from random corners of the internet. No trauma sensitivity. No clinical nuance. Just an echo of words with no weight behind them.
I feel like I shouldn’t need to spell out how dangerous this can be, but let’s take a simple example…
Say someone is experiencing burnout, not just tired, but full-blown, nervous-system-shot, can’t-get-out-of-bed burnout. They ask AI, “How do I stay motivated in my business?” And the response?
“Push through.” “Wake up earlier.” “Set bigger goals.” “Discipline over motivation.” It sounds polished. Maybe even inspiring. But it’s completely missing the point. Because what that person actually needs is nervous system repair, not another productivity hack. They need rest, support, and probably a hard look at their boundaries, not to be told to grind harder.
That’s the danger.
When you’re in a vulnerable space, generic advice can reinforce the very patterns you’re trying to break. It can push you further from your truth instead of helping you come home to it.
Your Story Deserves More Than a Script
Every woman I work with has a different story. Different trauma. Different energy. Different safety cues. Different emotional thresholds. None of that can be captured by an AI. And when we outsource our healing to a tool that doesn’t know our nervous system, our triggers, or our truth, we bypass the real work.
This is why I combine NLP, Timeline Therapy, and EFT. Because real transformation lives in the body. In the somatic experience. In relationship.
AI as a Tool, Not a Therapist
Look, I’m not anti-AI. I use it. It’s brilliant for streamlining, ideating, and organising thoughts. But healing? Embodiment? Transformation? That’s a human job. Your mental health is sacred. It deserves a real connection. A safe space. A witness. Someone trained not just to respond, but to hold.
I get that it’s convenient. It’s easy. It feels better than nothing. But when it comes to your healing, please don’t settle for “better than nothing.”
Don’t let a machine hold your heartbreak. Don’t give your power to a program trained to predict, not to understand. You deserve more. You are more.
So close the laptop. Put your face in the sun. Reach out to someone real. Let the next step of your healing be rooted in human connection.
Stay grounded. Stay human.
Kayleigh