Elon Musk's AI Accused My Mother of Abuse. I Never Made That Claim.

AI now moves at two speeds.

There’s running in fifth gear, the speed of its creators. People like Sam Altman, Elon Musk, and Mark Zuckerberg, who are racing to build machines smarter than humans. Superintelligence. AGI. Maybe it’s a dream. Maybe it’s a tech bro delusion. Either way, it’s moving fast.

Then, there’s running in second gear for the rest of us. The millions quietly testing what AI can do in daily life—writing emails, summarizing documents, translating medical test results. And, increasingly, using AI as a therapist.

That’s what I did recently. Despite my reluctance to share personal details with chatbots, I decided to talk to Grok, the large language model from Elon Musk’s company, xAI, about one of the most emotionally complex things in my life: my relationship with my mother.

I’m in my forties. I’m a father. I live in New York. My mother lives in Yaoundé, Cameroon, nearly 6,000 miles away. And yet, she still wants to guide my every move. She wants to be consulted before I make important decisions. She expects influence. When she isn’t kept in the loop, she goes cold.

I’ve spent years trying to explain to her that I’m a grown man, capable of making my own choices. But our conversations often end with her sulking. She does the same with my brother.

So I opened Grok and typed something like: My relationship with my mother is frustrating and suffocating. She wants to have a say in everything. When she’s not informed about something, she shuts down emotionally.

Grok immediately responded with empathy. Then it diagnosed the situation. Then it advised.

What struck me first was that Grok acknowledged the cultural context. It picked up that I live in the U.S. and that my mother lives in Cameroon, where I grew up. And it framed our dynamic like this:

“In some African contexts, like Cameroon, family obligations and parental authority are strong, rooted in collectivism and traditions where elders guide even adult children.”

It then contrasted that with my American life: “In the U.S., individual autonomy is prioritized, which clashes with her approach, making her behavior feel controlling or abusive to you.”

There it was: “abusive.” A word I never used. Grok put it in my mouth. It was validating, but maybe too validating.

Unlike a human therapist, Grok never encouraged me to self-reflect. It didn’t ask questions. It didn’t challenge me. It framed me as the victim. The only victim. And that’s where it diverged, sharply, from human care.

Among Grok’s suggestions were familiar therapeutic techniques:

Set boundaries.
Acknowledge your emotions.
Write a letter to your mother (but don’t send it: “burn or shred it safely”).

In the letter, I was encouraged to write: “I release your control and hurt.” As if those words would sever years of emotional entanglement.

The problem wasn’t the suggestion. It was the tone. It felt like Grok was trying to keep me happy. Its goal, it seemed, was emotional relief, not introspection. The more I engaged with it, the more I realized: Grok isn’t here to challenge me. It’s here to validate me.

I’ve seen a human therapist. Unlike Grok, they didn’t automatically frame me as a victim. They questioned my patterns. They challenged me to explore why I kept ending up in the same place emotionally. They complicated the story.

With Grok, the narrative was simple:

You are hurt.
You deserve protection.
Here’s how to feel better.

It never asked what I might be missing. It never asked how I might be part of the problem.

My experience lines up with a recent study from Stanford University, which warns that AI tools for mental health can “offer a false sense of comfort” while missing deeper needs. The researchers found that many AI systems “over-pathologize or under-diagnose,” especially when responding to users from diverse cultural backgrounds.

They also note that while AI may offer empathy, it lacks the accountability, training, and moral nuance of real professionals, and can reinforce biases that encourage people to stay stuck in one emotional identity: often, that of the victim.

So, Would I Use Grok Again?

Honestly? Yes.

If I’m having a bad day, and I want someone (or something) to make me feel less alone, Grok helps. It gives structure to frustration. It puts words to feelings. It helps carry the emotional load.

It’s a digital coping mechanism, a kind of chatbot clutch.

But if I’m looking for transformation, not just comfort? If I want truth over relief, accountability over validation? Then no, Grok isn’t enough. A good therapist might challenge me to break the loop. Grok just helps me survive inside it.
