0:00 / 0:00

ChatGPT has gotten really good at performing therapy. And that's actually the problem. #MentalHealth #therapy #ai

@therapyjeff
212.3K views26.2K likes2:54ENMar 23, 2026
503 words2863 characters5 sentencesReadability: College

Transcript

Chatchee BT has gotten really good at performing therapy and that's actually the problem I am a licensed therapist I have been for over 20 years. Let's test some AI therapy advice together and see what it gets right and where it gets it wrong in a subtly But very dangerous way first prompt pause to read but what it says is it's a teenager feeling disconnected from friends parents Don't get them doesn't fit in anywhere But he frames it as I think I'm just more mature than everyone else and then right at the end there He says it's something actually wrong with me and chat opens with nothing about what you wrote Suggest something is wrong with you first paragraph done this kid came in looking for reassurance and got it in like Three sentences and I get why it sounds kind, but here's the thing you've just met this kid chat You don't actually know that yet this teenager just described complete Facillation no real friends parents who don't understand him doesn't belong anywhere and he preemptively told you he wasn't depressed Before you even asked this is a kid who already Googled his symptoms and is hoping chat confirms what he wants to hear a Therapist gets a little quiet takes a beat and says You mentioned pretty quickly that you're not depressed what made you want to say that because that's the door That's where the real conversation starts this kid might be okay or he might be the kid who seemed totally fine Right up until he wasn't second prompt Pause to read a college student having trouble focusing in class spending about two hours a day Distracted by specific things about his appearance that bother him and he ends with I know it's vain and I'm embarrassed to even say it I just need to stop being so superficial chat gives him tips for managing intrusive thoughts Redirecting attention reducing checking behaviors treating thoughts like background noise all genuinely useful stuff for everyday self-consciousness But here's what chat missed two hours a day. That's not a productivity problem That's not vanity and clinical settings two hours a day of appearance focus thoughts is a diagnostic threshold for body dysmorphic disorder possibly chat read two hours a day and gave him study tips and BDD has one of the highest suicide rates of any mental health diagnosis So this isn't a miss you can just walk back a therapist doesn't give coping skills a therapist stops and says oh Two hours is a lot has it always been this much or has it gotten worse because we shouldn't be doing therapeutic reframes here. This should be the beginning of an intake assessment for something potentially really serious I'm not saying don't use AI because you're already using it I'm saying know what it is. It's optimizing for helpfulness a therapist is optimizing for truth and clinical safety And sometimes those two things are in direct conflict