- cross-posted to:
- secularhumanism@sh.itjust.works
- globalnews@lemmy.zip
It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?
Bit of a catch-22, though, isn’t it? You want people to get better at doing those things, but they have to do those things in the first place to reach the people who can help them get better at it.
I see nothing wrong with having AI chatbots in addition to traditional therapists. As with many AI applications they’re at their best when they’re helping professionals to get more done.
I’m not sure why, but it looks like you posted this yesterday and it didn’t show up until an hour ago. Your instance may be having some issues.
I do get where you’re coming from with all that, but the act of going to therapy itself is an achievement a patient can benefit from, and should be considered from the start. If that truly isn’t possible for someone, voice calls from a real therapist are a reasonable next step.
Also, the original question was, “Can AI replace therapists?” I can see some meaningful benefits coming from an AI assisting a therapist, but that’s not what I was getting at. AI alone really just feels like a bandaid on a bullet wound, when applying pressure or a tourniquet is also available.
No, the original question is “can AI therapists do better than the real thing?” And yes, they can do better at specific things. That doesn’t make them a replacement, though.
Bandaids aren’t much use for a bullet wound, but they’re still good to have and useful in other situations. You wouldn’t use a tourniquet for a papercut.