
Something isn't adding up here... if this is supposed to be supplemental to therapy with an actual human, but instead the actual-human part is nixed and it's used as the entirety of therapy (i.e. outside of its prescribed and tested usage), are we not worried about the negative externalities of replacing human contact with machines to cure social ills?

That's not even touching where you've just pointed out that we're essentially now decreeing that poor people get to talk to machines and rich people get to talk to humans, or the fact that insurance companies will jump at the chance to force the cheap AI down everyone's throats before they pay for human therapy, etc etc...



The status quo is poor people talk to nobody and rich people talk to humans. This is an improvement, no?


No.

The status quo is poor people talk to social contacts (eg, friends, family, church community, etc) while rich people talk to professionals (eg, therapists).

This is a change in that poor people are being moved to a professionally manufactured tool, which isn't necessarily an improvement -- it's just replacing an established, ad hoc system with an unknown technical one.

There's every possibility that it would make the situation worse, and it's hubris on the part of the medical community to assume that a tool built by them is better than informal therapists.


> The status quo is poor people talk to social contacts (eg, friends, family, church community, etc) while rich people talk to professionals (eg, therapists).

For some people (particularly those who have large extended families), that is indeed the status quo, but for others not so much.

If your home environment is not supportive (you can imagine responses ranging from "everybody gets depressed sometimes, just work harder and things will get better" to "grow a pair"), or even abusive, and you aren't in a position to get professional help, a tool like this (though given various other concerns, perhaps not this specific one) could very well be the only alternative to no treatment at all.

What comes to mind, though, isn't an automated therapist per se, but something closer to the "Young Lady's Illustrated Primer" (from the novel The Diamond Age).


No.

Everyone talks to social contacts. CBT therapy is something entirely different.

Parts of CBT and DBT are skill training in emotional and cognitive behavioural regulation, for people who have not mastered those skills sufficiently on their own. You can teach these skills, and I'd argue there should be mandatory classes on them through middle school. IMHO, they are the most valuable skills you can learn, bar none.

>There's every possibility that it would make the situation worse and it's hubris on the part of the medical community that a tool built by them is better than informal therapists.

There's hubris in your assumptions about how therapy works, and about the tools it incorporates today. This application was released along with a study demonstrating its efficacy. It's not the first of its type - it's just the chatbot format of a self-help book. There will be much better versions, I'm sure, over time.


I think there's more to therapy than having good support or peers, although expressing yourself and having someone there to acknowledge you can undoubtedly be very cathartic.

You might talk with friends about what's bugging you, but they're not necessarily going to explore how your feelings relate to external factors, or ask questions to help you understand yourself. A therapist's job is very different from the role of a friend.


If the alternative is sequestering them into talking to an AI controlled by the people who get to talk to humans, then unequivocally no.


There is room for therapists, therapist-assisting applications, and standalone tools for therapy, just as there is today. This app is a replacement for self-help books, not therapists.

Moreover, CBT and DBT, which are where therapy is currently focused for most issues, involve a lot of mental re-tooling and training.

This part of therapy is like learning any other process; it is approached and practiced like any other skill.


Imagine that in the future a really smart shrink AI is invented, one that can provide useful insight into how an individual's mental health might be improved. Then everyone who has logs with this bot, or with similar technologies, will be able to benefit retroactively simply by having more digital data points.

Just because you're not talking with a human about the data you're generating _today_ doesn't mean it won't bring interesting discussions to bear in the future.


Those discussions will include "which products are depressed people most likely to purchase?".





