
> The biggest risk is with privacy

No, the biggest risk is that it behaves in ways that actively harm users in a fragile emotional state, whether by enabling dangerous behavior or pushing them into it.

Many people are already demonstrably unable to handle normal AI chatbots in a healthy manner. A "therapist" substitute that takes a position of authority as a counselor ramps that danger up drastically.

You’re saying that as if AI is a singular thing. It is not.

Also, for every naysayer I encounter now, I’m going to start by asking: “Have you ever been to therapy? For how long? Why did you stop? Did it help?”

Therapy isn’t a silver bullet. Finding a therapist who works for you can take years of patient trial and error.
