
Words have valence, and valence reflects the user's emotional state. This model appears to understand that better, and it responds as if it's in a therapeutic conversation rather than composing an essay or article.

Perhaps they are/were going for stealth therapy-bot with this.
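"Valence" here is measurable in the NLP sense: lexicon-based scorers assign words signed valence values and aggregate them per utterance, which is presumably the kind of surface signal a model can pick up on. A minimal sketch, assuming the third-party vaderSentiment package (not anything the model above is confirmed to use):

    # Assumes: pip install vaderSentiment (a lexicon-based valence scorer).
    from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

    analyzer = SentimentIntensityAnalyzer()
    for text in ["I'm doing great, thanks.", "I feel hopeless and alone."]:
        # "compound" is a normalized valence score in [-1, 1].
        print(text, "->", analyzer.polarity_scores(text)["compound"])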



But there is no actual empathy; that isn't possible.


But there is no actual death or love in a movie or book, and yet we react as if there is; that's literally what calling a movie a "tear-jerker" means. I wanted to see Saving Private Ryan in theaters to bond with my Grandpa, who received a Purple Heart in the Korean War, and I was shut down almost instantly by my family.

All special effects and no death, but he had PTSD, and one night he thought his wife was the North Koreans and nearly choked her to death during a flashback, because she had come into the bedroom quietly so he wouldn't be disturbed. An extreme example, yes, but having him lose his shit in public over something analogous is, for some, near enough that it makes no difference.


You think that it isn’t possible to have an emotional model of a human? Why, because you think it is too complex?

Empathy done well seems like a 1:1 mapping at an emotional level, but that doesn't imply to me that it couldn't be done at a different level of modeling. Empathy can also be done poorly, and then it's just projection.


Simulating empathetic interaction via computer systems has not only been possible, but demonstrably achievable, for close to sixty years[0].

0 - https://en.wikipedia.org/wiki/ELIZA
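For the curious, the core of ELIZA is just keyword matching plus first/second-person "reflection"; there is no model of emotion at all. A toy sketch of that loop (not Weizenbaum's actual DOCTOR script, which used ranked keywords and decomposition rules):

    import random
    import re

    # Swap first/second person so replies point back at the speaker.
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
                   "you": "I", "your": "my"}

    # (pattern, candidate replies) pairs, checked in order; the catch-all
    # final rule stands in for ELIZA's content-free fallback responses.
    RULES = [
        (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (r"i am (.*)", ["Why do you say you are {0}?"]),
        (r"(.*)", ["Please go on.", "How does that make you feel?"]),
    ]

    def reflect(fragment):
        return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

    def respond(text):
        for pattern, replies in RULES:
            m = re.match(pattern, text.lower())
            if m:
                return random.choice(replies).format(*(reflect(g) for g in m.groups()))

    print(respond("I feel nobody understands me"))
    # e.g. "Why do you feel nobody understands you?"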


I don’t think it’s possible for 1s and 0s to feel… well, anything.



