
I don't think NLP is the problem here. Even if you cracked NLP perfectly and could somehow build a semantic map (or whatever your technology would use) of the user's request with perfect fidelity, there's still the question of actually understanding what it's about. Perfect NLP would help in a very limited number of situations where the meaning is clear - i.e. performing some pre-defined action, like closing an account or reporting the balance. But when somebody comes with a request like "my dog ate my credit card, so I used my debit one by mistake and it resulted in an overdraft and a penalty charge, can I have that reversed and get a new one?", I'm pretty sure a chatbot won't be able to handle it, NLP or not.


> there's still the question of actually understanding what it's about.

Philosophically interesting, but in practice entirely unimportant. What you want is something that speaks the way a human would speak in a similar environment; then the issue (getting the correct responses) is solved. Whether or not it actually understands won't change a thing about that situation.


Speaking nonsense like a human - i.e. being a perfect simulation of a mentally deficient human - is not going to help much. If you responded with perfectly grammatical sentences bearing no relation to the question at hand, at best what you'd get would be ELIZA (granted, 90% of tech support may be that); at worst, it'd be pissing off the clients.
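For reference, an ELIZA-style system works roughly like the sketch below: surface keyword patterns mapped to canned reply templates, with no model of what the words mean. (This is an illustrative sketch, not ELIZA's actual script; the patterns and replies are made up.)

    import re
    import random

    # Illustrative ELIZA-style rules: surface keyword -> canned reply template.
    # There is no representation of meaning, only pattern matching on the text.
    RULES = [
        (re.compile(r"\bcredit card\b", re.I),
         "I see this is about your credit card. Could you tell me more?"),
        (re.compile(r"\boverdraft\b", re.I),
         "Overdrafts can be frustrating. What would you like to happen next?"),
        (re.compile(r"\b(refund|reversed?)\b", re.I),
         "I understand you would like something reversed. Please hold on."),
    ]

    FALLBACK = ["Could you rephrase that?", "Tell me more about that."]

    def reply(user_text: str) -> str:
        """Return the first matching canned reply, or a content-free fallback."""
        for pattern, response in RULES:
            if pattern.search(user_text):
                return response
        return random.choice(FALLBACK)

    # The dog-ate-my-card request gets a grammatical but shallow answer:
    print(reply("My dog ate my credit card, so I used my debit one by mistake..."))

Matching on "credit card" yields a polite, grammatical reply that has nothing to do with the actual problem - which is exactly the failure mode described above.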



