
Listen to the audio examples. The person at the restaurant misheard the bot and repeated incorrect information back to it, and the bot took the correction in stride.


Should I be surprised that it happened to work in the example they decided to showcase in the release announcement?

I'm afraid I foresee this whole thing going hilariously wrong on the scale of Microsoft's Tay https://en.wikipedia.org/wiki/Tay_(bot)


I just hope it never gets one of the 9 out of 10 humans who don't repeat back important information to prevent miscommunication...


How does that differ from a human having the same conversation?

If I call a restaurant and say "Can I have a reservation for 6 at 8:00" and they write down a reservation for 8 at 6:00 without repeating it back, I won't know until I show up at 8:00 with my 5 friends.


True -- I was mainly snarking about people with poor communication practices, not the technology (which seems great -- a godsend for people like me who hate phone communication).


Those humans won't be taking reservations for long, even from other humans, if they don't learn to repeat back the important information.


How would the bot fare worse than a normal human in that situation?





