
Not condoning it or anything, but perhaps the thinking is that, if the user re-enables Siri at a later date, they don't want Siri to start with no memory?


Why would you think that?

If I enable some personal assistant at some point in time, I absolutely do expect it to start with no memory.


If/when a user actively consents to "learn from app", it's no different from setting up a new device, e.g. mail downloaded from an IMAP server, or data transferred from an old device or from cloud services.

Now imagine an EULA for Helpful Pre-Stalking...



