
Apple has actually created ML chipsets and SDKs[0], so AI models can run natively, on-device.

They tend to lag the hypewave, letting the dust settle, before they move in.

[0] https://developer.apple.com/machine-learning/
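For context, the SDK in question is Core ML. A minimal sketch of on-device inference, assuming a compiled model bundled with the app (the file name "Classifier.mlmodelc" and the feature names "x"/"y" are hypothetical placeholders for your model's actual inputs and outputs):

```swift
import CoreML

// Locate the hypothetical compiled model bundled with the app.
let url = Bundle.main.url(forResource: "Classifier", withExtension: "mlmodelc")!

// Ask Core ML to use the Neural Engine / GPU when available, CPU otherwise.
let config = MLModelConfiguration()
config.computeUnits = .all

let model = try MLModel(contentsOf: url, configuration: config)

// Inputs are passed as an MLFeatureProvider keyed by the model's input names.
let input = try MLDictionaryFeatureProvider(
    dictionary: ["x": MLFeatureValue(double: 0.5)])
let output = try model.prediction(from: input)
print(output.featureValue(for: "y")?.doubleValue ?? .nan)
```

Everything runs locally; no network round-trip is involved, which is the privacy argument discussed below.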



Won't local inference mean slower responses and/or increased battery usage? Also storage space.

I wonder if it's worth it. Some here will say privacy but in my experience most users outside HN don't really care about that.


> in my experience most users outside HN don't really care about that.

In my experience, they very much do care, but don’t understand the ramifications of the convenience, pushed by … um … HN users.

In my experience, once I explain to them what kinds of risks they're taking, they tend to batten down the hatches.


Out of curiosity, what do you tell them?


I don't have a prepared script.

I basically fill them in on the kinds of stuff that we all take for granted.

But the folks I serve, with my software, tend to be a wee bit more tinfoil than most. It's a long story, and not one for this venue.


So is it safe to say, reading your last line, that it's not really most users outside HN but a specific group?


Absolutely.

I'm usually careful to couch things in terms of "It has been my experience," etc.



