Literally every single person in my life is using AI on an almost daily basis. Not just my co-workers or some nerdy Discord chat; every single person I regularly interact with, from my niece to my mother, my boss to my barista.

People complain about it, its shortfalls and idiosyncrasies, but it's only been getting better, both the models and the integration.

There is no future now where LLMs aren't playing a big role. We'll always have the CLI luddites who believe computing peaked in 1992, but the rest of society is running full speed towards computers they can talk to in natural language.

That's why Apple is uneasy. The god-tier technology usability company is on the verge of totally missing out on the greatest revolution in human-computer usability ever. My mother isn't going to want an Apple UI anymore when you can just talk to the new computers.

What are they doing with AI besides improved search, if they aren’t developers?

According to OpenAI, ~4% of tokens generated are for software.[1]

LLMs are not just a tech tool. SWE is practically a fringe use of LLMs.

See page 16: [1] https://www.nber.org/system/files/working_papers/w34255/w342...

I've seen product manager type people use LLMs to auto-generate product requirements docs. I know people who use AI to separate instruments from music tracks in order to learn them.

I guess I should have been more precise with the actual question: what do people do with AI that would leave Apple behind, as long as they can access chat from their phone? Apple doesn’t need an “AI strategy” just to be able to run a third-party chat interface.

What device would she use instead, though, and how are they doing AI any more meaningfully than an app on that device could?
