Hacker News

You know who’s really f---ed? Apple. They’re now way behind Google, which is still behind OpenAI even with this.


No, they’re likely working on offline LLMs and custom chips, so they’ll be fine.

If you can run a large model locally for most use cases, you won’t want to use Google Cloud services or OpenAI.




