No, they are likely working on offline LLMs and custom chips, so they'll be fine.

If you can run a large model locally for most use cases, you won't want to use Google Cloud services or OpenAI.


