
Yes, the future is in making a plethora of hyper-specialized LLMs, not a sci-fi assistant monopoly.

E.g., I'm sure people will pay for an LLM that plays Magic the Gathering well. They don't need it to know about German poetry or Pokemon trivia.

This could probably be done as LoRAs on top of existing generalist open-weight models. Envision running this locally and having hundreds of LLM "plugins", à la phone apps. A rough sketch of what that could look like is below.
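
For a concrete picture: here's a minimal sketch of the "plugin" idea using Hugging Face transformers plus peft, where each LoRA adapter is a named, swappable add-on to one local base model. The base model ID and adapter paths are illustrative placeholders, not real repos.

    # Minimal sketch: LoRA adapters as swappable "plugins" on one local base model.
    # Assumes transformers + peft; model ID and adapter paths are hypothetical.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "meta-llama/Llama-3.1-8B-Instruct"  # any open-weight generalist model
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")

    # Attach one adapter, then register more under their own names.
    model = PeftModel.from_pretrained(base, "adapters/mtg-deckbuilder", adapter_name="mtg")
    model.load_adapter("adapters/german-poetry", adapter_name="poetry")

    # Switch "plugins" at runtime, like launching a different app.
    model.set_adapter("mtg")
    prompt = "Suggest a sideboard plan against mono-red aggro."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    print(tokenizer.decode(model.generate(**inputs, max_new_tokens=200)[0]))

Since adapters are small relative to the base weights, keeping hundreds of them on disk and hot-swapping the active one is cheap compared to shipping hundreds of full fine-tuned models.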
