Can you host that model locally with ollama?


I haven't seen a GGUF for it yet; I imagine one will show up on Hugging Face soon, and that will probably work with Ollama.
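For anyone who wants to try it once a GGUF lands: the usual route is to download the file and point an Ollama Modelfile at it. A rough sketch - the file and model names below are just placeholders:

    # Contents of a file named "Modelfile" (one line, pointing at the download):
    #   FROM ./some-model-Q4_K_M.gguf
    #
    # Then register and run it:
    ollama create some-model -f Modelfile
    ollama run some-model

Recent Ollama releases can also pull a GGUF straight from Hugging Face with something like "ollama run hf.co/<user>/<repo>", which skips the Modelfile step entirely.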


Do you think local LLMs combined with P2P networks could become a thing? Imagine people adding datasets to an open model, the same way they add blocks to a blockchain, which is around 500GB in size.

It could help decentralise power and reduce our dependency on the big players.


There have been ambitions to do that kind of thing with LoRA - see the leaked "no moat" Google memo from a couple of years ago for one example: https://simonwillison.net/2023/May/4/no-moat/

It hasn't really happened though. I suspect that's because techniques like RAG or tool calling turn out to be massively easier and more effective than trying to teach models new information through shared model weights.
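To make the "easier" part concrete, here's a minimal RAG sketch against a local Ollama server: the new information lives as plain text and gets stuffed into the prompt at query time, no weight updates involved. The documents, the "llama3" model name and the localhost:11434 endpoint are all assumptions for illustration:

    # Minimal RAG sketch: inject new facts via the prompt, not the weights.
    # Assumes an Ollama server on localhost:11434 with a model already pulled.
    import requests

    documents = [
        "Acme Corp's support line is open 9am-5pm CET.",  # made-up facts the
        "Acme Corp was founded in 2019 in Oslo.",         # base model never saw
    ]

    def retrieve(question: str) -> list[str]:
        # Toy retrieval: keep documents sharing any word with the question.
        # A real system would use embeddings and a vector index instead.
        words = set(question.lower().split())
        return [d for d in documents if words & set(d.lower().split())]

    def answer(question: str) -> str:
        context = "\n".join(retrieve(question))
        resp = requests.post(
            "http://localhost:11434/api/chat",
            json={
                "model": "llama3",  # placeholder model name
                "stream": False,
                "messages": [
                    {"role": "system",
                     "content": f"Answer using only this context:\n{context}"},
                    {"role": "user", "content": question},
                ],
            },
        )
        return resp.json()["message"]["content"]

    print(answer("When was Acme Corp founded?"))

A real setup would swap the toy keyword match for embeddings and a vector store, but the shape stays the same: retrieve, then prompt. That loop is much simpler to build and share than coordinating weight updates across strangers' machines.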



