
Does that even exist? It's basically what they described, just with an extra installation step: once you install it, you can select an LLM on disk and run it. That's what they asked for.

Maybe I'm misunderstanding something.



Apparently it does, though I'm also learning about it for the first time in this thread. Personally, I just run llama.cpp locally via docker-compose with AnythingLLM for the UI, but I can see the appeal of having it all run in the browser.

  https://github.com/mlc-ai/web-llm
  https://github.com/ngxson/wllama
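
For anyone curious what the in-browser approach looks like in practice, here's a minimal sketch using web-llm's OpenAI-style API, based on the first repo above. The model ID is illustrative (web-llm ships a list of prebuilt MLC models); check the repo's docs for the current API and model names:

  // Sketch: load a model with web-llm and run a chat completion
  // entirely in the browser (weights are downloaded and cached,
  // inference runs via WebGPU). Model ID below is illustrative.
  import { CreateMLCEngine } from "@mlc-ai/web-llm";

  async function main() {
    const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
      // Progress callback fires while weights download/compile.
      initProgressCallback: (report) => console.log(report.text),
    });

    // OpenAI-compatible chat completions interface.
    const reply = await engine.chat.completions.create({
      messages: [{ role: "user", content: "Hello! Summarize what WebGPU is." }],
    });
    console.log(reply.choices[0].message.content);
  }

  main();

The appeal is that nothing runs server-side: the first load pulls the weights into browser storage, and subsequent sessions reuse the cache.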


Oh, interesting. Well, TIL.



