Hacker News
naillo on Aug 2, 2023 | on: Run Llama 2 uncensored locally
First time I've heard of `ollama` but having tried it now for a bit I'm super impressed! This is what I've always wanted playing with LLMs locally to be, just pull weights like you would packages. It all just works. Really nice work :)