
I don't know how productive I'm being, but I'm using Llama3 via Ollama on an M1 Mac. It's as good as Copilot and Gemini for most things, and I'll reach for those models if I need a little bit more. I prefer the privacy of the local models. I use it both through the command line and through the Open WebUI web interface, for programming tips, learning, research, and writing. As a simple example, I wrote a (reusable) prompt for Chicago-style title capitalization a few minutes ago. Normally I'd have to search for a web-based tool and then wade through the crap; it's much quicker to ask a local LLM.
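
In case it's useful, here's a minimal sketch of what that kind of reusable prompt looks like when called from a script rather than the command line or Open WebUI. It assumes Ollama is running locally on its default port (11434) with llama3 pulled; the prompt wording and the title_case function name are just illustrative:

    # Minimal sketch: send a reusable title-casing prompt to a local Ollama server.
    # Assumes Ollama is running on the default port (11434) and "llama3" is pulled.
    import json
    import urllib.request

    PROMPT_TEMPLATE = (
        "Convert the following title to Chicago-style title capitalization. "
        "Reply with only the converted title.\n\nTitle: {title}"
    )

    def title_case(title: str) -> str:
        payload = json.dumps({
            "model": "llama3",
            "prompt": PROMPT_TEMPLATE.format(title=title),
            "stream": False,  # return one JSON object instead of a token stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"].strip()

    print(title_case("the quick brown fox jumps over the lazy dog"))

The same idea works as a one-liner with `ollama run llama3 "<prompt>"` in the terminal; the script version just makes the prompt easy to reuse.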

