https://ollama.com lets you run models on your own hardware and serve them over a network. Then you point your editor at that server, e.g. https://zed.dev/docs/ai/configuration#ollama
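A minimal sketch of what that looks like, assuming a machine on your LAN reachable at `192.168.1.50` (hypothetical address) and the model name `llama3.2`:

```shell
# On the server machine: bind Ollama to all interfaces instead of
# the default localhost-only 127.0.0.1:11434, then fetch a model.
OLLAMA_HOST=0.0.0.0 ollama serve &
ollama pull llama3.2

# From any other machine: verify the server is reachable.
curl http://192.168.1.50:11434/api/tags
```

Then in Zed's settings.json you'd point the Ollama provider at that host (shape per the Zed docs linked above; adjust the URL to your server):

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://192.168.1.50:11434"
    }
  }
}
```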