
Can this be aimed at ollama or some other locally hosted model? It wasn't clear from the docs, since their config examples seem to presume you want to use a third-party hosted API.


Given ollama has an OpenAI-compatible API [1], a quick search in the repo turns up this:

https://github.com/alvinunreal/tmuxai/issues/6#issuecomment-...

1: https://ollama.com/blog/openai-compatibility
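
For reference, here's a minimal sketch of that compatibility layer in use, via the official `openai` Python client pointed at ollama's local endpoint. It assumes ollama is running on its default port and you've already pulled `gemma3:4b`; the API key is a placeholder the client requires but ollama ignores:

```
# Quick sanity check against ollama's OpenAI-compatible endpoint.
# Assumes: `ollama serve` is running and `ollama pull gemma3:4b` is done.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # ollama's OpenAI-compatible endpoint
    api_key="dummy_key",                   # required by the client, ignored by ollama
)

resp = client.chat.completions.create(
    model="gemma3:4b",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(resp.choices[0].message.content)
```

If that round-trips, tmuxai's config just needs to point at the same base URL.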


Edit your `.config/tmuxai/config.yaml` to add these lines:

```
openrouter:
  api_key: "dummy_key"
  model: gemma3:4b
  base_url: http://localhost:11434/v1
```
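
Before restarting tmuxai, it can be worth confirming the `model:` value matches a model ollama actually serves. A short sketch, assuming the same local setup as above and that your ollama version exposes the `/v1/models` route (recent versions do):

```
# List models ollama exposes through its OpenAI-compatible API,
# to verify the name used in config.yaml (e.g. gemma3:4b) exists locally.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="dummy_key")
for model in client.models.list():
    print(model.id)
```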


I tried it and it didn't work too well. I suspect the prompts were optimized for Gemini, not local Gemma.

TBH I found the whole thing quite flaky even when using Gemini. I don't think I'll keep using it, although the concept was promising.


It's still an early release. I know it still needs improvement, but my hope is that the community can help.



