
This seems cool! Is there a way to try it locally with an open LLM? Providing a way to set the OpenAI server URL and other parameters would be enough. Is the API_URL server documented, so that a mock or local one can be created?

Thanks.
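The mock-server idea is straightforward if the proxy just speaks the public OpenAI chat-completions format. A minimal stdlib-only sketch of such a mock (the endpoint shape is the standard OpenAI one; nothing here is specific to Thread's proxy, and the canned reply is illustrative):

```python
# Minimal mock of an OpenAI-compatible /v1/chat/completions endpoint,
# using only the Python standard library. Useful for pointing a client's
# server URL at localhost during testing.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class MockChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Echo back a fixed assistant message in the OpenAI response shape.
        reply = {
            "id": "mock-1",
            "object": "chat.completion",
            "model": request.get("model", "mock-model"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": "Hello from the mock!"},
                "finish_reason": "stop",
            }],
        }
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve on a fixed port:
# HTTPServer(("127.0.0.1", 8000), MockChatHandler).serve_forever()
```

Any client that lets you override the OpenAI base URL could then be pointed at `http://127.0.0.1:8000/v1`.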



Definitely, it's something we're super focused on, since it seems to be an important use case for folks. Opening up the proxy server and adding local LLM support is my main focus for today; I'll hopefully update this comment when it's done :)


I just added the ability to run the proxy locally: https://github.com/squaredtechnologies/thread/commit/7575b99...

Will update once I add Ollama support too!


Ollama support would be amazing. There's a stack of people in organizations (data-rich places) who would likely love something like this, but who can't use OpenAI due to organizational policies.
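Once Ollama support lands, talking to it should look like talking to any OpenAI-compatible endpoint. A minimal sketch of building such a request with only the standard library (the default port 11434, the `/v1` path, and the model name "llama3" are assumptions for illustration, not details from this thread):

```python
# Sketch: build an OpenAI-style chat-completions request aimed at a
# local OpenAI-compatible server (e.g. a local Ollama instance).
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a POST to {base_url}/chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello!")
# urllib.request.urlopen(req) would send it once a local server is running.
```

Swapping the base URL between OpenAI and a local server is the whole trick, which is why a configurable server URL (as asked above) goes a long way.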


Awesome, thank you! I'll check it out.


From https://news.ycombinator.com/item?id=39363115 :

> https://news.ycombinator.com/item?id=38355385 : LocalAI, braintrust-proxy; [and promptfoo, chainforge]

(Edit)

From "Show HN: IPython-GPT, a Jupyter/IPython Interface to Chat GPT" https://news.ycombinator.com/item?id=35580959#35584069



