
Could you describe the tweak you made, and possibly your general setup for using Zed with LM Studio? Do you use a custom system prompt? What context size do you use? Temperature? Thanks!


Here is how my prompt ended up! https://gist.github.com/hbradio/2f504c3fdb6f7113181b2d8c6862... I just asked an LLM to make it similar to a working Qwen prompt.

To let the LLMs use tools, I had to configure them in Zed's settings like this: https://gist.github.com/hbradio/fa4b456658a8d250e6ccc69ae9b3...
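
For reference, the gist boils down to something like this in Zed's settings.json. The model name and token limit below are just placeholders, and the field names are from my reading of Zed's LM Studio provider settings, so double-check against the gist:

    {
      "language_models": {
        "lm_studio": {
          // Default LM Studio endpoint; change it if you moved the port
          "api_url": "http://localhost:1234/api/v0",
          "available_models": [
            {
              // Placeholder model; use whatever you've pulled in LM Studio
              "name": "qwen2.5-coder-7b-instruct",
              "display_name": "Qwen 2.5 Coder 7B",
              // Should match the context size you set in LM Studio
              "max_tokens": 32768,
              // This is what lets Zed's agent call tools with the model
              "supports_tools": true
            }
          ]
        }
      }
    }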

Also, I had to go into LM Studio and increase the max context size for each model I wanted to use in Zed; otherwise you get a parsing error on the response. I set it to the maximum allowed value.
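
I did this through the LM Studio UI, but if you prefer the bundled `lms` command line, I believe the equivalent is loading the model with a bigger context window, roughly like this (flag name from my memory of the CLI help, model name and size are placeholders, so treat it as a sketch):

    # Load a model with a larger context window instead of clicking
    # through the LM Studio UI; model name and size are placeholders.
    lms load qwen2.5-coder-7b-instruct --context-length 32768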

I start LM Studio, start the LM Studio server, then go to Zed's AI config and tell it to connect to LM Studio. I put it in Agent mode, and it seems to work!
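
If you'd rather script that startup than click through the app, something like this should get things into the same state before pointing Zed at it (again a sketch, assuming the `lms` CLI is on your PATH and the server is on its default port):

    # Start LM Studio's local OpenAI-compatible server (default port 1234)
    lms server start
    # Check which models are currently loaded
    lms ps
    # Then point Zed's LM Studio provider at http://localhost:1234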

I don't know much about temperature, and I didn't use any other system prompt.

Good luck!



