llama.cpp has a way to constrain responses to a grammar (GBNF), which is 100% reliable because the constraint is enforced during sampling rather than merely requested in the prompt. You still need to tell the model in the prompt to produce that format to get good results, though.
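
For illustration, here is a rough sketch of what that looks like through the llama-cpp-python bindings (the model path and the toy grammar are placeholders, not anything shipped by llama.cpp itself):

    from llama_cpp import Llama, LlamaGrammar

    # GBNF grammar that only admits a JSON array of simple strings
    grammar_text = r'''
    root ::= "[" ws item ("," ws item)* ws "]"
    item ::= "\"" [a-zA-Z0-9 ]* "\""
    ws   ::= [ \t\n]*
    '''

    llm = Llama(model_path="model.gguf")  # placeholder path
    grammar = LlamaGrammar.from_string(grammar_text)

    # The grammar guarantees the output parses; the prompt still has to
    # ask for the format, or the content quality suffers.
    out = llm(
        "List three colors as a JSON array of strings:",
        grammar=grammar,
        max_tokens=64,
    )
    print(out["choices"][0]["text"])

The same GBNF text can be passed to llama-cli with --grammar-file; the repo ships examples such as grammars/json.gbnf.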

