
So couldn't SaaS or cloud/API LLMs offer this as an option: a guarantee that the "same prompt" will always produce the same result? (See the sketch at the end of this comment.)

Also, I usually interpret this "non-determinism" a bit more broadly.

Say I have slightly different prompts: "what's 2+2?" vs. "can you please tell me what's 2 plus 2", or even "2+2=?" or "2+2". For most applications it would be useful if they all produced the same result.
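
On the first point, a minimal sketch of what such an option could look like, assuming the OpenAI Python SDK: temperature=0 removes sampling randomness, and the seed parameter requests reproducible outputs, though OpenAI documents seed as best-effort rather than a hard guarantee. The model name is a placeholder.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def ask(prompt: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
            temperature=0,  # remove sampling randomness
            seed=42,        # best-effort reproducibility, per OpenAI's docs
        )
        return response.choices[0].message.content

    print(ask("what's 2+2?"))
    print(ask("what's 2+2?"))  # usually identical to the first call, but not guaranteed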



The form of the question determines the form of the outcome, even if the answer is the same. Asking the same question in a different way should yield an answer that follows the form of that question:

2+2 is 4

2 plus 2 is 4

4=2+2

4

Having the LLM pass the input to a tool (python) will result in deterministic output.
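
A minimal sketch of that idea: the LLM only has to extract the expression ("2+2") from whatever phrasing the user chose and hand it to a calculator tool; the tool itself is deterministic. The calculate function and the tool schema below are illustrative assumptions, not any particular provider's API.

    import ast
    import operator

    _OPS = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
    }

    def calculate(expression: str):
        """Deterministically evaluate a basic arithmetic expression, e.g. '2+2'."""
        def _eval(node):
            if isinstance(node, ast.Expression):
                return _eval(node.body)
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return node.value
            if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
            raise ValueError(f"unsupported expression: {expression!r}")
        return _eval(ast.parse(expression, mode="eval"))

    # Hypothetical function schema the model would be given so it can call the tool:
    CALCULATOR_TOOL = {
        "type": "function",
        "function": {
            "name": "calculate",
            "description": "Evaluate an arithmetic expression and return the result.",
            "parameters": {
                "type": "object",
                "properties": {"expression": {"type": "string"}},
                "required": ["expression"],
            },
        },
    }

    # However the question is phrased, the extracted expression is the same and
    # the tool's answer never varies:
    for phrasing in ["what's 2+2?", "2 plus 2?", "2+2=?", "2+2"]:
        print(phrasing, "->", calculate("2+2"))  # 4 every time

The wrapper text the model puts around the result can still vary with the question's form, but the computed value itself will not.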



