Hacker News

I want to be fuzzy and I want the LLM to generate something exact.


Is this sarcasm? I can't tell anymore. Unless your ideas aren't actually new, this is just impossible.


Why? I want the LLM to understand my intent and build something exact. That already happens much of the time.
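
A minimal sketch of what that fuzzy-in, exact-out loop can look like, assuming the OpenAI Python SDK; the model name, prompts, and the vague request itself are illustrative, not anything from this thread:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # The user's request is deliberately fuzzy; the system prompt is what
    # pins down the exactness of the output.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Return only a runnable Python function, no prose."},
            {"role": "user",
             "content": "something that dedupes a list but keeps the original order"},
        ],
    )
    print(response.choices[0].message.content)  # an exact artifact from a fuzzy ask

How "exact" the result is depends almost entirely on how tightly the system prompt constrains the output format.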