Are there any court cases or other rulings that have decided whether it's safe for developers to trust general/common output from LLMs? I'd probably be more efficient using various AI systems to write my code, but I'm afraid of a lawsuit over licensing.

Microsoft and JetBrains are both building this tooling into their IDEs with Copilot and AI Assistant, but I still worry (I'm a naturally over-cautious person).

edit: to be clear, I'll ask it questions, just like everyone and their dog; but any direct inline code completion, or a "write me a method in Java that will do X, Y and Z" prompt followed by copy-pasting the 10+ line result directly, is not something I do.
