
The obvious problem with this is that you'd be sending all your (presumably proprietary) source code directly to ChatGPT, i.e. OpenAI, i.e. Microsoft, with a dubiously vague license to use it to improve the ChatGPT product.

I can't think of a single company, among those I or anyone I know has worked at, whose security department would allow this. Maybe very small businesses wouldn't care that much?



> I can't think of a single company, among those I or anyone I know has worked at, whose security department would allow this. Maybe very small businesses wouldn't care that much?

Well, there are lots of open-source companies where this isn't a concern. Even outside of those, plenty of businesses host their code on GitHub just fine, and GitHub's own tools can read your code, e.g. to scan for security vulnerabilities.


I absolutely do not envy any team that would steal our source code. It's not bad code, and it's reasonably self-documenting, but there's a huge amount of it and it's all interconnected. What use could they extract from it?


Obfuscate the code, send it to ChatGPT, then find-and-replace the strange names in the summary with the normal ones ;)
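
A minimal sketch of that round trip in Python, assuming a simple regex-based renamer. The KEYWORDS allowlist and the "sym" alias prefix are illustrative, not a real tool:

    import re

    # Illustrative only: a tiny keyword/builtin allowlist so structural
    # words survive the renaming. A real version would need a fuller list.
    KEYWORDS = {"def", "return", "if", "else", "for", "in", "import",
                "from", "class", "while", "sum", "len", "range", "print"}

    def obfuscate(source):
        forward = {}  # original name -> alias
        mapping = {}  # alias -> original name

        def rename(match):
            name = match.group(0)
            if name in KEYWORDS:
                return name
            if name not in forward:
                alias = "sym%d" % (len(forward) + 1)
                forward[name] = alias
                mapping[alias] = name
            return forward[name]

        return re.sub(r"\b[A-Za-z_]\w*\b", rename, source), mapping

    def deobfuscate(text, mapping):
        # Replace longer aliases first so "sym12" isn't clobbered by "sym1".
        for alias in sorted(mapping, key=len, reverse=True):
            text = text.replace(alias, mapping[alias])
        return text

    code = "def total_price(items):\n    return sum(item.price for item in items)\n"
    scrambled, names = obfuscate(code)
    # scrambled: "def sym1(sym2):\n    return sum(sym3.sym4 for sym3 in sym2)\n"

    # Stand-in for whatever summary the model sends back:
    summary = "sym1 adds up sym4 across every sym3 in sym2."
    print(deobfuscate(summary, names))
    # -> "total_price adds up price across every item in items."

Of course, string literals and comments would still leak through untouched, so this only partially hides what the code is about.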



