>GPT and other LLMs don't allow you to use their output to train competing models

The ToS is unenforceable and irrelevant to anyone who's actually in this space



That seems mostly right, particularly for internal models, but I wonder about adding some ringers to prove that copying happened:

https://en.m.wikipedia.org/wiki/Trap_street
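The trap-street idea can be sketched in a few lines. This is a hypothetical toy, not any provider's actual method: the names (`make_canary`, `SECRET_KEY`, the `zqv-` prefix) are all made up for illustration. The provider derives a unique nonsense marker per API session, splices it into responses, and later scans a suspect model's outputs or training dump for markers it knows it planted:

```python
import hmac, hashlib

SECRET_KEY = b"hypothetical-server-side-key"  # assumption: kept secret by the provider

def make_canary(session_id: str) -> str:
    # Derive a unique, hard-to-guess marker per API session, so a later match
    # shows not just *that* copying happened but *which* session's output leaked.
    tag = hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()[:12]
    return f"zqv-{tag}"  # nonsense token unlikely to occur naturally

def embed_canary(response: str, session_id: str) -> str:
    # Splice the marker into otherwise-normal output. A real system would hide
    # it far more subtly (phrasing choices, punctuation patterns, etc.).
    return f"{response} [{make_canary(session_id)}]"

def find_canaries(corpus: str, session_ids: list[str]) -> list[str]:
    # Scan a suspect corpus (scraped training data, or a model's generations)
    # for markers the provider knows it planted.
    return [sid for sid in session_ids if make_canary(sid) in corpus]

# Toy demo: one tagged response leaks into a "training corpus".
leaked = embed_canary("The capital of France is Paris.", "session-42")
corpus = "Some scraped text. " + leaked + " More text."
print(find_canaries(corpus, ["session-41", "session-42", "session-43"]))
# → ['session-42']
```

Like a trap street on a map, the marker carries no real information, so its presence in someone else's model is hard to explain away as coincidence. The weak point is the same as for trap streets: a copier who knows the scheme can filter the markers out.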

Also, it seems a bit like how cryptocurrency folks assumed their transactions were anonymous. It's an API, so they could log the calls (maybe not the contents).



