This is big because it sets a precedent for broader usage of these "copy and make similar" ML models. For example, I believe it'll impact the future of AI image generation.


I have a crazy theory Microsoft may have been seeking this outcome.

Microsoft isn't stupid, and Copilot is so obviously legally dubious that it strains credulity that they launched it without stronger legal justification or internal debate. Microsoft employees basically dismiss any suggestion that it isn't legal as nonsense, even though they have no real precedent to back that position.

Is it plausible they launched a product almost certain to face a legal challenge precisely in order to establish precedent for using training data? The company is clearly invested in this space, and it likely needs that legal clarity for future projects that would be harder or costlier to build.



