It's similar to saying that any digital representation of an image isn't an image, just a dataset that represents it.
If what you said were any sort of defense, copyright would never apply to any digital image, because images can be saved at different resolutions, in different file formats, or re-encoded. E.g. if a JPEG 'image' were only an image at one exact set of bits, I could save it again with a different quality setting and end up with a different set of bits.
But everyone still recognises when an image looks the same, and courts will uphold copyright claims regardless of how the image is digitally encoded. So good luck with the spurious argument that it's not copyright infringement because 'it's on the internet' (or 'it's done with AI', etc.).
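To make the re-encoding point concrete, here's a minimal sketch (assuming Pillow is installed; "photo.jpg" is a placeholder for any local image): saving the same picture at two quality settings produces different bytes, yet both files show what anyone would call the same image.

```python
# Hypothetical illustration: same image, different bits after re-encoding.
from io import BytesIO
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")  # placeholder input file

buf_hi, buf_lo = BytesIO(), BytesIO()
img.save(buf_hi, format="JPEG", quality=95)  # high-quality encode
img.save(buf_lo, format="JPEG", quality=60)  # lower-quality encode

# The byte streams differ, even though a viewer sees "the same image".
print(buf_hi.getvalue() == buf_lo.getvalue())            # False
print(len(buf_hi.getvalue()), len(buf_lo.getvalue()))    # different sizes
```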
I don't understand what's supposed to be nonsense here: how it works? Your response seems to be aimed at something entirely different.
But anyway, the way I see Stable Diffusion being different is that it's a tool for generating all sorts of images, including copyrighted ones.
It's more like a database of *how to* generate images than a database *of* images. Maybe there isn't that much of a difference when it comes to copyright law. If you ask an artist to draw a copyrighted image for you, who should be in trouble? I'd say the person asking, most of the time, but in this case we argue it's the people behind the pencil, or whatever. Why? Because it's too easy? Where does a service like Fiverr stand here?
So if a tool is able to generate something that looks indistinguishable from some copyrighted artwork, is it infringing on copyright? I can get on board with yes if it was trained on that copyrighted artwork, but otherwise I'm not so sure.
A tool can't be held accountable and can't infringe on copyright, or any other law for that matter. It's more of a product. It seems to me like a gray area that's just going to have to be decided in court. As in: did the company selling a tool that can very easily be used to do illegal things take reasonable measures to prevent it from being accidentally used that way? In the case of Copilot, I don't believe so, because there aren't even adequate warnings telling the end user that it can produce code which can only legally be used in software that meets the terms of the original license.
The issue is not what it produces. Copilot, I'm sure, has safeguards against outputting copyrighted code verbatim (they even mention they have tests for this), so it will change the code enough to be legally safe.
The issue is how it creates that output. Both DALL-E and Copilot can only work by taking the work people did in the past, sucking up their hard-earned know-how and creations, and remixing them, all while crediting (or paying) no one. The software itself might be great, but it only works because it was fed loads of quality material.
It's smart copy & paste with obfuscation. If that's OK legally, you can imagine it soon being used to rewrite whole codebases while sidestepping any copyright. All the code would technically be different, but also the same.
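To illustrate what "technically different, but also the same" might look like, here's a hypothetical sketch: two functions that share no identifiers or structure, so a character-level diff sees unrelated code, yet they behave identically.

```python
# Hypothetical example: "rewritten" code that is textually distinct
# from the original but functionally equivalent.

def sum_of_squares(values):
    # original-style implementation
    total = 0
    for v in values:
        total += v * v
    return total

def f(xs):
    # "rewritten" version: different names, different structure, same behavior
    return sum(x ** 2 for x in xs)

print(sum_of_squares([1, 2, 3]) == f([1, 2, 3]))  # True: same result
```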
The DMCA disagrees. Specific methods of "circumvention", which inevitably take the form of software tools, are prohibited. Tools and their authors can be held accountable.