To be safe, we'd have to get Microsoft to agree to indemnify users (if they really believe using this is safe, they should do so), or wait until a court case on copyright as it relates to training corpora for large models is decided and appeals are exhausted.