I think we should hold models used for non-academic purposes to a higher standard, and there should be oversight.
I don't know whether all the language in this bill does what we need, but I'm against letting large corporations like Meta or X live-test whatever they want on their end users.
Calling out that derivative models are exempt sounds good; only new training sets would be subject to this. I also think there should be an academic limited-duty exemption: models that can't be commercialized likely don't need the rigor of this law.
I guess I don't agree with affuture.org, then, and think we need legislation like this in place.
It sounds like you do agree with affuture.org, though. The proposed draft does not hold models used for non-academic purposes to a higher standard, and "models that can't be commercialized" are covered by it. It will be far harder for academics to work on large models under this draft.
> It sounds like you do agree with affuture.org though.
From the post.
> We need your help to stop this now.
I do not want to stop this bill; I want it revised. If that is not understood, then I suspect my primary goal of safer, less biased models is at odds with your primary goal of unregulated innovation.
At least, that remains true as long as we discuss this as a binary, where agreeing with the affuture.org post means killing this legislation (as the post asks) rather than revising it.