To your average user who interfaces with these figurative black boxes via a black box in their hand, how is this particular black box any different from the other black boxes that this user hands their data to every second of every day?
There are plenty of disallowed 'black boxes' within the federal sphere; ChatGPT is just one more.
To take a stab at your question, though: my cell phone doesn't learn and improve by absorbing my telecommunications; at worst it's used as a means for The Powers That Be to spy on my personal life. Its primary purpose is the conveyance of telecommunications.
ChatGPT, in its current state, hoards data for training and self-improvement. Its whole modus operandi involves the capture of data, rather than capturing it only tangentially. It could not meaningfully exist without training on something, and at this stage of the game the trend is to train on user data.
Until that trend changes, people should probably be a bit more suspicious about what kind of stuff gets thrown into the training bin.
Those typically have MSAs full of legalese in which the parties stipulate what they will and will not do, often including whether the service is zero-knowledge, and often with an option to use your own encryption keys for your instance.
If people are using the free version of ChatGPT, then it's unlikely there is a contract between the companies; more likely there are just terms of use applied by ChatGPT and ignored by the users.