It does. Most privacy laws are based on time from discovery. If they sprang into action the moment they were informed and remediated the issue, they're in compliance.
Right, that's the problem. There need to be standards that govern what can ever be released to customers/the public in the first place. When violations of those are discovered, the penalties should be based on time from release, so the longer it was out in the wild, the greater the penalty.
But you can't remove something from the internet once it's there, so once data is released, you have to assume it always will be out there.
It's also impossible to guarantee a 100% secure infrastructure, no matter how good your product team is.
In that grey area lies a term of art: "best efforts."
If data leaks, and not because hackers bypassed a bunch of safeguards, and it can be shown that you didn't use best efforts to secure that data, there is liability.
A charitable way of interpreting "best effort" is that it's similar to what I said: we need standards. But the problems with our notion of "best effort" are:
1. The standards aren't clearly defined (i.e., "you must specifically do this").
2. They are defined in terms of efforts rather than effects. It's like saying "every car sold must be made of steel" rather than "every car sold must withstand an impact against a concrete wall at 60mph with no more than X amount of deformation." We want the rules to specify what level of threat must be protected against, not just what motions the company went through. In the case in the article, it wasn't that hackers bypassed a bunch of safeguards; the company didn't protect against even basic threats.
3. It's not enough to have "liability." That puts the onus on individuals to sue the company for their specific damages. We need criminal penalties designed to punish companies (and the individuals who direct them) for the harm they do to society by rushing ahead and selling things instead of slowing down and being careful. We need large-scale enforcement so that companies actually stop doing these things, because the cost of doing them becomes too enormous.
4. Our laws do not adequately take account of the differential power of those who cut corners, and the differential gains reaped. We frequently find small operators on the wrong end of painful lawsuits and onerous criminal penalties, while the biggest companies and wealthiest individuals use their position to avoid consequences. Laws need to explicitly take this into account, lowering the standard of proof for penalties against larger, wealthier, and more powerful companies and individuals, and also making those penalties exponentially higher.
So is that still true if they find out at the same time the public does? It seems that disclosing it privately has some upside (protecting the users) and no downside.