I get the idea of publicly disclosing security issues to large, well-funded companies that need to be incentivized to fix them. But I think open source has a good argument that, in terms of the risk/reward tradeoff, publicly disclosing these for small, resource-constrained open source projects probably creates a lot more risk than reward.
In addition to your point, it seems obvious that disclosure policy for FOSS should be “when patch available” and not static X days. The security issue should certainly be disclosed, but only when it's responsible to do so.
Now, if Google or whoever really feels that fixing it fast is so important, then they could very well contribute by submitting a patch along with their issue report.
> ...then they could very well contribute by submitting a patch along with their issue report.
I don't want to discourage anyone from submitting patches, but that does not necessarily remove all (or even the bulk of) the work from the maintainers. As someone who has received numerous patches to multimedia libraries from security researchers, I can say they still need review, they often have to be rewritten, and most importantly, the issue must be understood by someone with the appropriate domain knowledge and context to know whether the patch merely papers over the symptoms or resolves the underlying issue, whether the solution breaks anything else, and whether there might be more similar issues lurking. It is hard for someone not deeply involved in the project to do all of those things.
> it seems obvious that disclosure policy for FOSS should be “when patch available” and not static X days
This is very far from obvious. If Google doesn't feel like prioritising a critical issue, it remains irresponsible not to warn other users of the same library.
If that’s the case, why give the OSS project any time to fix it at all before public disclosure? They should just publish immediately, no? Warn other users ASAP.
Why do you think it has to be all or nothing? They are both reasonable concerns. That's why reasonable disclosure windows are usually short but not zero.
Because it gives maintainers a chance to fix the issue, which they’ll do if they feel it is a priority. Google does not decide your priorities for you, they just give you an option to make their report a priority if you so choose.
Timed disclosure is just a compromise between giving the project time and serving the public interest. People have been doing this for years now. Why are people acting like this is new just because ffmpeg is whining?
And occasionally you do see immediate disclosures (see below). This usually happens for vulnerabilities that are time-sensitive or actively being exploited, where the user needs to know ASAP. It's very context-dependent. I don't think that applies here, so there's a standard delayed disclosure as a courtesy, giving the project time to fix it first.
Note the word "courtesy". The public interest always overrides considerations for the project's fragile ego after some time.
(Some examples of shortened disclosures include Cloudbleed and the aCropalypse cropping bug, where in each case there were immediate reasons to notify the public / users)
Full (immediate) disclosure, where no time is given to anyone to do anything before the vulnerability is publicly disclosed, was historically the default, yes. Coordinated vulnerability disclosure (or "responsible disclosure" as many call it) only exists because the security researchers that practice it believe it is a more effective way of minimizing how much the vulnerability might be exploited before it is fixed.
Unless the maintainers are incompetent or uncooperative, this does not feel like a good strategy. It is a good strategy on Google's side because it is easier for them to manage.
> In addition to your point, it seems obvious that disclosure policy for FOSS should be “when patch available” and not static X days.
So when the xz backdoor was discovered, you think it would have been better to sit on that quietly and try to both wrest control of upstream away from the upstream maintainers and wait until all the downstream projects had reverted the changes in their copies before making it public? Personally, I'm glad that went public early. Yes, there is a tradeoff between the speed of public disclosure and the publicity a vulnerability gets before a fix exists, but ultimately a vulnerability is a vulnerability, and people are better off knowing there's a problem than hoping that only the good guys know about it. If a Debian bug starts tee-ing all my network traffic to the CCP and the NSA, I'd rather know about it before a patch is available; at least that way I can decide to shut down my Debian boxes.
The XZ backdoor is not a bug but a malicious payload inserted by malicious actors. The security vulnerability would have been used immediately, as it was created by attackers.
This bug is almost certainly too obscure to be found and exploited in the time it takes FFmpeg to produce a fix. On the other hand, this vuln being public so soon means any attacker is now free to develop their exploit before a fix is available.
If Google's goal is security, this vulnerability should only be disclosed after it's fixed or after a reasonable time (which, according to the ffmpeg devs, 90 days is not, because they receive too many reports from Google).
A bug is a bug, regardless of the intent of the insertion. You have no idea if this bug was or wasn't intentionally inserted. It's of course very likely that it wasn't, but you don't and can't know that, especially given that malicious bug insertion is going to be designed to look innocent and have plausible deniability. Likewise, you don't know that the use of the XZ backdoor was imminent. For all you know the intent was to let it sit for a release or two, maybe with an eye towards waiting for it to appear in a particular downstream target, or just to make it harder to identify the source. Yes, just like it is unlikely that the ffmpeg bug was intentional, it's also unlikely the xz backdoor was intended to be a sleeper vulnerability.
But ultimately that's my point. You as an individual do not know who else has access to or information about the bug/vulnerability you have found, nor do you have any insight into how quickly they intend to exploit it if they do know about it. So the right thing to do when you find a vulnerability is to make it public so that people can begin mitigating it. Private disclosure periods exist because they recognize that there is an inherent tradeoff and asymmetry between making the information public and having effective remediations available. So the disclosure period attempts to strike a balance, accepting the risk that the bug is already known and being actively exploited in exchange for closing the gap between public knowledge and remediation. But inherently it is a risk that the bug reporter and the project maintainers are forcing on other people, which is why the end goal must ALWAYS be public disclosure sooner rather than later.
A 25-year-old bug in software is not the same as a backdoor (not a bug, a full-on backdoor). The bug is so old that if someone put it there intentionally, well, congrats on the 25-year-old 0day.
Meanwhile the XZ backdoor was 100% meant to be used. I didn't say when, and that doesn't matter: there is a malicious actor with the knowledge to exploit it. We can't say the same about a bug in a 1998 codec that was found by extensive fuzzing and has no obvious exploitation path.
Now, should it be patched? Absolutely, but should the patch be done asap at the cost of other maybe more important security patches? Maybe, maybe not. Not all bugs are security vulns, and not all security vulns are exploitable
> Absolutely, but should the patch be done asap at the cost of other maybe more important security patches? Maybe, maybe not. Not all bugs are security vulns, and not all security vulns are exploitable
I fully agree, which is why I really don’t understand why everyone is all up in arms here. Google didn’t demand that this bug get fixed immediately. They didn’t demand that everything be dropped to fix a 25-year-old bug. They filed a (very good and detailed) bug report to an open source product. They gave a private report out of courtesy and in acknowledgment of the tradeoffs inherent in public bug disclosure, but ultimately a bug is a bug, and it’s already public because the source code is public. If the ffmpeg devs didn’t feel it was important to fix right away, nothing about filing a bug report, privately or publicly changes any of that.
In the end, a report saying "fix this within 90 days or this gets public" for small-ish bugs like this is a kind of demand. Do this or this gets out and you'll have to make an express release to fix it anyway.
I can understand that stance for serious bugs and security vulnerabilities. I can understand such deadlines as a way to put pressure on a company with a big market cap. But these delays are exactly like a demand put on the company: fix it ASAP or it gets public. We wouldn't have to do this if companies in general didn't need to be publicly pressured into fixing their stuff. Making it public has two objectives: warn users that they may be at risk, and force the publisher to produce a fix ASAP or else risk a reputation hit.
> If the ffmpeg devs didn’t feel it was important to fix right away, nothing about filing a bug report, privately or publicly changes any of that.
It does change things: how they report matters. Had they given more time or staggered their reports over time, FFmpeg wouldn't have felt pressure to publish fixes ASAP.
Even if the devs could simply say they won't fix it, any public project wants to maintain a certain level of quality and not let security vulnerabilities sit unfixed in public.
In the end, had these reports been made by random security researchers, no drama would have happened. But if I see Google digging up 25-year-old bugs, is it that much to expect them to provide a patch along with the report?
But this isn't a "small-ish bug". What gave you that impression? It's a vulnerability in code that is both compiled in by default and reachable when ffmpeg is run with its default settings on a file crafted to trigger the bug.
And if you believe this is a "small-ish" bug just because of the ffmpeg Twitter account's gaslighting about "20 frames of a single video in Rebel Assault", then surely it being disclosed would be irrelevant? The only way the disclosure timeline makes a difference is if ffmpeg, too, thinks the bug is serious.
> In the end, a report saying "fix this within 90 days or this gets public" for small-ish bugs like this is a kind of demand. Do this or this gets out and you'll have to make an express release to fix it anyway.
I think this is where the disconnect is. To my mind there is no "do this or else" message here, because there is no "or else". The report is a courtesy advance notice of a bug report that WILL be filed, no matter what the ffmpeg developers do. It's not like this is some awful secret that Google is promising not to disclose if ffmpeg jumps to their tune.
Further, the reality is most bug reports are never going to be given a 90-day window. Their site requests that if you find a security vulnerability you email their security team, but it doesn't tell you not to also file a bug report, and their bug report page doesn't tell you not to file anything you think might be a security or vulnerability bug to the tracker. And a search through the bug tracker shows more than a few open issues (sometimes years old) reporting segfault crashes, memory leaks, uninitialized variable access, heap corruption, divide-by-zero crashes, buffer overflows, null pointer dereferences, and other such potential safety issues. It seems the ffmpeg team generally has no problem with having a backlog of these issues, so surely one more in a (as we've been repeatedly reminded) 25-year-old obscure codec parser is hardly going to tank their reputation, right?
> In the end, had these reports been made by random security researchers, no drama would have happened.
And now we get to what is really the heart of the matter. If anyone else had reported this bug in this way, no one would care. It's not that Google did anything wrong, it's that Google has money, so everyone is mad that they didn't do even more than they already do. And frankly that attitude stinks. It's hard enough getting corporations to actually contribute back to open source projects, especially when the license doesn't obligate them to at all. I'm not advocating holding corporations to some lesser standard: if the complaint were that Google was shoving unvalidated, un-validatable, low-effort reports into the bug tracker, or that they actually were harassing the ffmpeg developers with constant follow-ups on their tickets and demands for status updates, then that would be poor behavior that we would be equally upset about if it came from anyone. But like you said, any other security researcher behaving the same way would be just fine. Shitting on Google this way for behaving according to the same standards outlined on ffmpeg's own website, because of who they are and not what they've done, just tells other corporations that it doesn't matter if you contribute code and money in addition to bug reports: if you don't live up to someone's arbitrary standard based on WHO you are rather than WHAT you do, you'll get shit on for it. And that's not going to encourage more cooperation and contributions from the corporations that benefit from these projects.
> publicly disclosing these for small, resource-constrained open source projects probably creates a lot more risk than reward.
You can never be sure that you're the only one in the world that has discovered or will discover a vulnerability, especially if the vulnerability can be found by an LLM. If you keep a vulnerability a secret, then you're leaving open a known opportunity for criminals and spying governments to find a zero day, maybe even a decade from now.
For this one in particular: AFAIK, since the codec is enabled by default, anyone who processes a maliciously crafted .mp4 file with ffmpeg is vulnerable. Being an open-source project, ffmpeg has no obligation to provide me secure software or to patch known vulnerabilities. But publicly disclosing those vulnerabilities means that I can take steps to protect myself (such as disabling this obscure niche codec that I'm literally never going to use), without any pressure on ffmpeg to do any work at all. The fact that ffmpeg commits themselves to fixing known vulnerabilities is commendable, and I appreciate them for that, but they're the ones volunteering to do that -- they don't owe it to anyone. Open-source maintainers always have the right to ignore a bug report; it's not an obligation to do work unless they make it one.
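To make that concrete, here's a minimal sketch of the kind of step I mean (the allowlist and the is_safe_to_decode helper are made up for illustration, not anything ffmpeg ships): probe an untrusted file with ffprobe and refuse to decode anything whose streams use a codec I don't actually expect.

    # Hypothetical pre-screening step: only hand files to ffmpeg if every
    # stream uses a codec on an allowlist, so obscure decoders compiled in
    # by default are never reached for untrusted input.
    import json
    import subprocess

    ALLOWED_CODECS = {"h264", "hevc", "aac", "opus"}  # whatever you actually expect

    def is_safe_to_decode(path: str) -> bool:
        probe = subprocess.run(
            ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", path],
            capture_output=True, text=True,
        )
        if probe.returncode != 0:
            return False  # refuse anything ffprobe can't parse cleanly
        streams = json.loads(probe.stdout).get("streams", [])
        return bool(streams) and all(
            s.get("codec_name") in ALLOWED_CODECS for s in streams
        )

It's not a perfect mitigation (ffprobe itself still runs ffmpeg's demuxers), but it's exactly the kind of step I can only take once the vulnerability is public.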
Vulnerability research is itself a form of contribution to open source -- a highly specialized and much more expensive form of contribution than contributing code. FFmpeg has a point that companies should be better about funding and contributing to open-source projects they rely on, but telling security researchers that their highly valuable contribution is not welcome because it's not enough is absurd, and is itself an example of making ridiculous demands for free work from a volunteer in the open-source community. It sends the message that white-hat security research is not welcome, which deters researchers from ethically finding and disclosing vulnerabilities in the future.
As an FFmpeg user, I am better off in a world where Google disclosed this vulnerability -- regardless of whether they, FFmpeg, or anyone else wrote a patch -- because a vulnerability I know about is less dangerous than one I don't know about.
> publicly disclosing these for small, resource-constrained open source projects probably creates a lot more risk than reward.
Not publicly disclosing it also carries risk. Library users get the wrong impression that the library has no known vulnerabilities, while numerous reported bugs never surface publicly because of such a FOSS-friendly policy.
But if open source is reliant on public contributors to fix things, then the bug should be open so anyone can take a stab at fixing it, rather than relying on a closed group of maintainers.
You are missing the tiny little fact that apparently a large portion of infosec people are of the opinion that insecure software must not exist. At any cost. No shades of gray.