The problem is that some small but non-zero fraction of these bugs may be exploitable security flaws in the software, and the bug reports are on the open internet. So if the maintainers just ignore them all, they risk overlooking a genuine vulnerability that a bad actor can then more easily find and use. And then the FOSS project gets the blame: the bug report was right there, they should have fixed it!
“Responsible disclosure” is a process in which an organization is given time to fix its code and deploy a patch before the vulnerability is made public. Failing to fix the issue in a reasonable time, especially within a timeline your org has publicly agreed to, causes reputational harm and is thus an incentive to write code that is free of vulns and to remediate any that are identified.
This breaks down when the “organization” in question is just a few people with some free time who made something so fundamentally awesome that the world depends on it, and who have never been compensated for their incredible contributions to everyone.
“Responsible disclosure” needs a bit of a redesign when the “org” is a volunteer effort rather than a profit-making company. There’s no real reputational harm to ffmpeg, since most users don’t even know they use it; still, the broader community recognizes the risk, and the maintainers feel obligated to fix issues. Meanwhile, a publicly disclosed vulnerability puts tons of innocent users at risk.
I don’t dislike AI-based code analysis. It can theoretically prevent zero-days by finding issues before someone malicious does. But running AI tools against that xkcd tiny load-bearing block and expecting the maintainers to fit a billion-dollar company’s timeline is unreasonable. Google et al. should keep risks and vulnerabilities private when disclosing them to FOSS maintainers, instead of holding them to the same standard as a corporation by posting issues to a public git repo.
An RCE or similar critical issue in ffmpeg would be a real problem with widespread impact, given how broadly it is used. That suggests it should be broadly supported. The social contract of the LGPL, the GPL, and FOSS in general is that code is released “as is, with no warranty.” Want to fix a problem? Go for it! Only calling out problems just makes you a dick: Google, Amazon, Microsoft, hundreds of others.
As many have already stated: if a grossly profitable business depends on a “tiny” piece of code it isn’t paying for, it has two options: pay for the code (fund its maintenance) or make its own. I’d also support a few headlines like “New Google Chrome vulnerability will let hackers steal your children and house!” or “Watching this YouTube video will set your computer on fire!”