Edit: just to be clear-- I'm claiming there's no way the addition of parentheses could change the behavior of the program.
I guess I can understand resolving an ambiguity so that the reader doesn't have to look up precedence rules. But when combined with a high-profile bug in a critical piece of infrastructure it looks a bit like the software equivalent of not shaving one's beard in the hopes of winning a sports tournament.
Wasn't there some subtle difference between using sizeof with vs. without parens? Something like: if you pass a type to sizeof -- e.g. sizeof(int *) -- you must use parens anyway? It's been years since I've used C, I'm probably not remembering it correctly.
I don't think it is fair to blame the language for this.
Look at the lines above the vulnerability:
n = n + i / (written_out + 1);
i %= (written_out + 1);
Whatever buffer manipulation this is doing could be abstracted out into a library. (As stdio.h does.)
Also, since written_out probably varies from loop iteration to iteration, what does "i modulo (written_out + 1)" even mean?
Also, why don't they increment written_out before doing the arithmetic, and why do they increment i on line 523, when the only intermediate use of it is "i + 1"?
I've written some crappy code in my time, but when things descended into that level of insanity, I either delegated to fuzz tested functions, or at least left a comment.
Back in the day, you didn’t have alternatives with zero-cost abstractions like we have today. There are many valid reasons to balance performance against memory safety, especially 25 years ago, given that networking code tends to sit on hot paths where optimization matters.
The function in question was written in 2020. OpenSSL 3 is an incompatible major rewrite undertaken during 2018-2021, and there is no valid excuse for why the choice of language was not revisited at that time.
come on, that is quite harsh. c is easily my favorite programming language and it just works really damn well and produces very small binaries. it's easy to write terrible code in any programming language.
That's not really relevant to the discussion of whether it's the proper language to be used for something like OpenSSL.
> it just works really damn well
As far as I can tell, all mainstream programming languages "work well", so it's not clear what you mean.
> produces very small binaries
True but not very important IMO
> it's easy to write terrible code in any programming language
Sure. But it's not easy to write this particular flavor of terrible code -- buffer overflows -- in any commonly-used language other than C and its derivatives.
via https://www.twitch.tv/filosottile