> It isn't like folks haven't tried to make better languages and toolchains throughout the years. You even acknowledge this. They just haven't delivered on their promises.
The question also has to consider why they failed, and what the criteria for failure even are. That may be entirely the point: the languages may be better by some metrics and worse by others, but they can't easily beat the inertia of something that's been entrenched for 4+ decades. Everyone is very aware of the inertia factor; it seems to permeate all levels of technical decisions in all kinds of fields, for good and bad reasons. If the failure criterion is "not used as widely as C", then pretty much everything has failed, so it's a pretty bad criterion, I think.
Ada has been around for a while now, and has seen a fair amount of high-assurance industrial use. It's really unclear whether you can characterize Ada as a total failure, for example: it lives on, although it's certainly less popular. And it was designed with that field in mind from day 1. I think a lot of this has to do with things like institutional knowledge and inertia, and the vast amounts of time and money sunk into tooling for systems like C.
That's not bad; the money and tooling obviously helped. But it makes the narrative that all these other things are strictly worse, or were never attempted, a bit muddier. Does the "empirical" lack of better replacements imply that better replacements cannot exist, or never existed? I think Ada alone negates the core of that argument: alternatives have certainly been tried and have seen some level of success. Or perhaps we simply didn't try hard enough, for lack of resources? Or maybe it suggests that other contributing factors, externalities, gave rise to such a "monopoly" of computing intellect?
Also, it's easy to forget, but today general-purpose languages come with so many demanding expectations (package managers, libraries, editor support, and so on) that actually supporting and promoting one to the point of wide usage is ridiculously time- and money-consuming. Programming languages are a commodity, for the most part. Existing languages almost all either struggle to exist, have their primary developers financed by some corporation, or have been around long enough that they've become effectively immortal and won't simply drift into the void.
Yet there's almost no money in it as a field. It's insanely hard to make a business out of things like compiler tech or programming languages these days, outside of specialized fields and clients. Unless you have deep pockets and a willingness to throw money into the void for a while, bootstrapping a "competitive" programming language is going to be insanely hard on any kind of reasonable timeframe. Rust is a good example: Mozilla financed a vast amount of the development for a long time, and sustained it even in the beginning stages when it was a much riskier investment. That's likely one of the major reasons it's even remotely as popular as it is today: they ate the cost of catching up with modern expectations.
All of these compounding factors are not going away. I agree there are few viable alternatives, but I don't agree it's so easy to explain it all away "empirically" with an analysis as simple as yours: that everything was either never tried or just "failed" (without defining what that even means).
> And, some of that is because you are ignoring all of the advances that have happened in C. If you are not using modern tools like valgrind/coverity/whatever, you are not comparing C fairly to contemporary languages.
The fact that you suggest a 15+ year old, insanely complicated, proprietary static analysis tool as the thing that makes your code basically fit to exist on the modern internet (because otherwise it will just be a nightmare hellhole of a security problem for someone down the road), a tool that still fails to catch the high-profile vulnerabilities that keep occurring and still requires constant evolution -- all so that C can be judged "fairly" against modern languages... it's a bit alarming when you think about it.
Well, I'm being slightly tongue-in-cheek. I mean, every time I write C, I lean heavily on these tools as insurance that I'm doing things correctly. I even tend to use CompCert to find undefined behavior! Coverity is really good, no doubt.
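For concreteness, here's a minimal, made-up sketch of the kind of bug I'm relying on those tools to catch; the file name and code are invented for illustration, not taken from any real project:

```c
/* demo.c -- hypothetical example: two classic pieces of undefined
 * behavior that the compiler will happily accept. */
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *buf = malloc(8);
    if (!buf)
        return 1;
    strcpy(buf, "12345678");   /* copies 9 bytes (8 chars + '\0') into an 8-byte buffer */
    free(buf);
    return buf[0];             /* reads memory that has already been freed */
}
```

Running the compiled binary under Valgrind's memcheck (plain `valgrind ./demo`) reports the invalid write and the invalid read; a Coverity-style analyzer will typically flag both statically; and, if I remember the flag right, CompCert's reference interpreter (`ccomp -interp demo.c`) gets stuck at the first undefined operation. None of that protection comes from the language itself, which is exactly what gives me pause below.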
But when I do this, it definitely gives me pause at the investment I have to make. I don't sit around thinking "Yes, this is basically a great solution to the problem". It would almost be, like, I dunno... calling the story of the Titanic a success? Lifeboats hardly seem like the ultimate risk insurance when you're the one crashing into ice, especially when it seems we never have enough of them.
I'm not sure where this conversation is going. :) I think I resonate with, and largely agree with, the upper part of your comment. At least, I find myself nodding my head a lot.
As for the criticism about the age of the static analysis tool: that criticism doesn't make sense. I could make the same criticism of a 26ish+ year old, insanely complicated language toolchain (Haskell), and that's if I just pick the popular one. I could go with your example, Ada, for a 36+ year old option.
If the criticism is simply about its proprietary nature, that cuts against much of your earlier argument: the lack of money in the field is a large part of the problem.