Most CPU architectures are "fake diversity"; for example, both Alpha and ARM64 are 64-bit, little-endian, and weakly ordered. Sure, S/390 is 31-bit and supports BCD, PA-RISC's stack grows upwards, and IA-64 is VLIW, but these are trivia, not comparable to the diversity of human cultures. For decades programmers have wasted their time porting software to different-but-not-better architectures, mostly for the benefit of the vendors who fragmented the market in the first place instead of standardizing.
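To be concrete about how shallow those differences usually are: byte order, the classic example, amounts to a one-line shim in portable code. A minimal sketch of my own (Rust, not tied to any particular project):

```rust
// Reading a big-endian wire value works identically on x86-64, ARM64,
// s390x, and the rest; the standard library hides the host's byte order.
fn read_u32_be(buf: [u8; 4]) -> u32 {
    u32::from_be_bytes(buf)
}

fn main() {
    let wire = [0x00, 0x00, 0x01, 0x2c]; // 300, big-endian on the wire
    assert_eq!(read_u32_be(wire), 300);

    // The only visible per-architecture difference is the host's native order.
    println!("little-endian host: {}", cfg!(target_endian = "little"));
    println!("300 in native bytes: {:02x?}", 300u32.to_ne_bytes());
}
```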
> For decades programmers have wasted their time porting software to different-but-not-better architectures
Microarchitecture, register layout, and ABI also constitute differences with real-world uses, not to mention sheer competition to avoid architecture rot. Targeting only Intel and ARM opens you up to problems; cf. the Nvidia hell ARM is about to experience.
Just because Rust-preaching (without practicing it, of course), Starbucks-sipping average HN readers don't know about it doesn't make it nonexistent or necessarily wrongthink.
> Microarchitecture, register layout, and ABI also constitute differences with real-world uses
They do. I started coding on a 6502-based machine and used machine code to find prime numbers, some 27 years ago. I used one or two other non-mainstream CPUs (whose names I don't even remember) before diving neck-deep into the mainstream. It was fun, absolutely. It has potential, absolutely. Was it realized? Nope.
However, I can't resist asking: if those things do have their uses, then why didn't the hobbyists support them through patches to GCC and Clang/LLVM?
Don't get me wrong. If you tell me we are stuck in a local maximum in CPU architectures, I'll immediately agree with you! But what would you have the entire industry do, exactly? Businesses pay our salaries and they need results in reasonable timeframes. Can you tell the guy who is paying you, "I need 5 years to integrate this old CPU arch with LLVM so we can have this feature you wanted last month", with a straight face?
> Just because Rust-preaching (without practicing it, of course), Starbucks-sipping average HN readers don't know about it doesn't make it nonexistent or necessarily wrongthink.
That is just being obnoxious and not arguing in good faith. Example: I do use Rust, although not 100% of my work time.
You should try the Rust language and tooling -- and I mean work actively with it for a year -- and then you could have an informed opinion. It would make for a more interesting discussion.
Do I like how verbose Rust can be? No, it's irritating.
Do I like how cryptic it can look? No, and I waste time mentally parsing it (but it does get better with time, so 50/50 here).
Does it get stuff done mega-quickly and more safely than C (and most C++)? Yes.
Does it have amazing tooling? Yes.
Does it get developed more and more and serve many needs? Yes.
Does it reduce security incidents? I'd argue yes, although I have no direct experience to point to. Memory safety is definitely one of the biggest elephants in the room where security is concerned (see the sketch below for the kind of bug it rules out).
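To make that concrete, here's a minimal sketch of my own (not from the article) of the bug class Rust's bounds checking removes: an out-of-range read becomes a recoverable None or a clean panic, never the silent corruption you can get from an unchecked C array read.

```rust
fn main() {
    let buf = vec![10u8, 20, 30];

    // Checked access: an out-of-range index is simply None, handled explicitly.
    assert_eq!(buf.get(2), Some(&30));
    assert_eq!(buf.get(99), None);

    // Plain indexing is bounds-checked too: the line below, if uncommented,
    // panics immediately instead of reading past the allocation.
    // let _oops = buf[99];
}
```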
---
You have a very wrong idea about the average Rust user, IMO. I don't like parts of the whole thing, but it has helped me a lot several times already -- and it gave me peace of mind. And I've witnessed people migrating legacy systems to it and showing graphs in meetings demonstrating that alarm and error-analytics percentages plunged to 0.2% - 2% (when they were always 7% - 15% before).
Just resisting something because it's going mainstream is a teenager-rebellion level of attitude, and it's not productive. Do use Rust yourself a bit. Then you can say "I dislike Rust because of $REASON" and we can have a much more interesting discussion.
> However, I can't resist asking: if those things do have their uses, then why didn't the hobbyists support them through patches to GCC and Clang/LLVM?
They didn't decide to create a whole new language and make everything dependent on it.
At some point, when you reach a critical mass, you have to spend more on seemingly "irrelevant" tasks, like supporting other architectures. Don't shift the problem away by ridiculing it; own your shortcomings.
Okay, that's a more fair and balanced point of view.
However, let's not forget one of the main points of the original article: nobody promised those people that their dependency's dependencies would never change. The crypto authors made a decision to go with Rust. If dependents want to continue using it, they have to adapt or stop using it.
As I've said above: backwards compatibility is an admirable goal but it doesn't override everything.
> As I've said above: backwards compatibility is an admirable goal but it doesn't override everything.
You'll never get a job at Microsoft, or in any systems job where backward compatibility is paramount for millions, if not billions, of users (say, the Linux kernel). Just going the Apple "fuck you" route is arrogant at best, delusional at worst, especially when you're an irrelevant language.
Sorry that your work has made you so frustrated. It sounds stressful. IMO you should consider exiting your current company or area. Judging by your comments, you are pretty jaded (and set in your ways).
I am not interested in discussing extremes, as I've mentioned in two separate sub-threads now, but you do sound like you need a break. Good luck, man.