
What I'd be really interested in is how tech churn is perceived by people older than me. I'm only 30 years in, so I tend to defend my generation's choices such as POSIX, SQL, XML, SOA, Java, and C/C++ before that (plus special-purpose pet peeves of mine such as markup/SGML and logic programming, which came earlier), though I'd also claim to be proficient in and generally open towards new tech. I consider most supposedly new tech a rehash of things past rather than real progress: just another lock-in scheme that sucks in a different way than the tech it's supposed to replace. But I'm really uncertain whether I'm just falling victim to generational effects, like, say, the proverbial COBOL programmer being ridiculed by younger devs instinctively advancing their own careers. Still, for me, there was a moment around 2006-2012, when the industry went all-in on "the Cloud" and the consumerisation of IT, where before I had seen canon, progress, and consensus through standardization.

I guess this is something that can't be objectively measured, but I can say that initiatives for standardization have dropped to almost zero compared to the 1990s and 2000s.



> initiatives for standardization have almost dropped to zero compared to the 1990s and 2000s

I think OSS adoption explains this. Standards were all about advocating for common interfaces, even if the implementations were proprietary, and proprietary software was much more of a thing in the '90s and '00s.

Nowadays, common practice is to use open implementations, not just open interfaces. We only see widespread initiatives for standardization when people are trying to cement the long-term survival of their implementation's interface.

See, for example, the Open Container Initiative, put forward by Docker (a company that engineers were worried about betting the farm on, back in 2015).

Contrast that with S3, which has no corresponding commitment to an open interface standard. Its interface has nonetheless been copied by several vendors (DO, Linode), making it a de facto standard, albeit a fragile and autocratic one.

Basically, I'm not sure standardization has evaporated so much as diffused, and OSS is (ironically?) a contributing factor.


Before the triumph of OSS, standardization required multiple parties to come to an agreement and then each produce their own implementation, so implementers had an incentive to fight for their own objectives lest they get backed into a corner of having to do unprofitable work. Lots of bickering about what was and wasn't worth including. Navigating that environment required committees and formal standards documents.

But you know what's a great consolation prize for not getting exactly what you want? Not having to do any work! So when Docker comes along with Docker Containers or Google comes along with Kubernetes, the consolation prize of not having to do any work is a lot larger than the downside of not getting to have any input. Working code is working code, especially if someone reputable has also promised to maintain it for free.


Adding on to your point, the phrase “designed by committee” carries a strong negative connotation for a reason.

People may have some gripes with the implementation of some subsystems, but they are free to fork and submit patches for review.

This opportunity for grievance remediation did not exist in the old world we're describing, and it contributed to standard interfaces being the unholy logical union of each member's opinions and beliefs.

I agree that the benefit of not doing work far outweighs the downsides, for most time horizons (<10 years).

If you’re thinking longer term than that, interface standards might be better for you. But few people are.


I'm glad you added that last paragraph, because I'd gotten the impression you'd welcome our new F/OSS overlords, even though on the facts I agree with your point about open implementations having replaced open standards.

What I mean is that, while people love to argue the nuances of free vs. open F/OSS licenses ad infinitum, the reality is that power has shifted to a very small number of players (FAANG, RedHat/IBM, MS et al.) who've captured F/OSS, without choice, discussion, or evolution (of competing ideas). What you call "design by committee" can alternatively be seen as a defense against unilateralism, given the power dynamics we're heading into.

"Open implementations" means nobody can make a buck and sustain development and innovation. The only gain to be made is through integration and attention economy, creating perverse incentives.

Take Linux: it has been trying to implement an operating system in a monolithic fashion for some 30 years now. The power of Unix is not that it's the most advanced operating system, even by 1970s standards, but that it's a minimal, portable system that can be built from scratch in a year or two.

Linux being GPL hasn't prevented it from being used for a giant spynet (Android), nor for a lock-in scheme ("the Cloud", k8s) that takes the Unix principles of site autonomy and small parts working together ad absurdum. The absurd part: to shield against minor F/OSS version conflicts, we need opaque, almighty container orchestration and an accompanying zoo of tools, all the while doing largely the same things we did 20 years ago on much less capable hardware.

Take so-called web standards: the idea of standardization was originally motivated by digital humanism, i.e. that we do the best we can to come up with idioms and languages for digital communication and its preservation, accepting inclusion over perfection. The reality is that this idea has been usurped by an ad company that has pulled all communication into an analytics-heavy medium more idiosyncratic than ever, leaving a single browser capable of rendering web content in its entirety, with the "standards" (HTML, HTTP) created by that browser's dev team. We didn't need that; we had CompuServe, AOL, and desktop operating systems for that purpose.


Oh yeah, I totally agree that the long term picture here is quite murky at best and frightening at worst.

> "Open implementations" means nobody can make a buck and sustain development and innovation. The only gain to be made is through integration and the attention economy, creating perverse incentives.

I don’t know if I wholly agree with this point.

Especially in the most recent decade, we've seen a lot of the F/OSS developed at attention-economy companies applied to completely disjoint industry sectors, to positive effect.

I work at an insurance company, for instance. The fact that we're using gRPC/protobufs allows our small engineering team to punch way above its headcount. Happy to elaborate more on the point, but I'll leave it here in good faith.

I think our dependence on this technology is a smart choice in the short to medium term. In the long term (10+ years), the worst-case scenario is that we have to maintain a fork or migrate to something more secure. We would have had to do that anyway if we'd built our own RPC framework.

I agree that OSes and web standards are becoming less and less user-serving, and more corporate/attention-economy-serving.

I wouldn’t be shocked to see the web bifurcate eventually, between HTTP/HTML/JS and something simpler served over a simpler protocol.

That being said, I do think there’s some value in having an opinionated author compelling people to conform to their standard. When that leadership is absent, you end up with something like the Bluetooth or USB standard, and everybody suffers.

But at least those standards will exist forever, until they’re superseded by something that’s a superset of them. The same can’t be reliably said about the F/OSS we’ve been talking about.

It's all a series of trade-offs. I'm not too pessimistic. I think we'll eventually end up in a better place, but we will likely stub our toes and bump our heads many times on the way there.


I'm only 15 years in but have thought about this a lot. I agree with you that there is nothing new under the sun; that we just cycle back through old ideas; fashions come and go and come back.

However, the environment changes, so old ideas often go from being "possible" to "they just work", or from "possible but really slow" to "instant".

These environmental changes can make orders of magnitude differences, and cause old ideas to suddenly feel very different in practice.

When I think of programming languages, for example: a lot of the design of the languages you mention (like matched brackets in XML) came from decisions that made a lot of sense when we had monochrome editors. Now, with extremely fast and advanced IDEs, we can take old PL ideas from the '50s and design languages with IDEs in mind that are a lot less cryptic and concise.


Good points. Though if you've picked up computing almost from the ground up (soldering, hardware hacking, low-level programming, of which I admittedly did only a little), there's that experience of sitting in front of an overwhelming, notebook-melting IDE and saying to yourself, "I don't need all those arbitrary abstractions; I've got a pretty good understanding of what I want to achieve, thank you very much", and realizing the cognitive overhead can become a net negative compared to the perceived problems that modern IDEs are attempting to solve. Matter of taste, of course.

As to XML, matched end-element tags (if that's what you mean) actually were a simplification compared to SGML, from which XML was derived as a subset. In SGML you can omit/infer end-element tags, or type "</>" to have SGML auto-close the most recent element. I agree the verbosity of XML looks especially redundant, since it's always the most recent element you have to close anyway, whereas SGML (in principle, at least) also has overlapping ("concurrent") markup. And I can assure you that around 1998 we had 32-bit color monitors, and IDEs didn't look all that different from today ;)
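To illustrate the minimization (a hand-written sketch, assuming a DTD that permits end-tag omission):

    <!-- XML: every end tag is spelled out -->
    <list><item>one</item><item>two</item></list>

    <!-- SGML: the first </item> is inferred from the next start tag;
         the second is closed with the empty end tag </> -->
    <list><item>one<item>two</></list>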

But back to the topic, I believe a lot of material has simply vanished from the 'net, or isn't accessible through search engines anymore.


IDEs, and languages that an IDE can exploit, are awesome, especially if you don't (yet) have a really good mental picture of a project. They help architects or contractors tremendously (as examples of people who might jump from one code base to another quite frequently).

What I mean is that I personally really like statically typed languages like Java, and compile-time safety. I even like some of the 'verbosity' people always complain about. I can take a modern IDE and a Java project that hasn't replaced everything with runtime magic yet (Spring comes to mind), and simply click my way through things to get the info I need and/or build a mental model; I don't need a finished mental model up front just to be effective at a given task. It also makes certain refactorings brain-dead simple, ones you don't even have to think about, precisely because they're simple and guaranteed to be correct.

Contrast that with other languages, such as JavaScript, Python, or Perl (mentioning that one because I loved Perl back when I was doing almost exclusively Perl, with a bit of shell scripting and lots of SQL, at my first 'real' job), where you have to have a mental model already, and you have to know certain 'magic' just to be able to search for the right things. Something that stuck in my head in that regard was AngularJS. I forget the specifics, but some type of identifier used underscores in one place and dashes in another. How the eff am I supposed to find things easily? I have to know that magic conversion and grep for it specifically (see the sketch below). If I come to a front-end project written in Angular and have never done Angular, I will not find anything whatsoever and have no choice but to learn Angular just to do basic things.

And coming back to refactorings: even a simple rename can be a pain in the rear (so you won't do it, and you'll be stuck with really bad naming), because you can't be certain you've found all the right places to change until runtime, i.e. your 2 a.m. batch run fails and you have users screaming at you. That teaches you to just stick with bad naming real fast.
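For the curious: AngularJS normalizes directive names used in markup ("ng-model", "ng_model", "ng:model", optionally prefixed with "x-" or "data-") to the camelCase name they were registered under in JS ("ngModel"). A rough Python sketch of that conversion, just to show the "magic" you'd have to grep for; the function name is mine, not Angular's:

    import re

    def normalize(attr_name: str) -> str:
        """Template spelling -> JS registration spelling."""
        # Strip the optional x- / data- prefixes Angular also accepts.
        name = re.sub(r'^(x[-_:]|data[-_:])', '', attr_name, flags=re.IGNORECASE)
        # Dash/underscore/colon-delimited parts become camelCase.
        parts = re.split(r'[-_:]', name)
        return parts[0] + ''.join(p.capitalize() for p in parts[1:])

    # Grepping for "ngModel" misses "ng-model" in templates, and vice versa:
    assert normalize('ng-model') == 'ngModel'
    assert normalize('data-ng_model') == 'ngModel'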

Good languages and frameworks, if you ask me, allow someone who is senior in their role, but a total noob with the specific language or framework, to look at existing code and easily make certain modifications.

I once wrote a simple T-SQL (Sybase, not MS-SQL) parser that output Graphviz format to get a call graph of a backend job we had, which consisted of a gazillion individual stored procedures strewn across a gazillion files; I printed it and hung it up on the wall, just to be able to navigate the thing easily. Cue an IDE where you just Ctrl-click for the same end result.
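Not that parser, but a minimal sketch of the idea in Python, assuming one procedure per file in a hypothetical procs/ directory and calls written as plain "EXEC name" (real T-SQL, with comments, dynamic SQL, and schema prefixes, needs more care):

    import pathlib
    import re

    CREATE = re.compile(r'CREATE\s+PROC(?:EDURE)?\s+(\w+)', re.IGNORECASE)
    CALL = re.compile(r'\bEXEC(?:UTE)?\s+(\w+)', re.IGNORECASE)

    # Collect (caller, callee) edges from each stored-procedure source file.
    edges = set()
    for path in pathlib.Path('procs').glob('*.sql'):
        sql = path.read_text()
        created = CREATE.search(sql)
        if created is None:
            continue
        caller = created.group(1)
        for callee in CALL.findall(sql):
            if callee != caller:
                edges.add((caller, callee))

    # Emit Graphviz DOT; render with e.g. `dot -Tpng`.
    print('digraph callgraph {')
    for caller, callee in sorted(edges):
        print(f'  "{caller}" -> "{callee}";')
    print('}')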


Very interesting indeed, and I had some glimpse of that in my dad, who worked for only one company all his life. Especially as I got older and more interested in computers, he started talking about work stuff sometimes. His first programming language at work was 360 assembler (for the uninitiated/too young to remember: https://en.wikipedia.org/wiki/IBM_System/360). When I started at the same company for my first job and had to log into 'the host', meaning the IBM mainframe they had (by that time, I believe, a Z series: https://en.wikipedia.org/wiki/IBM_Z), I noticed that the custom login screen for their entire system had actually been written by my dad.

So all that just as a little context. When I was talking to him about SQL, 15+ years ago, he would always tell me: sure, that's nice and all, but all those joins? They're painfully slow. Why would you do that? They had a table with exactly the information they needed, and they accessed it via a key. Fast and easy. Need a different view of the data? Make a new table that's organized the way you need it. Done, and fast. Ring a bell? (NoSQL :))
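A toy illustration of that access pattern (invented data, Python only for concreteness): instead of joining at read time, precompute a table keyed exactly by how it will be read, which is essentially the document-store trade-off:

    customers = {1: {'name': 'Ada'}, 2: {'name': 'Grace'}}
    orders = [
        {'order_id': 10, 'customer_id': 1, 'total': 99.0},
        {'order_id': 11, 'customer_id': 2, 'total': 45.0},
    ]

    # "Join" at read time: walk the collections on every lookup.
    def order_with_customer(order_id):
        order = next(o for o in orders if o['order_id'] == order_id)
        return {**order, 'customer': customers[order['customer_id']]['name']}

    # Mainframe/NoSQL style: one denormalized view keyed by order_id,
    # so every read is a single keyed access.
    orders_by_id = {
        o['order_id']: {**o, 'customer': customers[o['customer_id']]['name']}
        for o in orders
    }

    assert order_with_customer(10) == orders_by_id[10]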

I also started playing with Linux, system administration, and obviously virtualization (VMware at the time, qemu, etc.). So I'd go talk to my dad about it, about how I loved the concept and how it did X and Y and Z cool things. Yeah well, for him that was really old stuff, because they'd had that on their mainframe since, like, forever (now called z/VM, then called VM/370, back in 1972).

We had to learn and use Java at university, which my dad always made fun of, especially with articles like "The state of Java application middleware" and such that I was reading. All that newfangled stuff, pah! He had been working with IBM CICS as the middleware forever: initial release July 8, 1969; latest release June 12, 2020 (https://en.wikipedia.org/wiki/CICS).

I get it. I say the same thing(s) about some of the newfangled stuff I have to work with nowadays. With the kids. Some of it is great. About other things, I just ask myself why the wheel had to be reinvented only to come back to square one, like NoSQL databases that add back features you'd expect from relational databases.


As a young blood, 5 years in, hired by learning React in the great JS hiring wave of the mid-2010s, I look back at those standards bodies as something that would be very beneficial now. All we have is Chrome creating standards by monopoly.

Maybe I have an older mindset as well. My training is in Civil Engineering, and I have my expectations for standards bodies set pretty high because of it.

The code is so high-level now. Going through algorithms and data structures, I see the benefit of that fundamental knowledge, but I also see that you can be a very valuable engineer to a company without it (thanks to the rich developer ecosystem for that).

If we think about the maturity of software like a biological ecosystem, maybe the zen garden built by past engineers has overgrown into a dense and varied forest. I'm not sure if that's bad or good; maybe there are just more niches to move into.


I think one source of tech churn people don't talk about is just growth. If you're 5x the size you were last year, an entire system rewrite is relatively cheap, and if you expect to keep growing, you can make riskier bets knowing they'll be cheaper to replace down the line. It's not ipso facto irrational. I'd be interested in seeing a breakdown of the correlations.


I'm 30 years into this industry also. We all have a tendency to defend and use what we're comfortable with, which often is what we learned ages ago. I try to challenge myself regularly by looking for my biases and assumptions. (Easier said than done, of course.)

However, I'd say the rate of "tech churn" has become impossible to keep up with in the most popular stacks (Java[Spring], .NET, front-end, Node). If you are in other areas, things tend to move slower.

I think the initiatives for standardization move at about the same rate as they used to, but the increasing amount of industry churn makes it seem like standardization has slowed down.


Curious how logic programming fits in with the rest of that list?


Why wouldn't it? I've always been a fan of polyglot programming and of languages oriented towards specific use cases, as opposed to language-centric ecosystems. Now, Java doesn't exactly fit that criterion ;( but makes up for it by being portable, relatively open, and cross-platform, and by having been instrumental in preventing Windows dominance on the server side in the 1990s. And it pays the bills. Haven't used it for new personal projects since 2008 or so, though.



