I've really changed my perspective on this type of thing as I approach 50.
Great art is always made under great constraints. Software is no different.
It's always the weaker engineers that are constantly complaining about the things listed in this article. Complaining about all the "stupid code" and "stupid decisions" that were made before they arrived.
There is no realistic scenario where you can spend infinite time developing the perfect solution. And I say infinite because there is no amount of time that will allow you to achieve perfection. Perfection does not exist.
But like in art and so also in programming you can definitely strive for perfection.
There are plenty of writers, some famous, most not, who keep rewriting and rewriting because it's not perfect, and who are still annoyed that it's not good enough when it's published. Software is generally a job to make money: who gives a crap if it's perfect? Not your boss or your company's clients. But personally, it's another thing. I definitely have software that is perfect in my eyes. I don't care if others don't think so, but I've worked on it for decades, and using and updating it makes me happier than other things. I am well over 50 and I don't see this changing for me.
There are well-known examples in software too: for instance Jonathan Blow, who estimates things and then overshoots by a long shot because he doesn't like the result enough, and Arthur Whitney, who keeps rewriting his 'perfect' (in his eyes) software (k) to make it just a little perfect-er.
As an Artist and an Engineer, too many engineers are perfectionist in a reality where it doesn't exist. To the people here who quote artists and works of art as if they are "perfect"... you. were. not. there. It's only perfect to you in your perfection biased brain. Art is very much imperfect. Concessions were made, pieces restarted, plans changed. Creation is messy and painful. Art or Engineering.
OTOH, I'd say that software perfection doesn't exist because of all the slackers who accept their crap as "good enough", leading to enshittification.
On the most important level, software is either perfect or it fails.
ETA: I mean for functionality in the above. That's why I don't like web design: too many style choices. It's also why I stick to the commandline nowadays.
When I think of "great art under great constraints", I think of the demo scene, not dealing with the legacy cruft in someone else's million-line codebase.
> Great art is always made under great constraints. Software is no different.
I don't agree that great art (or software) is always made under great constraints. If you have an intrinsic drive, having enough time can yield a compound return. For example, in research, the "publish or perish" mentality often forces people to focus on shorter-term problems rather than pursuing more ambitious, long-term breakthroughs.
I'm curious what your career trajectory was like. I'm surprised that your experiences are so different than mine (see my comment below). In the early years, we had tons of time to just play around (e.g. Paul Graham wrote Hackers and Painters in 2004).
My theory is that agile turned software writing into a production line, or at least attempted to. It's hard to fit experimentation into the everything-must-be-a-ticket process/mentality and the endless ceremony meetings. I also think the quality of developers has decreased; I'm not sure if agile caused this or is some sort of workaround for it.
> My theory is that agile turned software writing into a production line, or at least attempted to.
Right conclusion, wrong origin. Let me explain.
Business management theory has been rooted in the lessons learned from Ford manufacturing for over a century. This has worked well for industries which manufacture goods using physical resources, of which most qualify.
However, software engineering is not bound by those forces. Adding more developers to an effort does not shorten delivery or increase productivity (quite the contrary, actually, and well documented in "The Mythical Man Month"[0]). But adding "line workers" to a factory, assuming sufficient raw materials are available, will shorten its delivery cycle.
That's because assembly-line workers have a quantifiable job, one easily measured in physical terms and fairly easy to scale (assuming sufficient factory capacity).
> I also think the quality of developers has decreased; I'm not sure if agile caused this or is some sort of workaround for it.
IMHO, there is no substitute for understanding the problem that needs to be solved. No SDLC paradigm can make a developer who eschews this successful.
Quality has definitely decreased, and I think it's the natural consequence of specialization. Most modern devs I've worked with (even/especially those from big tech companies with >$1B valuations) know their own particular niche well, but flounder when faced with a different kind of problem. They have a hammer, so everything is a nail. The power of modern infrastructure and orchestration systems has eliminated their need to understand the full stack in order to "deliver value".
From my POV, hacker culture is going away, because it does not scale in the way capitalists want it to scale. And the same capitalists are foaming at the mouth at the notion that they might be able to replace expensive engineers and developers with AI.
Our niche has been captured by global stakeholders, and they are all too happy to believe that they can scale innovation without all of the previous "cultural baggage" that, IMO, is the only reason we have the systems we have today.
I don't think hacker culture is going away, I think it's just drowned out by software eating the world in a capitalist economy. It used to be that software and computers in general didn't pay any better than any other white collar job, and were generally more arcane and less familiar to people, so only those of us with an inherent interest were drawn to it. I believe there are more of us than ever, there's just orders of magnitude more people drawn in for the money and power.
I certainly feel some nostalgia for the old days, but while I'm not thrilled by a lot of directions the internet has taken, I don't think there's ever been a better time to be a hacker in terms of tools available and what can be achieved by an individual or small group. Getting attention for your work is another matter, but distribution has always been hard, the internet making it easier to deliver bites just led to that much fiercer competition. The fact that there was a short-lived window where technical barriers favored hackers was just a coincidence of history, not a stable state that it makes sense to try to replicate.
I always understood that Agile was supposed to reduce the bureaucracy, not increase it. It seems to have been embraced, extended and extinguished by the sort of people who were pushing Waterfall in the previous era.
> I also think the quality of developers has decreased; I'm not sure if agile caused this or is some sort of workaround for it.
I think it's mostly a function of developer quantity and the pervasive "anyone can do this" attitude. (My assessment: most people probably could, but fundamentally aren't comfortable using their brains the right way.)
There is a wide spectrum between perfect code and the code people usually complain about. It's not only weak engineers who complain about crappy code and stupid decisions.
As I got more senior, it wasn't the crappiness of the code that frustrated me so much as the intransigence of the people who created the circumstances that made it happen.
I'm totally happy with crappy codebases I can fix; I just get fed up because management wants 34 new features delivered by next Tuesday, or a junior with an attitude doesn't want to pair or be trained in TDD.
I low-key kind of like it when I describe a harmful archetype or toxic opinion online and somebody responds, "b...b...but that isn't bad, that's me!"
In the end, the companies that were like this "because there is no alternative" usually did suffer the consequences, and the ones that weren't reaped the benefits. Shrug.
What you're describing as a harmful archetype is the job you've been hired to perform. The disconnect is between your self-image and reality. Refusing to accept that is intransigent.
It doesn't matter if you consider it good or bad - morals don't come into commercial software development. The closest you ever get is platitudes when it doesn't conflict with profits.
Morals aren’t always involved in commercial software development, and likely they never have been in any of your workplaces. However, I think it’s a gross mischaracterization to claim that morals and business don’t have any overlap. I work in the health tech industry, and I feel good knowing that patients benefit from using our device. I know I wouldn’t feel the same way if I was working at some fintech optimizing stock trading to the Nth degree.
I never said anything about morals. I just like having agency and take professional pride in my work. Perhaps you don't.
This has very little to do with capitalist realities. As I mentioned before, the saner the company was about this stuff the less likely they were to eat losses.
>Great art is always made under great constraints.
It's sad, the Mona Lisa never quite reached its peak because da Vinci didn't have a Jira board and a scrum master /s
Some of these constraints are not truly necessary and are often stifling, and once you've done work without them, you can't go back. Usually that happens when you're older.
It's true in art, and it's true in engineering.