Agile never gave organizations a holistic, viable alternative to Waterfall. Because there’s a difference between theory and practice. Product work is more about practice. When we complain about “AINO” (Agile In Name Only), we’re not being honest with ourselves.
I agree with most of the article. Especially the keep-learning part.
All Agile did was put software development teams unfairly under a microscope.
I believe Agile has been tremendously beneficial for the industry globally, especially in some subtle ways. For example, Agile says you have to communicate a lot if you want to get software done. Here is the subtlety: if, today, you stop telling the average programmer to communicate, they will stop and go back to silo mode, "naturally". At least some of them. If you think about software at the industry scale, you have to think about a wide population, which requires processes.
I was once working in a nice little company where, one day, they introduced agile. It was hugely beneficial. Before, the bosses thought that talking was simply a loss of productivity. You're a programmer, right? So code, don't talk. So we would develop software without talking to each other. Afterwards we had daily meetings, scrum things and stuff. Our velocity skyrocketed.
You have to realise that before Agile, a fair portion of all software development projects that were started would simply bust and never get shipped. The code would be a complete monster or the budget would be nuked. When I started at the company I just mentioned, I began working on the worst codebase I've ever seen in my life. You would touch one line and everything would stop working. It was a condensed piece of spaghetti with hacks on top of hacks on top of hacks. Software architecture? That requires some talking and thinking; forget about that, not permitted.
Now, I agree with you: agile is not very useful for a hacker who consistently gets software done and deeply understands what's happening.
edit: not that I'm an agile guru or anything. Actually I only know the thing superficially. I show up at the meetings; when I'm asked how hard something is, I answer; then I mess around with my office friends. Still, I can appreciate that it works much better than any process the bosses can come up with.
> You have to realise that before Agile, a fair portion of all software development projects that were started would simply bust and never get shipped.
I have plenty of gripes with Agile, but there's definitely a "victim of its own success" aspect to the whole thing.
"Agile in name only" is frustrating compared to a good system, but in many cases it's still far better than what came before it. Basic ideas like "we should expect requirements to change" and "if the programmer doesn't know what their code will be used for, something is wrong" weren't necessarily accepted prior to agile. Projects that are chaotic, mismanaged messes under AINO might well have been orderly deathmarches or cancelled outright in the past. Some of this progress is technological (source control, post-release patching, digital distribution), but some of it really does owe to Agile.
It reminds me a bit of the scientific method. The elaborate eight-step thing taught in school feels like silly boilerplate, but it's shocking to realize that "ideas are tested by experiment" was a genuine breakthrough from a past that made major choices like disease treatments and scurvy cures based on 'reasoning' without testing them at all.
> You have to realise that before Agile, a fair portion of the software development projects that were started would simply bust and never get shipped. The code is a complete monster or the budget is nuked.
The difference is that of ivory-tower planning followed by phases of development, testing, and a "big bang" release (waterfall) vs. working on an MVP with the purpose of releasing as soon as possible and then working in iterations based on actual feedback and demand (agile). If you manage to nuke your budget or create a monster of a codebase already at the MVP stage, no methodology is going to save you.
I really do believe agile (or at least "not waterfall") has reduced the rate of major software flops. But I'd be fascinated to know how often it turns failures into successes, versus revealing failures earlier in the project cycle. Both are valuable, obviously. It's just interesting that part of Agile's value is in revealing engineer dysfunction or fundamentally bad ideas at the MVP stage instead of the completion stage.
That's a fair point, and agile development would not be possible without the internet. But working incrementally based on customer/user input is only possible after the first release. What was observed here was the huge number of projects that would previously fail without ever getting to that point.
I have the subjective sense that the rate of massive, irreparable flops in software has gone down a lot. (Very anecdotally, the latest "permanent failure" on Wikipedia's list of major failed software projects is from 2014. That list is worth a read regardless.)
Projects still get cancelled, budgets and timelines get blown out, features and testing get compromised. But at least in dedicated software companies, it's pretty rare for something to get through 90% (or 150%) of its allotted development time and then be completely discarded as unsalvageable. (For technical reasons; market changes are a different beast.) Projects in technical crisis are either apparent sooner or have usable elements, rather than failing outright at the end of a waterfall.
Of course, there's a lot of room to debate what changed and why. If Agile led projects to fail sooner or less dramatically, that's still important, but less notable than if it changed them into successes. And a lot of technological advances have helped too; things that might have been outright failures as shrinkwrapped software can become late or overrun projects in the era of digital distribution.
I was interested in comp sci precisely because I didn't have to talk to people, an area where every interaction is a performance. I want to be alone - what's so wrong with that?
If you are writing software just for yourself, there is nothing wrong with that. If you are writing software for others, be prepared to be steamrolled by people who know how to write good code AND how to communicate with consumers of the software (i.e., paid customers, OSS devs, enterprises, etc., that one will depend on what kind of software you are writing).
I think that a lot of developers with a mindset similar to yours tend to underestimate how important good/valuable feedback (and communication in general) is. And I am saying that as someone who initially went into comp sci for reasons similar to yours. Some of the best engineers I ever worked with had amazing communication skills, and it acted as a x10 multiplier to their technical skills and overall productivity.
Feel ya. Surprisingly enough, I'd say I haven't had to attend that many boring meetings in my career, but I'm certainly just lucky. Also, once I did get a manager who was an "agile guru" and he was annoying. He talked to me like he was the Jedi of software development and I was his padawan. Thankfully, he got fired.
I had a professor at university who acted that way about Agile. He thought he was literally Uncle Bob incarnate. But when you looked at his credentials, he had never once worked in a real production environment, he had only ever taught.
Nothing is wrong with that, but don't expect to keep a job unless someone else does all the business and project planning and coordination for you, at the expense of your salary. If your computer science talent is good enough, that might work for you.
I had a similar sentiment - as a programmer, I did not want to be communicating with a client; I wanted to be left alone, in the sense that I only ever had to interact with people who understand how coding works.
I think it's doable; say you are the sole developer of a massively used library, with programmers as your only users - they read the API docs and submit an occasional feature request or bug report.
Heh. I have a reverse sentiment. As a programmer, I want to communicate with the actual users who'll be using my product. But without two layers of intermediaries on both our and customer side, who end up turning this whole thing into a game of telephone, ostensibly in the name of all the other important stakeholders.
The programmer users are arguably "clients" in this context too. Much the same sorts of issues, different labels.
One need only look at the bug tracker on the average "massively used library" to see that you will be communicating with clients a great deal. Your dream of a small group of perfect programmers filing neat and accurate feature requests or bug reports doesn't match any reality I've ever seen. To expect them all to "understand how coding works" is also likely a pipe-dream - users of all ability levels are out there writing code, filing bug reports for things that aren't bugs, asking questions in forums that aren't meant for those questions, mad at you because the direction the project has taken isn't the one they or the company they work for want/need, etc etc...
Users of a library are still users. You can write a library in an ivory tower, just like you can write applications in an ivory tower, but that kind of software is written for yourself and only incidentally for anyone else. People might still use it if there are no viable alternatives, but don’t expect them to be very happy about it. Especially because scarcity of communication also implies scarcity of documentation.
You're abdicating your responsibility of developing working software. Your job is to solve problems, using code where appropriate. Your job is to understand the entire system and explain how the corporate system above interacts with the technology you are tasked to develop.
If you can't do that, what is the difference between you and a group of outsourced employees making pennies on the dollar?
you’re presupposing he wants to fit into that holistic model, as a senior person. maybe he is fine plugging away at code, the task defined, like a monk transcribing texts.
the world needs both kinds, and a range in between.
Ultimately, these days programming is table stakes - IMHO you get zero credit for knowing how to code. The best developers are those who are able to step up and actually interact with people to articulate and understand problems. Whether that's with fellow developers about code, with managers/stakeholders about technical things, or with end-users/clients about users needs.
You're more than welcome to want to sit in isolation and code away, but I would suspect you would find it difficult to do this and be happy. I've never worked in a team that would work with someone like that.
There is nothing wrong with that, but careers under capitalism are about _signaling value_. I hope you find a place (or have found a place) where you're appreciated. :)
As far as your career - salary and promotions and recognition - you can't just sit in a corner and be quiet. People have to recognize that you're good at what you do and it's hard for others to see when you only communicate in code check-ins.
You have to start attending meetings, speaking at meetings, and being useful beyond the code.
I'm not sure why you're being downvoted but some of the best career advice I've heard is someone saying you should assume when you go to work that you're working under a communist dictatorship. I.E. Toe the party line, make your boss look good, (pretend to) eat up the company propaganda, assume the leader(s) (C-suite) will do whatever they want, when they want, especially giving themselves bonuses regardless of company performance.
It's a weird contrast to normal life, considering a lot of us live in democracies, but once I understood this and started acting accordingly, it helped me handle my work life.
I don't know if that's great advice, but it's certainly advice that will help get you promotions and salary bumps.
Like anything there's a grey zone here. Understanding the political realities of a business is highly beneficial and will help you move forward with the company and let you know when to pick your battles. But I've watched people who do nothing but this lose the trust of those at their level. And that trust is crucial for agile development. I also personally feel like pushing back on the company when appropriate can indeed provide a lot of value to the organization. "Why are we doing this meeting?" "We need another week for testing" "I'd like to see the roadmap you are planning". Just don't push back all the time.
It is probably good advice in most cases, but individually it depends on the shop and culture. Like whether you really should always wear a three-piece suit to interviews regardless of what they request. Some are flatter, others are more hierarchical (ironically, practice can fail to line up with org-chart structures).
Some are all about the politics, others don't have the luxury of self delusion or actually value the sort of brutal honesty to say "I have looked at the new framework - while trendy it is buggy inefficient crap." The ones who lack the luxury tend to be smaller but small size is no guarantee that they'll just say no to the flavor aid.
In all areas of life we judge those around us not by the true facts, but on our limited knowledge. Working to improve that knowledge can result in a change in judgment. It seems completely reasonable for there to be a person who does good work but who isn't known for doing so, and as such is viewed worse than they should be. By working to increase others' knowledge of the work they do, their evaluation in the eyes of others will improve.
Of course, it isn't straightforward or simple in practice. There are those who lie and misrepresent, and if you are too obvious about your intent you will be viewed as manipulative.
I don't see why this deserves either the label of BS or a ban from HN?
Perhaps some think it's uncomfortably close to the 'slur' of 'virtue signalling' that people throw around. While I'm sure the intention was nothing but pure in how it was used here, it did make me second-read the comment a bit to assess whether it was being used in a negative way, and to really understand the point being put across.
> > All Agile did was put software development teams unfairly under a microscope.
Most managers crave control over their teams; this is not something that Agile introduced. If anything, Agile let them use the sort of control they were already demanding towards more useful and productive goals, by introducing bazaar-like practices to centralized software development (release early, release often; shorten feedback loops as much as possible; make extensive use of refactoring, software testing and XP principles); while at the same time not being altogether incompatible with self-organizing development teams (these were mentioned in the original Agile manifesto, after all).
Eh? I'd compare it to No True Scotsman, not AI: I mostly hear people insist that anything which doesn't work isn't Agile.
I'm extremely grateful for the changes agile wrought on the software industry, and while I think many of the best insights have become commonplace, I don't think formal Agile methods are outdated or exhausted yet. But I do think formal Agile is extremely hard to do right, and has very common failure modes of knowingly-unrealistic planning, unproductive meetings, excess design changes, and tech debt neglected to ship MVPs.
One measure of a methodology is how much value it provides when it's used right, and I agree that Agile shines here. But it's also worth asking how easy a method is to get right, and how gracefully it devolves when things aren't perfect. My experience is that agile advocates commonly neglect those parts, dismissing widespread frustration with 'bad agile' on the basis that if it had been implemented perfectly, those issues wouldn't have come up.