That would mean they believe that if you "wrote it correctly the first time" you wouldn't need to refactor, and could instead spend that productivity on producing new features.
This is obviously shortsighted for reasons most developers already understand.
I version control and unit test in my mind palace before chiseling my code onto marble slabs. Lasts forever, encourages parsimony, and gives me an excuse to stare at the wall pretending to work 39 hours per week. I call it Bruno coding.
Much technical debt is caused by poor specifications, or fluctuating ones as a company flails around seeking product-market fit. Sometimes the best way to leverage developers is to improve other parts of the organization.
Unclear requirements are certainly a major cause of inefficiency and technical debt, but I would imagine that over-engineering is at least a close second, and that under-engineering, poor choice of tools and poor choice of processes probably fill out the top 5.
I believe that what is missing more than anything in our industry is the veterans who have gained knowledge and wisdom through both the breadth and the depth of their past roles and can now do better than the previous generation. In real engineering fields or other scientific disciplines like medicine, there is a culture of learning from experience, and the people doing the most challenging jobs often have decades of it. In software development, we laughably call someone with 5 years of experience split between 3 different jobs "senior", and a lot of developers in their 30s are already looking for an escape hatch before ageism halts their career development.
So another good way to leverage developers might be to improve their working environment so they don't all quit just as they're starting to figure out what they're doing.
Veterans are fine, but engineering moves on, as do medicine, the legal profession, etc. What worked well before might work well this time, but it might not. Also, how do I know that a veteran's opinion is still valuable, value for money, technology-appropriate, and so on?
The real issue is that there is no single correct way to write software. Although we understand programming better than ever, we are still not great at quantifying and describing the values or requirements that drive our choice of process, or even our selection of the right supplier to write it for us.
We had a bad experience with a supplier, not because they were rubbish or lacked skillsets or ethics, but because they were not a good fit for the type of product we were building.
> Veterans are fine, but engineering moves on, as do medicine, the legal profession, etc.
Sure. But they all move slowly and deliberately, mostly through careful evolution of good practices as new evidence and reasoned analysis become available. We don't throw out everything we've ever known about how to build reliable bridges every six months because someone thinks that suspending a paper bridge 200 sheets thick from orbiting satellites instead would be cool.
True game-changing developments do happen in software development, but they are quite rare. Most progress in industries like web development is illusory, and it's only a successful illusion because the people who have been around for a while and seen 99% of it before in other contexts have left.
Exactly... maybe, just maybe, because the design/reqs changed? It can be as simple as: the company name changed, so maybe we need to refactor some classes.
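For what it's worth, here's a minimal, hypothetical sketch of that kind of rename-driven refactor (the "Acme"/"Globex" names and the invoice service are invented for illustration): the old class gets renamed, and a temporary alias keeps existing call sites working while they migrate.

    # Hypothetical example: the company rebrands from "Acme" to "Globex",
    # so class names that embed the old brand need a mechanical rename.

    class GlobexInvoiceService:
        """Generates invoices; was AcmeInvoiceService before the rebrand."""

        def generate(self, order_id: str) -> str:
            # Placeholder logic; a real service would build an actual invoice.
            return f"invoice-for-{order_id}"


    # Deprecated alias kept for one release so callers can migrate gradually.
    AcmeInvoiceService = GlobexInvoiceService

Trivial on its own, but multiplied across a codebase it's exactly the sort of "unproductive" refactoring work that a changing business forces on you.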
> in a perfect world this would be the case every time you build something.
Yet the same people who hate devs spending time on refactors also laugh when you give them a six-month estimate for feature X. I think "perfect" is "good enough" in this context. "Good enough" is always preferable to "non-existent yet theoretically perfect".