Are these the early days of unit testing? I love the way he talks about the tests. A pleasant reminder that a well-formed test suite is akin to a mathematical proof of functionality, as long as you've covered all the "appropriate" cases.
I know the Burroughs (1961) and Apollo teams did testing thorough enough to be compared to unit tests. It probably started there; the concept has been independently invented many times. The smarter approach, though, was Cleanroom methodology's usage-centered testing, designed to find the bugs users would actually encounter in practice. The idea is that users would rather live with 30 bugs they never experience than even 1 that they do. That plus inspections/verification led to very low-defect software.
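To make that concrete, here's a minimal Python sketch of usage-centered (statistical usage) testing as I understand it. The operational profile and operation names are hypothetical, not from any actual Cleanroom tooling: test cases are sampled in proportion to how often users perform each operation, so the observed failure rate approximates the failure rate a real user would experience.

    import random

    # Hypothetical operational profile for a small text editor:
    # probability that a given user action is each operation.
    usage_profile = {
        "open_file":  0.40,
        "edit_text":  0.35,
        "save_file":  0.20,
        "export_pdf": 0.05,  # rarely used, so it gets tested rarely too
    }

    def run_operation(op):
        """Stand-in for driving the real system; returns True on success."""
        return True  # replace with a call into the system under test

    def usage_test(n_cases=1000):
        ops = list(usage_profile)
        weights = [usage_profile[op] for op in ops]
        failures = sum(
            1 for _ in range(n_cases)
            if not run_operation(random.choices(ops, weights=weights)[0])
        )
        # This rate estimates the reliability a typical user would see.
        return failures / n_cases

    print(f"observed failure rate: {usage_test():.4f}")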
Agile is like a watered-down version of things like Cleanroom with unproven fads added. Check out Cleanroom to see what real software engineering looks like; it's one of the few examples. One or two people even wisely combined it with Python, given its readability and low defect level, and it worked very well. :) Ada or OCaml would be ideal for a next Cleanroom project, or for another engineering methodology (e.g. Praxis's Correct by Construction).
"Select a project as advanced as you can conceive, as ambitious as you can justify, in the hope that routine work can be kept to a minimum"
I like that. Not sure it's practical, lol. Would be nice. One line I did find true, though, was this:
"A less qualified young man, originally included, found our activities beyond his mental grasp and left the group. I mention this explicitly, because at least in Holland, the intellectual level needed for system design is in general grossly underestimated. I am more than ever convinced that this type of work is just difficult and that every effort to do it with other than the best people is doomed to either failure or moderate success at enormous expenses."
It seems especially true. Languages, compilers, OSes, verification of the above... the best of that stuff has taken some very bright people to create. Old papers on Orange Book A1 systems said those usually got done because geniuses were picked to build them. Google uses the same model to come up with things like MapReduce and the F1 DBMS, as far as I can tell. However, as the A1 papers said, geniuses are in short supply. Oh, what to do...?
I think the prior work and the 80/20 rule give the answer: leverage geniuses to build tools and methods that make it almost trivial for the rest to apply them correctly. Good examples are capability-secure OSes making POLA easy, languages such as OCaml knocking out tons of issues, LANGSEC-style parsing, middleware like ZeroMQ, web runtimes like Opa (or Cornell's SWIFT), processors like SAFE (crash-safe.org), fault-tolerant systems like NonStop, and so on. Each is easy enough to use day-to-day while knocking out most of the tough problems users would otherwise encounter. And each could be designed, implemented, and verified/validated by elite developers.
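As one illustration of the "hard problems solved once, applied trivially" point, here's a minimal ZeroMQ request/reply sketch using the pyzmq binding. The endpoint name and in-process transport are my choices to keep the demo self-contained; the point is that framing, queuing, and connection management are handled by the library, leaving a few lines of application code.

    import threading
    import zmq

    ctx = zmq.Context.instance()

    # Bind the reply socket first so the in-process client can connect.
    server_sock = ctx.socket(zmq.REP)
    server_sock.bind("inproc://echo")

    def echo_once(sock):
        msg = sock.recv()              # framing and queuing handled by ZeroMQ
        sock.send(b"echo: " + msg)

    server = threading.Thread(target=echo_once, args=(server_sock,))
    server.start()

    client = ctx.socket(zmq.REQ)
    client.connect("inproc://echo")
    client.send(b"hello")
    print(client.recv().decode())      # prints "echo: hello"

    server.join()
    client.close()
    server_sock.close()
    ctx.term()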
The success of tools such as Ada, Oberon, Java, Eiffel, and Python in eliminating entire classes of problems shows the approach works. Just gotta make it even better, put the brightest on it, and absolutely make sure it's marketable to the rest. The adoption history of Ada, LISP, and the Wirth languages showed that last part is critical. Plus... (rolls eyes)... it probably needs a bullet-proof, usable C/C++ or Java FFI. Can always have geniuses in metaprogramming convert all that shit later. ;)
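On the FFI point, Python's ctypes is a decent existence proof that calling straight into C can be low-ceremony. A tiny sketch (Unix-flavored; library lookup differs per platform):

    import ctypes
    import ctypes.util

    # Load the platform's C library; find_library may return None on
    # some systems (e.g. Windows), so treat this as a Unix sketch.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declaring the signature lets ctypes type-check the call for us.
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    print(libc.strlen(b"hello"))  # prints 5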