I always wondered where this multiplicative factor of "several times" comes from. In my experience, writing correct software was only marginally slower than writing sloppy software, as long as most of the thinking was done with pen and paper. Would you mind elaborating a bit?
I'm guessing: cheap labor. Writing correct and/or fast software involves quite a bit of "don't do stupid things" all across the development process. Doing these things right doesn't add much time to development, but one first has to learn how to do them right. A fresh, inexperienced developer, or the "one year of experience repeated 10 times" person who has only worked on "move fast and break things" projects, isn't going to have that knowledge, but will be cheaper to hire.
Not OP/GP, but I think it's mostly about the definition of correct.
Does it correctly handle every possible sequence of inputs? For the vast majority of software in use today, the answer is "no". The follow-up question is "does it matter?", and (luckily or unluckily) for the vast majority of software, in the vast majority of use cases, the answer is "no" as well.
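To give a sense of why "every possible sequence of inputs" is untestable by brute force, here's a back-of-the-envelope sketch; the numbers (50 distinct inputs, sequences of length 10) are illustrative assumptions, not taken from any real system:

```python
# Illustrative assumption: a UI accepting 50 distinct inputs (keys/events),
# exercised in short interactions of 10 steps.
distinct_inputs = 50
sequence_length = 10

# Every possible ordered sequence of that length:
sequences = distinct_inputs ** sequence_length
print(sequences)  # -> 97656250000000000, i.e. ~10^17 cases
```

Even at a billion tests per second, exhausting that space takes years; real input alphabets and interaction lengths are larger still.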
>>> The failure occurred only when a particular nonstandard sequence of keystrokes was entered on the VT-100 terminal which controlled the PDP-11 computer: an "X" to (erroneously) select 25 MeV photon mode followed by "cursor up", "E" to (correctly) select 25 MeV Electron mode, then "Enter", all within eight seconds
The Therac-25's software was not obviously correct or incorrect; in fact, it had been acceptable on an earlier model (which had hardware protections missing from the newer one). Reaching the incorrect state required triggering the race described above, which did, in fact, happen in practice a handful of times.
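The shape of that race can be sketched in a few lines. This is a deliberately simplified, hypothetical model (the `Controller` class and its fields are invented for illustration; the real system was far more complex): a mode selection kicks off a slow hardware setup, and a fast on-screen edit updates the displayed mode without restarting that setup.

```python
# Hedged sketch of a stale-configuration race, loosely in the spirit of
# the incident above. All names here are invented for illustration.

class Controller:
    def __init__(self):
        self.beam_type = None       # what the operator's screen shows
        self.magnet_setting = None  # what the hardware actually got
        self.pending_setup = None

    def select(self, beam):
        # Operator picks a mode; hardware setup starts asynchronously.
        self.beam_type = beam
        self.pending_setup = beam

    def finish_setup(self):
        # Slow hardware task: configure magnets for the pending mode.
        self.magnet_setting = self.pending_setup

    def edit(self, beam):
        # Fast on-screen correction. In this buggy model it updates the
        # display but does NOT restart the hardware setup.
        self.beam_type = beam

# Normal sequence: select, wait for setup to finish -> consistent.
c = Controller()
c.select("electron")
c.finish_setup()
assert c.beam_type == c.magnet_setting

# Racy sequence: the edit lands inside the setup window -> inconsistent.
c = Controller()
c.select("photon")
c.edit("electron")   # operator corrects the entry before setup completes
c.finish_setup()     # hardware still configures for the stale "photon" mode
print(c.beam_type, c.magnet_setting)  # -> electron photon
```

Both interleavings look identical to the operator; only the timing of the edit relative to the setup task decides whether the displayed mode and the hardware state agree.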
You can work very hard to formally prove your implementation, only to find out that the compiler had a bug and miscompiled your software. Or the CPU does. Or all the designs are fine, but there's a bit flip due to electromigration or cosmic rays. Many people consider this "force majeure" - "an act of god" one cannot anticipate - but cosmic rays are in fact an expected -- and hard to avoid -- input to many systems.
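A soft error of that kind is just an XOR applied to your state at an unpredictable time and position. A minimal sketch (the `flip_bit` helper and the example values are mine, for illustration):

```python
def flip_bit(value: int, bit: int) -> int:
    """Return value with one bit inverted, as a single-event upset would."""
    return value ^ (1 << bit)

dose = 200                     # some setting, in arbitrary units
corrupted = flip_bit(dose, 7)  # one flipped bit: 200 ^ 128
print(dose, corrupted)         # -> 200 72
```

No amount of proving the program's logic guards against this; mitigations live at a different layer (ECC memory, redundancy, sanity checks on stored values).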
You are in control of a logical model which you can, with extra work, get (provably) correct. But that IS, from experience, 10x to 100x more expensive, and unless you also go for 10x-100x more expensive hardware, it reduces and moves the sloppiness around but does not eliminate it.