> Rarely, at least according to U.S. Supreme Court Justice Antonin Scalia. In a 2006 opinion he cited an approximate error rate of 0.027 percent, based on [...]
I laughed out loud. 1 mistake per 3700 cases. The software industry averages 1 bug every 50 lines of code [1]. Microsoft hits ~1/2000. The space shuttle achieved ~1/20000, but that takes serious dedication and cost [2].
It doesn't matter what the "based on" is. Deciding court cases is not easier than writing a line of code. That estimate is off by at least two orders of magnitude, which is funny until the fact that it was a Supreme Court justice who repeated it sinks in.
(The article itself gives a false positive rate of 4.1%, which is much more reasonable; that's roughly 150 times Scalia's figure.)

[1] http://amartester.blogspot.com/2007/04/bugs-per-lines-of-cod...
[2] http://www.fastcompany.com/28121/they-write-right-stuff
While that number also seems low to me, comparing the failure rate of a court case with that of a single line of code is a ludicrous analogy. I, a distracted and harried developer, will write hundreds of lines of code a day and not think twice about them, whereas each criminal trial is, for some period of time, the subject of the undivided attention of at least one judge, two lawyers, and a jury.
Errors certainly happen, but they are almost certainly not dominated by errors of simple oversight, which explain the vast majority of bugs.
Undivided attention? Being in the courtroom doesn't make you any less distracted or harried. The lawyers are handling multiple cases at the same time. The judge is probably doing crosswords up on the bench. Some of the jurors are thinking about where to eat lunch; others decided the case as soon as they saw the defendant walk into the courtroom wearing jailhouse orange. And that's just for the ~10% of cases that actually make it to trial.
I agree that the typical errors differ in style, although there are definitely cases decided by some stupid mistake made while collecting evidence.
But even when I try, and test, I don't hit 1 bug per 2000 lines of code. For example, I wrote a collapsing futures library for obj-c [1]. It only has about 1000 lines of non-test, non-header code. The code is tested, I've used it in projects, and I re-read it now and then trying to come up with ways to break it. At Microsoft's rate of 1 bug per 2000 lines, those 1000 lines would carry half an expected bug, i.e. roughly even odds. Is it reasonable for me to lay 50:50 odds on a bug being present? I don't think so.
(Are you the owner of the GitHub repo statsd.net [2]? `someGraphiteLine.Equals(null)` returns false, but `someGraphiteLine.Equals((GraphiteLine)null)` throws an exception.)
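Here's a minimal sketch of the kind of asymmetry I mean. This is a hypothetical repro, not the actual statsd.net source; I'm guessing at the shape of the class, and the `Name` field is a stand-in. The trap is a typed `Equals(GraphiteLine)` overload that dereferences its argument without a null check, sitting next to an `Equals(object)` override whose type test quietly returns false for null; which one runs depends on the static type of the argument at the call site.

```csharp
using System;

// Hypothetical sketch, not the real statsd.net code. A typed Equals
// overload skips the null check, while the Equals(object) override
// handles null via its type test.
class GraphiteLine
{
    public string Name = "some.metric";

    // The 'is' pattern fails for null, so this path returns false.
    public override bool Equals(object obj) =>
        obj is GraphiteLine other && Name == other.Name;

    // No null check: dereferencing 'other' throws when it is null.
    public bool Equals(GraphiteLine other) =>
        Name == other.Name;

    public override int GetHashCode() => Name.GetHashCode();
}

class Demo
{
    static void Main()
    {
        var line = new GraphiteLine();

        // Static type object routes to Equals(object): prints False.
        Console.WriteLine(line.Equals((object)null));

        // Static type GraphiteLine routes to the typed overload:
        // throws NullReferenceException.
        Console.WriteLine(line.Equals((GraphiteLine)null));
    }
}
```

The usual fix is a null check at the top of the typed overload (`if (other is null) return false;`) so the two overloads agree.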
A public defender may be carrying in excess of 200 cases. Some defendants spend fewer than five minutes with their attorney. Some counties bid out public defense at fixed prices; in 2007 Virginia capped public defense for 20-to-life felonies at $2085. Bach [1] documents a defender who was paid $42,150; in four years he took fourteen cases to trial, out of 1493; on one day he represented 89 defendants, all of whom pleaded guilty or had their trials deferred.
These numbers compare unfavorably with the cost of software contracts, the typical pay of software developers, and the amount of time a line of code gets to spend with its developer [2].
[1] Amy Bach, _Ordinary Injustice: How America Holds Court_
[2] Steve McConnell, _Software Estimation_