
Software that acts on the public (even if they are criminals), as opposed to software people can opt to use or not, needs to have a very serious question asked at design time: if the software produces an incorrect result, what mechanism exists to override it, audit it, provide damages, etc.?

The fact that people are not asking that is worrying. I understand why the system was not designed for something that happened later (even if it could reasonably have been foreseen), but the fact that it was implemented with no override is the real scandal.
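To make "designed with an override" concrete, here is a minimal Python sketch; every name in it (Decision, override, the actor strings) is hypothetical and not drawn from the actual system. The idea is simply that every computed result carries an append-only audit trail, and an authorised human can supersede it, with the override itself logged:

    from dataclasses import dataclass, field
    from datetime import date, datetime

    @dataclass
    class Decision:
        inmate_id: str
        release_date: date
        source: str  # "computed" or "manual_override"
        audit_log: list = field(default_factory=list)

        def record(self, actor, reason):
            # Append-only trail: who acted, when, what the value was, and why.
            self.audit_log.append(
                (datetime.utcnow(), actor, self.source, self.release_date, reason))

        def override(self, actor, new_date, reason):
            # The override path exists by design: the computed answer is never
            # final against an authorised human decision, and the override
            # itself is audited.
            self.source = "manual_override"
            self.release_date = new_date
            self.record(actor, reason)

    d = Decision("A12345", date(2026, 3, 1), "computed")
    d.record("sentence-calc v2.1", "initial computation")
    d.override("records officer", date(2025, 11, 1),
               "earned-time credits applied per new statute")

None of this is hard to build; the point is that it has to be asked for at design time, which is exactly the question that apparently went unasked.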

I don't know whether this comes down to the amount of power vested in a Governor, such that the rest of the organisation can't say, "sorry Guv, but we can't do this because the software wasn't written to." If TV is to be believed, Governors want things done yesterday, with the problems worried about later.



As someone with a civil engineering background:

This right here is the difference between conventional engineering disciplines, where designs require a stamp from an Engineer of Record who takes on personal responsibility in the event of design failure, and the current discipline of software engineering.

There's a big difference between a software developer and a software engineer, and I think that difference should be codified with licensure and a stamp, as it is in every other engineering field in the States.

Software like this ought to require a stamp.

A decent analogy is the environmental work I've done. When we come up with solutions and mitigations for environmental problems, we, as with software, can't always predict the result because of the complexities involved. So we stamp a design, but we, or the agencies responsible for permitting the project, often specify additional monitoring or other stipulations with very specific performance guidelines. It's a flexible system that can be adapted to, but there are real consequences and fines when targets aren't met. When bad things happen, the specifics of what went wrong and why matter: the engineer may be to blame, or the owner/site manager, or the contractor who did the work. Sometimes no one is to blame, but the agencies can still say: "Hey, this isn't working and needs to be addressed; do it by this date or else."

In engineering, there's an enormous amount of public trust given to engineered designs. The engineer takes personal responsibility for that public trust: that a building or bridge isn't going to fall down. And if you're negligent, it's a BFD.

Given the level of public trust we now place in software systems, it's crazy to me that we haven't adopted a similar system.


I would love for software engineering as a discipline to go that way, but it's going to be very hard. Software usually has more moving parts than hardware.

I don't mean to understate the difficulty of being a hardware engineer of any sort. But the whole reason we do things in software at all is that software is more flexible, and adding a new thing comes with less overhead. Hardware, while challenging, tends to apply similar solutions to similar problems. There are only so many things a bridge, or a building, or even a CPU will be tasked to do.

Not saying this is impossible for software, either. Software gets built for man-rated tasks -- and jobs like this should be considered man-rated, because lives depend on it. That means it's going to cost more and take longer, especially when it's software of a kind nobody has ever built before. Who has experience building "software that releases prisoners"?

The reason they don't do that is, therefore, money. I doubt the prison system is willing to pay 10x as much for the software. It was probably built by the lowest technically acceptable bidder, where "technically acceptable" was incredibly flexible because nobody really knew what had to be done.


> additional monitoring or other stipulations

That does happen with software a lot, frequently flying under the title of Complementary User Entity Controls (CUECs) or User Control Considerations (UCCs). Basically the "here it is, don't feed it after midnight, don't let it get wet, and good luck" riders. It sounds like these problems happened much earlier in the lifecycle, though: either the requirements were missed or the testing wasn't thorough enough.
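A software equivalent of the parent's "monitoring with specific performance guidelines" might be an automated check with a hard threshold and a remediation deadline. This is a minimal sketch; the names, numbers, and record shape are all illustrative, not taken from any real system:

    from datetime import date, timedelta

    MAX_DAYS_LATE = 0                        # illustrative guideline: zero tolerance
    REMEDIATION_WINDOW = timedelta(days=30)  # illustrative "fix by" window

    def check_releases(records, today):
        # records: (inmate_id, computed_release_date, released_yet) tuples.
        # Returns each breach with a remediation deadline, mirroring the
        # agency's "address it by this date or else".
        breaches = []
        for inmate_id, release_date, released in records:
            if not released and (today - release_date).days > MAX_DAYS_LATE:
                breaches.append((inmate_id, release_date,
                                 today + REMEDIATION_WINDOW))
        return breaches

    print(check_releases([("A12345", date(2025, 1, 1), False)],
                         date(2025, 2, 1)))

A check like this only helps if someone outside the vendor is required to run it and empowered to act on the result, which is the regulatory half of the analogy.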


Software is completely different from the typical engineering fields; you just can't apply the same methodology. In other fields, such as building bridges, you are quite often building something that has already been proven to work well, while in software, if you start to repeat yourself, you are doing something wrong.


I have a very cynical take. Probably too cynical. The ability to shift blame to software as opposed to the humans responsible for administering a bureaucracy is exactly what makes it so appealing. The question is ignored intentionally.


That was my first thought, actually. We are probably not cynical enough! In addition to blame-shifting, the prison-industrial complex benefits from having prisoners stay longer, so there is zero incentive to fix the problem.


Not cynical enough: private prisons with a profit motive make more money for every day a prisoner's release is delayed.



