> If a system doesn't work as its human operators intend, that's a system failure, not a human being failure.
Well, I think we have to consider who was responsible for it not working as its human operators intended (my use of 'who', rather than 'what', in that sentence is not a grammatical error; it is a clue to the correct answer).
Unless the outcome described here was one of the explicit goals of the system's creators, you cannot assume it was an intended outcome rather than an unintended consequence of what they chose to do. If it was unintended, then "that's what it will do" is not a justification for what it does do; if it was intended, then it was simply a bad choice. The only way to justify the situation is if there was no alternative that was not strictly worse, and if that is the case, it should be stated explicitly.