I think the more important point jacques_chester was making is that even when computer systems are 100% deterministic, it doesn't matter, because they are so complex that they behave as chaotic systems. Hypothetical determinism won't save you: there are too many variables in play simultaneously, and even the slightest variation in any one of them can change the behavior completely.
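To make that concrete, here's a minimal sketch of the chaos point (the logistic map, my example, not anything jacques_chester said): a completely deterministic update rule where a one-part-in-a-billion difference in starting conditions destroys all predictive power within a few dozen steps.

```python
# A fully deterministic system (the logistic map at r=4) where two
# initial conditions differing by one part in a billion diverge
# completely within ~40 iterations.

def logistic(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a, b = 0.3, 0.3 + 1e-9
for n in (10, 20, 30, 40, 50):
    print(n, logistic(a, n), logistic(b, n))
# By step ~40 the two trajectories share no digits at all,
# even though every single step was 100% deterministic.
```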
Sorry guys, but I don't buy your defeatist arguments for one second. Using traditional tools, sure, you may be right. But all we have to do is look at the brain, with its 100 trillion connections (synapses), or a CPU with its billions of transistors, to see that your logic is flawed. If it were not possible to design fault-tolerant systems at large scale, neither of these two examples could exist.
I don't claim to have the answers, but a few years ago I had to help some EEs write some testing software in LabVIEW. The first thing that shocked me was how immune the system was to bad data; the second was how elegantly it took advantage of our multiprocessing and multithreading hardware.
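LabVIEW is graphical, so there's no code to paste, but the dataflow idea translates roughly like this (a hypothetical Python sketch, not LabVIEW itself): each node runs in its own thread, fires when data arrives on its wire, and samples it can't process simply never propagate downstream.

```python
import queue
import threading

SENTINEL = object()  # marks end of stream

def node(process, inputs, output):
    """A dataflow node: fires when an input arrives, drops anything
    it can't process, and never crashes the rest of the graph."""
    while True:
        item = inputs.get()
        if item is SENTINEL:
            output.put(SENTINEL)
            return
        try:
            result = process(item)
        except (ValueError, TypeError):
            continue  # bad data is simply dropped, not propagated
        output.put(result)

# Wire up a tiny two-stage pipeline: parse -> scale
raw, parsed, scaled = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=node, args=(float, raw, parsed)).start()
threading.Thread(target=node, args=(lambda x: x * 2.0, parsed, scaled)).start()

for sample in ["1.5", "garbage", "3.0", None]:
    raw.put(sample)
raw.put(SENTINEL)

while (out := scaled.get()) is not SENTINEL:
    print(out)  # prints 3.0 and 6.0; the bad samples vanished upstream
```

Because each stage only sees values that survived the stage before it, the "immune to bad data" property falls out of the wiring rather than defensive checks scattered through the code.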
What I wish systems researchers would work on is concurrent, signal-based, synchronous languages. This is a winnable game, but we have to take the red pill and change the rules of the game with new tools and fresh ideas.
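To make the "signal-based, synchronous" idea concrete, here's a toy sketch in the spirit of Lustre or Esterel (Python standing in, since those compilers aren't on most machines): every signal is recomputed exactly once per global clock tick, so the whole system state at tick n is a pure function of the inputs and the state at tick n-1, with no races to reason about.

```python
# Toy synchronous program: all signals advance in lockstep on one
# logical clock, so same inputs always produce the same trace.

def run(ticks, sensor_inputs):
    count, alarm = 0, False  # signal values carried across ticks
    for n in range(ticks):
        sensor = sensor_inputs[n]                  # input signal this tick
        count = count + 1 if sensor > 10 else 0    # consecutive out-of-range readings
        alarm = count >= 3                         # output signal this tick
        print(f"tick {n}: sensor={sensor} count={count} alarm={alarm}")

run(6, [5, 12, 13, 14, 9, 15])
# alarm goes high exactly at the third consecutive high reading, and
# the behaviour is fully deterministic and testable tick by tick.
```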