Not so fast. You can still get nondeterministic behavior from things like thermal effects (think temperature affecting clock speed, which in turn affects the relative timing between threads running on different processors).
However, from the perspective of the programmer it is non-deterministic. It only becomes deterministic when it is coupled with a particular scheduler and dispatcher, and most programmers strive to write cross-platform code, so they can't count on any particular one.
That's false. If you're doing it "right" then it will produce the same functional output for all of the possible execution paths. That doesn't make it deterministic.
The programmer is still allowed to specify operations to perform as well as restrictions on the order in which the operations are performed. Not fully specifying what to do just gives the computer more options to pick from -- it will still pick one of those options.
Which is the part that gets rid of the "does what you tell it" bit, since now it's doing what it thinks it should, which might not be very consistent or easily determinable, because minor variations during a race can produce wildly differing results.
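Here's a rough sketch of the kind of race I mean, in Go (purely my illustration, not anyone's real code): every instruction below is explicitly specified, yet the final value can differ from run to run.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	counter := 0 // shared and deliberately unsynchronized

	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 100000; j++ {
				counter++ // data race: this read-modify-write is not atomic
			}
		}()
	}
	wg.Wait()

	// Everything above is fully specified, yet the printed value routinely
	// differs between runs because of how the increments interleave.
	fmt.Println("counter =", counter)
}
```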
If I tell someone, "bring me a sandwich or a bowl of chili," and the person brings me a bowl of chili, that person has done exactly what I told them to do.
What if they get bored waiting at the counter in the cafe and come back empty-handed? 'Cos that's what can happen as soon as IO is allowed into the picture; particularly so with network IO.
The problem is that combining this with multithreading makes for a combinatorial explosion of possible states that you might have to deal with. It's just not feasible to exhaustively specify everything, and leaving things implicit means that you're expecting the computer to do things you haven't told it to do.
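To put a rough number on "combinatorial explosion" (a standard counting argument, not anything specific to a real program): even two threads whose individual steps are completely fixed admit a huge number of interleavings.

```latex
% Number of ways two threads' steps can interleave, with each thread's
% own order fixed: choose which of the n+m slots belong to the first thread.
\[
  \text{interleavings}(n, m) = \binom{n+m}{n} = \frac{(n+m)!}{n!\,m!}
\]
% Already for n = m = 10 this is 184{,}756 distinct schedules.
```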
Sure, there is complexity. It's easy to make assumptions or forget real-world details too. But you DID tell it to time out, because that's how the I/O library is specified.
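As a sketch of what I mean, assuming Go's standard library (the host name is just a placeholder): the timeout isn't something the runtime invents on its own, it's a parameter you hand it.

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// The timeout is part of the instruction: "try for two seconds, then give up".
	conn, err := net.DialTimeout("tcp", "example.com:80", 2*time.Second)
	if err != nil {
		// Coming back "empty-handed" is behavior we explicitly asked for.
		fmt.Println("gave up:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to", conn.RemoteAddr())
}
```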
True, but my point is still valid. When you start a thread, you're saying "do these tasks in whatever order is most convenient for you". And the computer does just that.
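A minimal sketch of that, again assuming Go (purely illustrative): each task is spelled out, the order is not, and the printed order can change between runs.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			// Each task is fully specified; only the interleaving is left to the scheduler.
			fmt.Println("task", id, "finished")
		}(i)
	}
	wg.Wait()
}
```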
Philosophically, you can't say either way. For example, consider the case where the kernel scheduler uses random numbers from a hardware random number generator that relies on quantum uncertainty. Since physics doesn't know whether or not quantum mechanics is deterministic, we can't say whether or not this system is deterministic.
Knowing the laws wouldn't necessarily make the system deterministic, though. You'd need hidden variables (and non-local ones, to boot).
As best we can tell, sometimes certain information simply doesn't exist, so you may very well get something truly random that averages out to something that looks like what we think of as deterministic, classical behavior.
I would say there are just so many unknown variables (probably impossible to know) that people can get away with pretending it's non-deterministic, when it actually is deterministic.
Which is why there's no way to be certain that it is or is not deterministic. According to current knowledge, quantum uncertainty is not deterministic -- solving the Schrödinger equation for hydrogen, for example, gives no way to predict the future location of the electron. The best we can do is a probability distribution (written out concretely below).
If, however, as the parent mentions, hidden variables do exist behind quantum mechanics, it could turn out that it was deterministic all along.
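To make the "probability distribution" point concrete, here's the standard textbook ground-state example (my illustration, nothing more):

```latex
% Hydrogen ground state and the Born rule: the theory yields a probability
% density for the electron's position, not a definite predicted location.
\[
  \psi_{100}(r) = \frac{1}{\sqrt{\pi a_0^{3}}}\, e^{-r/a_0},
  \qquad
  |\psi_{100}(r)|^{2} = \frac{1}{\pi a_0^{3}}\, e^{-2r/a_0}
\]
% a_0 is the Bohr radius; |psi|^2 tells you how likely each location is,
% not which one you'll actually observe on a given measurement.
```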
> "Equation X doesn't help us understand phenomenon Y"
How does that even remotely suggest non-determinism?
> If, however, as the parent mentions, hidden variables do exist behind quantum mechanics,
Isn't it obvious that we're not even remotely close to knowing everything?
The uncertainty principle simply suggests that we perhaps can never grasp or understand all the details; it doesn't in any way imply that these unknown details don't exist. It'd be foolish to think that "because we can't predict it, it's therefore non-deterministic".