Not necessarily? If you have a real-time OS and you write your program well, you can synchronise timings and have cores send messages to each other without queues. It's hard to write fast code that does that, but in narrow circumstances it's possible (I'm thinking of embedded applications), and when it is possible, it's faster than the equivalent algorithm built on indeterminacy.
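A minimal sketch of the kind of thing I mean, with some stand-ins: two workers share a single slot under a fixed two-phase schedule, and a POSIX barrier plays the role of the RTOS timer tick. The barrier, the thread setup, and the tick count are illustrative assumptions, not a real embedded configuration; on real hardware you'd pin each loop to its own core and let the timer-driven schedule do what the barrier does here.

```c
/* Sketch: queue-free, deterministic core-to-core messaging.
 * A C11/POSIX barrier emulates the RTOS schedule tick; each "tick"
 * has a write phase and a read phase, so one shared slot is enough
 * and no queue or lock is needed. Compile with -pthread. */
#include <stdio.h>
#include <pthread.h>

enum { TICKS = 8 };

static int slot;                    /* single shared slot -- no queue   */
static pthread_barrier_t tick;      /* stands in for the schedule tick  */

static void *producer(void *arg)
{
    (void)arg;
    for (int t = 0; t < TICKS; t++) {
        slot = t * t;                    /* phase 1: producer writes     */
        pthread_barrier_wait(&tick);     /* end of write phase           */
        pthread_barrier_wait(&tick);     /* phase 2: consumer reads      */
    }
    return NULL;
}

static void *consumer(void *arg)
{
    (void)arg;
    for (int t = 0; t < TICKS; t++) {
        pthread_barrier_wait(&tick);     /* wait out the write phase     */
        printf("tick %d: read %d\n", t, slot); /* always sees t*t        */
        pthread_barrier_wait(&tick);     /* end of read phase            */
    }
    return NULL;
}

int main(void)
{
    pthread_t p, c;
    pthread_barrier_init(&tick, NULL, 2);
    pthread_create(&p, NULL, producer, NULL);
    pthread_create(&c, NULL, consumer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    pthread_barrier_destroy(&tick);
    return 0;
}
```

The point is that the ordering is fixed by construction rather than resolved at run time: the consumer always reads exactly what the producer wrote in the same tick, with no arbitration anywhere.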
Indeterminacy slows things down. It's a concession.
And natural determinacy, if you can get it, speeds processing up more than indeterminacy does. The useful question is “what's the lowest-level model I can usefully use?”, not “can I do it with indeterminacy?”.
Do you think the above statement is wrong? If so, why?
“a many-core computer has inherent indeterminacy.”