To decode under heavy interference you need heavy error correction, both in the digital hardware and in the standard. Nobody designs space-grade consumer radios. You are basically limited to the bits you have and the error correction you have, no more. A really bad digital signal is unintelligible by definition: there is a sharp cutoff when you reach the limit of the code, as the sketch below shows.
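A minimal sketch of that cutoff, assuming an illustrative RS(255,223)-style block code that corrects up to t = 16 symbol errors per 255-symbol frame (the parameters are an assumption, not any particular broadcast standard): a frame with more than t corrupted symbols is lost entirely, so frame loss jumps from near zero to near one over a small change in channel quality.

```python
# Sketch of the digital "cliff effect": a block code that corrects up to
# t symbol errors per n-symbol frame either fixes a frame completely or
# loses it entirely. n=255, t=16 are illustrative RS(255,223)-like values.
from math import comb

def frame_loss_prob(n: int, t: int, p: float) -> float:
    """P(more than t of n symbols corrupted) at symbol error rate p."""
    ok = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))
    return 1.0 - ok

n, t = 255, 16
for p in (0.02, 0.04, 0.05, 0.06, 0.07, 0.08, 0.10):
    print(f"symbol error rate {p:.2f} -> frame loss {frame_loss_prob(n, t, p):.4f}")
```

Around p ≈ 0.02 essentially every frame survives; by p ≈ 0.10 almost every frame is lost. That narrow transition band is the cliff.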
However, an analog signal is, well... an analog of the original. It is continuous, so the error is continuous too: interference shows up as added noise and distortion, and what matters is the average signal-to-noise ratio rather than a hard threshold. The degradation is also analog, there is no sharp cutoff. So as long as the human brain can extract the information from the noise, analog continues to work.
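For contrast, a minimal sketch of the analog side (a 440 Hz tone plus Gaussian noise; the figures are illustrative, not from any measurement): each doubling of the interference amplitude costs about 6 dB of SNR, a smooth slide rather than a cliff.

```python
# Sketch of analog graceful degradation: output SNR falls smoothly and
# continuously with interference level; no threshold where it dies.
import math, random

random.seed(0)
signal = [math.sin(2 * math.pi * 440 * i / 8000) for i in range(8000)]
sig_power = sum(s * s for s in signal) / len(signal)

for noise_amp in (0.05, 0.1, 0.2, 0.4, 0.8, 1.6):
    noise = [random.gauss(0, noise_amp) for _ in signal]
    noise_power = sum(v * v for v in noise) / len(noise)
    snr_db = 10 * math.log10(sig_power / noise_power)
    print(f"noise amplitude {noise_amp:4.2f} -> SNR {snr_db:6.2f} dB")
```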
If you design the protocol well you can make a digital signal degrade gracefully too. Rather than sending a single stream, break it into low, medium, and high fidelity layers, where the higher fidelity layers carry the extra frequency detail and the low fidelity layer gets the most error correction. As reception worsens the receiver loses the enhancement layers first and falls back to the robust base layer; see the sketch below.
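A minimal sketch of that layered idea, assuming a toy scheme: the coarse base layer (high 4 bits of each sample) is sent three times and decoded by bitwise majority vote, while the fine enhancement layer (low 4 bits) is sent unprotected. Real systems use hierarchical modulation and proper FEC; this only illustrates the degradation behavior. The one-bit-flip channel model and the enh_lost cutoff are assumptions for the demo.

```python
import random

def encode(sample: int):
    """Split an 8-bit sample: high 4 bits = base (sent 3x), low 4 = enhancement."""
    return [sample >> 4] * 3, sample & 0x0F

def majority(copies):
    """Bitwise majority vote across the three base-layer copies."""
    out = 0
    for bit in range(4):
        if sum((c >> bit) & 1 for c in copies) >= 2:
            out |= 1 << bit
    return out

def channel(symbol: int, p: float) -> int:
    """Flip one random bit of a 4-bit symbol with probability p."""
    return symbol ^ (1 << random.randrange(4)) if random.random() < p else symbol

def decode(base_copies, enh, enh_lost: bool) -> int:
    fine = 8 if enh_lost else enh            # fall back to band midpoint
    return (majority(base_copies) << 4) | fine

random.seed(1)
sample = 203
for p in (0.0, 0.3, 0.6):                    # worsening channel
    base_tx, enh_tx = encode(sample)
    base_rx = [channel(b, p) for b in base_tx]
    enh_rx = channel(enh_tx, p)
    enh_lost = p >= 0.5                      # drop enhancement on a bad channel
    print(f"corruption {p:.1f}: got {decode(base_rx, enh_rx, enh_lost)} "
          f"(original {sample})")
```

On a clean channel you get the exact sample back; on a mildly noisy one the low bits get fuzzy; on a bad one the enhancement layer is gone but the majority-voted base layer usually still lands you within one coarse step of the original.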