> then even small probabilities of catastrophic events wiping out humanity yield enormous negative expected value. Therefore, nothing can produce greater positive expected value than preventing existential risks—so working to reduce these risks becomes the highest priority.
This is the logic of someone who has failed to comprehend the core ideas of Calculus 101. You cannot rely on intuition when dealing with infinite sums of terms that carry extremely large uncertainties: the series may not even converge, and its value can hinge entirely on terms you have no way to estimate. All that results is making a fool out of yourself.
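A minimal sketch of the failure mode, using the classic St. Petersburg game as a stand-in for a diverging expected-value series (the function name and sample sizes below are illustrative only): the running sample mean never settles, because the true expectation is infinite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Classic St. Petersburg game: flip a fair coin until the first tails;
# if it takes k flips, the payoff is 2**k. The expected-value series is
# sum over k of (1/2)**k * 2**k = 1 + 1 + 1 + ..., which diverges.
def sample_payoffs(n_samples: int) -> np.ndarray:
    flips_until_tails = rng.geometric(p=0.5, size=n_samples)  # k >= 1
    return 2.0 ** flips_until_tails

payoffs = sample_payoffs(1_000_000)
running_mean = np.cumsum(payoffs) / np.arange(1, payoffs.size + 1)

# The running sample mean never stabilizes: it is dominated by whichever
# rare, enormous payoff happened to land in the sample so far.
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(f"n={n:>7}: sample mean so far ~ {running_mean[n - 1]:.1f}")
```

The same pathology appears whenever the terms being summed grow as fast as their probabilities shrink; "astronomical value times tiny, highly uncertain probability" is exactly that regime.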
They use technical terms (e.g., expected value, KL divergence) in their verbal reasoning only to sound rational, but never intend to use those terms in their technical sense.