> they're discounting the people around them heavily compared to other people

This statement of yours makes no sense.

EAs, by definition, are attempting to remove the innate bias that discounts people far away, by holding instead that all lives are of equal worth.

>turning them into a math problem that assigns no special responsibility to the people around you

"All lives are equal" isn't a math problem. "Fuck it, blow up the foreigners to keep oil prices low" is a math problem; it is a calculus the US government has spent decades performing. (One that assigns zero value to lives outside the US.)

If $100 can save 1 life 10 blocks away from me or 5 lives in the next town over, what kind of asshole chooses to let 5 people die instead of 1?

And since air travel is a thing, what the hell does "close to us" mean?

For that matter, from a purely selfish POV, helping lift other nations up to become fully advanced economies is hugely beneficial to me, and everyone on earth, in the long run. I'm damn thankful for all the aid my country gave to South Korea; the scientific advances that have come out of SK have damn well paid back any tax dollars my grandparents put in many times over.

> It can be hard to say exactly why in words, but that doesn't make it less true.

This is the part where I shout racism.

Because history has shown it isn't about whether people are far away or close by, but about how those people look.

Americans have shot down multiple social benefit programs because, in the words of the people who voted against them, "white people don't want black people getting the same help white people get."

Whites in America have voted, repeatedly, to keep themselves poor rather than lift themselves and black families out of poverty at the same time.

Of course Americans think helping people in Africa is "weird".

> If $100 can save 1 life 10 blocks away from me or 5 lives in the next town over, what kind of asshole chooses to let 5 people die instead of 1?

The thing about strict-utilitarian-morality is that it can't comprehend any other kind of morality, because it evaluates the morality of... moralities... on its own utilitarian basis. And then of course it wins over the others: it's evaluating them using itself!

There are entirely different ethical systems that are not utilitarian, which (it seems) most people hold and innately use (the "personal morality" I'm talking about in my earlier post). They are hard to comprehend rationally, but that doesn't make them less real. Strict-utilitarianism seems "correct" in a way that personal morality does not because you are working from the premise that only things you can understand like math problems can be true. But what I observe in the world is that people's fear of the rationalist/EA mindset comes from the fact that they empirically find this way of thinking to be insidious. Their morality specifically disagrees with that way of thinking: to them, truth does not come from scrutable math problems, and that is not the point of moral action.

The EA philosophy may be put as "well sure but you could change to the math-problem version, it's better". But what I observe is that people largely don't want to. There is a purpose to their choice of moral framework; it's not that they're looking at them all in a vacuum and picking the most mathematically sound one. They have an intrinsic need to keep the people around them safe and they're picking the one that does that best. EA on the other hand is great if everyone around you is safe and you have lots of extra spending money and what you're maximizing for is the feeling of being a good person. But it is not the only way to conceive of moral action, and if you think it is, you're too inside of it to see out.

I'll reiterate that I am trying to describe what I see happening when people resist and protest rationalism (and why their complaints "miss" slightly: IMO they don't have the language to talk about this stuff, but they are still afraid of it). I'm largely sympathetic to EA, but I think it misses important things that are crippling it, of the variety above: an inability to recognize other people's moralities and needs and fears doesn't make them go away; it just makes them hate you.


> The thing about strict-utilitarian-morality is that it can't comprehend any other kind of morality, because it evaluates the morality of... moralities... on its own utilitarian basis.

I can comprehend them just fine, but I have a deep-seated objection to any system of morality that leaves behind giant piles of dead bodies. We should be trying to minimize the size of the pile of dead bodies (and ideally eliminate the pile altogether!).

Any system of morality that boils down to "I don't care about that pile of dead bodies being huge because those people look different" is in fact not a system of morality at all.


Well, you won't find anyone who disagrees with you here. No such morality is being discussed.

The job of a system of morality is to synthesize all the things we want to happen, or want to prevent from happening, into a way of making decisions. One such thing is piles of dead bodies. Another is a person's natural moral instincts, like the need to take care of their family, or the felt responsibility to invest time and energy into improving their future or their community, repairing injustice, or helping people who need help, or attending to their needs for art and meaning and fun and love and respect. A coherent moral system synthesizes all of these and figures out how much priority to allocate to each in a way that is reasonable and productive.

Any system of morality that takes one of these criteria and discards the rest is not a system of morality at all, in the very literal sense that nobody will follow it. Most people won't sell out one of their moral impulses for the others, and EA/rationalism feels like it asks them to, since it asks them to place zero value on a lot of things they inherently place moral value in, and so they find it creepy and weird. (It doesn't ask that explicitly; it asks it by omission. By never considering any other morality, and being incapable of considering them because they are not easily quantifiable or made logical, it asks you to accept a framework that sets you up to ignore most of your needs.)

My angle here is that I'm trying to describe what I believe is already happening. I'm not advocating it; it's already there, like a law of physics.
