What's the seductive-but-wrong part of EA? As far as I can tell, the vast majority of the opposition to it boils down to "Maybe you shouldn't donate money to pet shelters when people are dying of preventable diseases" vs "But donating money to pet shelters feels better!"
That hasn't defined EA for at least 10 years or so. EA started with "you should donate to NGOs distributing mosquito nets instead of the local pet shelter".
It then moved into "you should work a soulless investment banking job so you can give more".
More recently it was "you should excise all expensive fun things from your life, and give 100% of your disposable income to a weird poly sex cult and/or their fraudulent paper hedge fund because they're smarter than you."
This is what some EAs believe; I don't think there was ever a broad consensus on those latter claims. As such, it doesn't seem like a fair criticism of EA.
You can't paint a whole movement with a broad brush, but it's true of the leadership of the main EA organizations. Once they went all-in on "AI x-risk", there ceased to be a meaningful difference between them and the fringe of the LW ratsphere.
That same link puts AI risk under the "far future" category, basically the same category as "threats to global food security" and asteroid impact risks. What's unreasonable about that?
GiveWell is an example of the short-termist end of EA. At the long-termist end, people pay their friends to fantasize about Skynet at 'independent research institutes' like MIRI and Apollo Research. At the "trendy way to get rich people to donate" end, you get buying a retreat center in Berkeley, a stately home in England, and a castle in Czechia so Effective Altruists can relax and network.
It's important to know which type of EA organization you are supporting before you donate, because the movement includes all three.
I assume that GiveWell is the most popular of them. I mean, if you donate to MIRI, it is because you know about MIRI and because you specifically believe in their cause. But if you are just "hey, I have some money I want to donate, show me a list of effective charities", then GiveWell is that list.
(And I assume that GiveWell top charities receive orders of magnitude more money, but I haven't actually checked the numbers.)
Even GiveWell partnered with the long-termist/hypothetical risk type of EA by funding something called Open Philanthropy. And there are EA organizations which talk about "animal welfare" and mean "what if we replaced the biosphere with something where nothing with a spinal cord ever gets eaten?" So you can't trust "if it calls itself EA, it must be highly efficient at turning donations into measurable good." EA orgs have literally hired personal assistants and bought stately homes for the use of the people running the orgs!
It's not all bad, some conclusions are ok. But it also has ideas like "don't donate to charity, it's better to invest that money in, like, an oil fund, and grow it 1000x, and then you can donate so much more! Bill Gates has done much more for humanity than some Red Cross doctor!". Which is basically just a way to make yourself feel good about becoming richer, much like "prosperity gospel" would for the religious.
It might be the parts that lead a person to commit large scale fraud with the idea that the good they can do with the stolen money outweighs all the negatives. Or, at least, that’s the popular idea of what happened to Sam Bankman-Fried. I have no idea what was actually going through that man’s mind.
In any case, EA smells strongly of “the ends justify the means” which most popular moral philosophies reject with strong arguments. One which resonates with me is that there are no “ends.” The path itself is the goal.
> "the ends justify the means" which most popular moral philosophies reject with strong arguments.
This is a false statement. Our entire modern world is built on the basis of the ends justifying the means: every time money is spent on long-term infrastructure instead of giving poor kids food right now, every time a war is fought, every time a doctor triages injuries at a disaster.
I don't think it's useful to conflate "the ends justify the means" with "cost-benefit analysis". You sometimes use the latter to justify certain means, but you don't have to; that's why they're different. When you believe that the ends justify the means, you can also just give no consideration at all to the ethics of the means. No doctor triaging patients would ever shoot a patient in the head so they could move on to one they thought was more important. Yes, they might let patients die, but that's different from actively killing them.
I've noticed this with a lot of radical streamers on both sides. They don't care about principles, they care about "winning" by any means necessary.
Winning at things that align with your principles is itself a principle. If you don't care about principles, you don't care about what you're winning at, thereby making every victory hollow and meaningless. That is how you turn into a loser at everything you do.
Or something like: focusing on a high-paying job and donating the money does absolutely nothing to solve any root or structural problems, which defeats the supposed purpose of caring about future people, or of being an effective altruist in the first place.
Of course the way your comment is written makes criticism sound silly.
Pet shelters are just the pretty facade. The same ideology would have you step over homeless elderly ladies and buy mosquito nets instead, because the elderly lady cannot have children, so mosquito nets maximize the number of future humans more efficiently.