But the flaw in your argument here is extrapolating this utility difference to its limit, as if something like a cookie could become exponentially more valuable to one person than to their peers. The value of insulin is exactly that: life and death. But it certainly doesn't equate to a million lives or deaths.


You are suffering a failure of creativity.

Suppose a psychedelic drug that is incredibly difficult to synthesize makes someone inhumanly good at problem solving, but only for a few minutes.

Further suppose one of the easiest ways to create this drug is to harvest human brains. They have to be extracted while the donor is still alive, though, as the delicate compounds coming from the brain become unusable rapidly after death.

According to utilitarianism, if that person uses their insane problem-solving abilities to increase utility for others, and they're good enough at it while on this drug, then we should throw the people who generate the least utility for others (e.g. the poor and disadvantaged, who usually have few resources to offer and little training or skill) into the brain ripper so that the remaining people can reap the gains.

This example may seem absurd, but many others like it can be constructed. Spend a few minutes trying and you'll come up with some. This one took me thirty seconds, if that.

See "The Ones Who Walk Away From Omelas" by Ursula K. Le Guin for the canonical illustration of this problem.


The reductio ad absurdum of utilitarianism was, of course, Eliezer Yudkowsky's "Torture vs Dust Specks"[1].

[1] https://www.lesswrong.com/posts/3wYTFWY3LKQCnAptN/torture-vs...


Not so absurd: would you prefer that old people die of Covid-19 (a few are tortured) or that public life be locked down (millions are mildly inconvenienced)?


Although the purpose of the lockdowns isn't to prevent old people from dying, per se. It's to buy time and prevent system collapse while the government builds out systems to track and quarantine individuals who may be infected.

The lockdown can be eased once we have a better option.

You could argue that there was plenty of time to build such systems before COVID-19 made it to your country, and you'd probably be right. You could also argue that maybe your government hasn't been on the ball with the second half of this story, and you might be right, but that's neither here nor there.


We should make the people who discovered the drug take it and use its effects to figure out a way to synthesize more of it without shredding living human brains. Then we can get to the business of other gains.

For the first run we should just ask for volunteers. No need to pre-optimize; maybe some of those poor and disadvantaged people will end up contributing more on NZT.


I was unclear: I meant to suggest that one specific individual gains these superpowers from the drug, while most people do not.

That said, trading hypotheticals back and forth to out-game the other side is a never-ending exercise. I see little value in it.


These reductios ad absurdum aren't very absurd, because of this clause: "if that person uses their insane problem-solving abilities to increase utility for others, and they're good enough at it". That's a big if, since the gains must offset the disutility of harvesting brains, but if that is the premise, the conclusion is unsurprising.


To many of us, it seems obvious that any moral system which can lead to these conclusions is not a complete moral system, in much the same sense that Newtonian physics is not a complete theory of physics.

I imagine that gulf of perception is what actually needs to be crossed somehow in the debate between hardcore utilitarians and those of us who are not.


> any moral system which can lead to these conclusions is not a complete moral system

But "these conclusions" are unobjectionable, because they include the premise "things are net better". Because "net better" is imprecisely defined, I can substitute as strong a condition as desired up to and including impossible conditions. Which means they can subsume non-utilitarian moralities (I believe non-utilitarians are just utilitarians with binary step functions for certain utilities).


I believe that "things are net better" is a profoundly wrong way to decide if something is moral, no matter how you define "things are net better."

Again, see "The Ones Who Walk Away From Omelas." I hope I would walk away, even at the price of slow starvation.

I believe I would, but it is hard to know what you will actually do in a truly awful situation before it arrives.


> I hope I would walk away, even at the price of slow starvation.

Because to you, Omelas is not net better. Which is perfectly understandable. But what basis do you have for rejecting Omelas, other than that you believe it to be worse?


My belief that there is an external standard of right and wrong that has little to do with utility, and that I know what that standard says about torturing people for your own gain.

It's not about utility; it's that it is wrong to hate and harm a few people. The "good" of the many does not have one iota of influence over the wrongness of that action.

To quote the book that defines so much of my moral code: "What does it profit a man if he gains the whole world but loses his soul?"

Side note: describing my worldview with language from yours does not mean yours subsumes mine.



