Is 10^20 people getting dust specks in their eyes preferable to 50 years of one person suffering from being burnt alive?
Resolved NO (Jan 3)

Based on this post: https://www.lesswrong.com/posts/4ZzefKQwAtMo5yp99/circular-altruism

The number 10^20 came from the number of seconds in 50 years (which is about 10^9) times the difference in pain intensity (which I pulled completely out of my ass)
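The arithmetic above can be sanity-checked with a few lines of Python. This is just a sketch of the reasoning in the description: the "intensity ratio" here is not principled, it's simply whatever number makes 10^9 seconds scale up to 10^20.

```python
# Seconds in 50 years: the description rounds this to ~10^9.
seconds_in_50_years = 50 * 365.25 * 24 * 60 * 60
print(f"seconds in 50 years ≈ {seconds_in_50_years:.2e}")  # ~1.58e9

# The pain-intensity ratio implied by the 10^20 figure
# (reverse-engineered, not derived from anything).
implied_intensity_ratio = 1e20 / seconds_in_50_years
print(f"implied intensity ratio ≈ {implied_intensity_ratio:.1e}")  # ~6e10
```

So a dust speck is implicitly being treated as roughly 10^10–10^11 times less bad than one second of being burnt alive.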

This will resolve YES if the market probability is above 50% at close, and NO otherwise


bought Ṁ0 of NO

resolves NO...

predicted NO

@TheBayesian but for what it's worth, I think NO is right: 10^20 people experiencing a slight inconvenience is much, much worse than one person being burnt alive for 50 years

predicted YES

@TheBayesian You volunteering?

predicted NO

@Auracle noooo!

predicted YES

@TheBayesian So you’re saying other people should suffer for you?

predicted NO

@Auracle No..? I don't think anyone should suffer

predicted NO

I don't think we should run this thought experiment at all. I think the YES position is largely a failure to multiply. If it were me, I would rather avoid the worst suffering than the least, but I would pick a 1/10^20 chance of torture over a dust speck in the eye. If every single human on earth got a dust speck in the eye, that would be a sad moment. 10^20 is about 10 billion times humanity getting a dust speck in the eye. It is a tragedy beyond all imagining.

bought Ṁ690 of YES

It might be that using a linear model for this kind of problem just does not work, and you need to model things like torture with something like transfinite numbers; multiplication alone just won't get you there

@bashmaester If you invent (or find) such a theory, please let me know.

But I would suspect (and even bet) that since all human experiences are finite, we actually don't need anything transfinite.

Would I rather experience having a dust speck in my eye as 10^20 people, or being burnt alive for 50 years? The answer is clear.

But what about 10^30? Or is it a qualitative difference?

Or what about 1 second of suffering vs 10^10 people getting dust specks?

All these were meant to be genuine questions.

bought Ṁ20 of NO

@lambdasaturn Not sure there is a threshold. The only thing holding my intuition back is having to wait around for basically an eternity to experience the dust specks, which isn't an issue in the scenario itself since they happen to different people.

@Mvem I highly recommend reading the linked LW post. To contrast why I think this is the wrong approach, consider this dilemma: one innocent person locked in prison for 50 years vs 10^15 innocent people each locked in prison for 1 second. Does your intuition still hold? Consider that in the second case the total prison time is about 10^5 times more.

@Mvem And btw, why did you buy NO shares then?

sold Ṁ19 of NO

@lambdasaturn Not reading the question carefully enough lol. And yeah it definitely still holds as I don't think the badness of prison time scales linearly across different people.

Is this Q inspired by Ursula K. Le Guin’s The Ones Who Walk Away from Omelas?

@lambdasaturn I had no bet in this very interesting market. I have seen no discussion of hope and despair. All the momentary one-off inconvenience to all of mankind will, by my reckoning, never equate to the despair and lost hope of one individual over their lifetime. I recommend the short sci-fi story by Le Guin — really on point here!

This market's resolution criteria make it a Keynesian beauty contest; its resolution isn't necessarily going to reflect the "correct" moral position, or even what people believe.

@a do you know how to construct a question so that it represents what people believe?

@lambdasaturn Lots of people have tried to figure out how to do this — usually with polls or randomness — but it's a tricky problem! Some methods are listed in this market's description.

bought Ṁ100 of NO

@a I disagree, since 10^20 people don't all exist at once. Therefore, the moment at which one gets a dust speck in their eye could not happen at the same time for everyone. Imagine driving when getting a speck in the eye: the odds are slim, but it could lead to an accident and loss of life. Actually, 10^20 occurrences with random fluctuations and low odds of accidents is still going to be a pretty high death count. Vs one person's terrible suffering, this is an actuarial trolley-car problem if you consider it in this way.

@rogerplanck When creating this question, I intended just the suffering itself, without knock-on effects. And I assume EY also meant the same in the original essay.

bought Ṁ15 of NO

I'm not sure we know enough about experiences to directly compare these two. I can't remember how many times I've gotten a speck of dust in my eye, and I expect on the margin it might matter literally not at all. In that case, this isn't a provocative example of people's intuitions failing to scale to large values, but something meaningfully untrue. Utilitarianism works in many cases, and I would say I'm a small-u utilitarian, but I'm not sure adding up the ghosts of departed qualia actually gets you to unimaginable suffering.

But it doesn't have a solution without a concrete situation
https://www.lesswrong.com/posts/dpMZHpA59xFFjCqBp/the-value-of-a-life
