Is it possible for someone to have the moral imperative to die for the sake of saving at least one other life?
Suppose there is a standard, non-evil person A with an average amount of moral weight assigned to their life-years and an average number of years left. If there were a hypothetical scenario in which A ending their life could save someone else, or multiple people, would A have a moral imperative to end their life? Or is it impossible for someone to have a moral imperative to do that?
Resolution Criteria
This market resolves based on a poll vote by the Manifold community. Traders will vote on whether, in their judgment, a standard person with average moral weight and average remaining lifespan could have a moral imperative to end their life to save others. The resolution will reflect the community consensus on this philosophical question.
Background
This question engages with classical utilitarian ethics and the doctrine of supererogation—the distinction between what morality requires and what goes "above and beyond" moral duty. Utilitarians argue that if an action produces the greatest good for the greatest number, it may be morally required. However, most moral frameworks treat extremely costly personal sacrifices (particularly of one's own life) as supererogatory acts—praiseworthy but not obligatory. The question specifically brackets out considerations of evil actors and assumes a standard person, focusing on whether the calculus of lives saved could ever create a binding moral obligation rather than merely a noble option.
Considerations
The framing includes threshold variations (saving 1, 10, 100, or 1000+ people), which traders should evaluate separately. The question assumes a hypothetical scenario rather than real-world constraints, so traders should consider the philosophical principle rather than practical feasibility. Different moral frameworks—consequentialist, deontological, virtue-ethical, and others—will yield different answers, so traders voting should clarify which ethical framework they are applying, or vote based on their own considered judgment.
This description was generated by AI.
@skibidist I agree that there's more of a moral imperative to save the lives of your family than the lives of strangers, but I couldn't justify taking the life of 1/8 of the total population in order to save one or two family members.
@skibidist That's not very smart, because if you decide it's okay to take a billion lives to save your family, then it's far more likely that you and your family will be among the billion being sacrificed than the other way around.