Which x-risks will kill half the population first?
Artificial Superintelligence: 51%
Biotechnology: 15%
Nuclear Weapons: 15%
Climate Change: 10%
Aliens: 2%
Asteroid Impacting Earth: 1.2%
Something Else: 6%

Given that existential risks are a problem, which one will happen first? After all, it's hard to solve climate change if everybody is dead from nukes.


How does this resolve if everyone running and using Manifold, including me and you, dear reader, is dead?

How does this resolve if ASI causes another x risk?

@Nikola If the risk occurred because of ASI, then it resolves as ASI.