Is AI a greater existential risk to humanity than a pandemic?
Ṁ2014 · Resolved Feb 5 as 18%
I will resolve this market to the current probability (MKT) after trading closes next week.
For example, if it ends at 90%, YES bettors will get 90% of the pool and NO bettors will get 10% of the pool, distributed in proportion to each bettor's share of each side's pool.
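The payout rule above can be sketched in a few lines. This is a minimal illustration of the simplified pool-split described in the text, not Manifold's actual market mechanics (the real system uses an automated market maker, so the accounting is more involved); the function name and inputs are invented for this example.

```python
def payouts(p, yes_shares, no_shares, pool):
    """Split a pool at a MKT resolution.

    p           -- resolution probability (e.g. 0.9 for 90%)
    yes_shares  -- dict of YES bettor -> share count
    no_shares   -- dict of NO bettor -> share count
    pool        -- total mana in the pool
    """
    yes_pool = p * pool          # YES side gets p of the pool
    no_pool = (1 - p) * pool     # NO side gets the remainder
    total_yes = sum(yes_shares.values()) or 1
    total_no = sum(no_shares.values()) or 1
    out = {}
    # Each bettor is paid in proportion to their share of their side's pool.
    for bettor, s in yes_shares.items():
        out[bettor] = out.get(bettor, 0) + yes_pool * s / total_yes
    for bettor, s in no_shares.items():
        out[bettor] = out.get(bettor, 0) + no_pool * s / total_no
    return out

# The 90% example from the description, with a 100-mana pool:
# YES holders split 90 mana, the NO holder gets the remaining 10.
result = payouts(0.9, {"alice": 60, "bob": 40}, {"carol": 100}, 100)
print(result)  # alice: 54.0, bob: 36.0, carol: 10.0
```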
AI existential risk is typically framed as an out-of-control AI intentionally or incidentally taking actions that kill all humans.
Pandemic risk includes both engineered and naturally occurring pathogens, including viruses and bacteria, that could wipe out humanity.
This question is managed and resolved by Manifold.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ148
2 | | Ṁ132
People are also trading
Are AI and its effects the most important existential risk, given only public information available in 2021?
89% chance
Will AI cause an existential catastrophe (Bostrom or Ord definition) which doesn't result in human extinction?
25% chance
OpenAI CEO doesn't think existential risk from AI is a serious concern in Jan 2026
27% chance
Public opinion, late 2025: Out-of-control AI becoming a threat to humanity, a real threat?
Will AI Cause a Deadly Catastrophe?
30% chance
Will humanity wipe out AI?
14% chance
Will humanity wipe out AI x-risk before 2030?
10% chance
When (if ever) will AI cause human extinction?
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
56% chance
Will AI wipe out humanity before the year 2030?
3% chance
But I do believe AI risk has a higher percent chance of wiping out all of humanity.
(Another variant of this kind of question would be: "At the end of a week, will James believe that AI is a greater existential risk than a pandemic?", and then traders can post arguments to persuade James one way or the other)