Is AI a greater existential risk to humanity than a pandemic?
Resolved Feb 5 as 18%
I will resolve this market to the current probability (MKT) after trading closes next week.
For example, if the market closes at 90%, YES bettors receive 90% of the pool and NO bettors receive 10%, with each side's share distributed in proportion to each bettor's holdings in that pool.
AI existential risk is typically framed as an out-of-control AI intentionally or incidentally taking actions that kill all humans.
Pandemic risk includes both engineered and naturally occurring pathogens including viruses and bacteria that could wipe out humanity.
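To make the resolution rule concrete, here is a minimal sketch of how a "resolve to MKT" payout could be computed. This is an illustrative example only, not Manifold's actual implementation; the function name, bettor names, and share counts are hypothetical.

```python
# Minimal sketch (hypothetical, not Manifold's code) of resolving a market
# to its closing probability: the pool is split between the YES and NO sides
# by that probability, then each side's slice is divided among bettors in
# proportion to the shares they hold.

def mkt_payouts(final_prob, yes_shares, no_shares, pool):
    """Split `pool` at probability `final_prob` (e.g. 0.18).

    yes_shares / no_shares: dict mapping bettor -> shares held on that side.
    Returns dict mapping bettor -> payout.
    """
    yes_pool = pool * final_prob          # e.g. 18% of the pool to YES holders
    no_pool = pool * (1 - final_prob)     # remaining 82% to NO holders

    payouts = {}
    total_yes = sum(yes_shares.values()) or 1
    total_no = sum(no_shares.values()) or 1
    for bettor, shares in yes_shares.items():
        payouts[bettor] = payouts.get(bettor, 0) + yes_pool * shares / total_yes
    for bettor, shares in no_shares.items():
        payouts[bettor] = payouts.get(bettor, 0) + no_pool * shares / total_no
    return payouts

# Example: a 2,000-mana pool resolved at 18%
print(mkt_payouts(0.18, {"alice": 100}, {"bob": 300, "carol": 100}, 2000))
```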
🏅 Top traders

# | Name | Total profit
---|---|---
1 | | Ṁ148
2 | | Ṁ132
But I do believe AI risk has a higher percent chance of wiping out all of humanity.
(Another variant of this kind of question would be: "At the end of a week, will James believe that AI is a greater existential risk than a pandemic?", and then traders can post arguments to persuade James one way or the other)
Related questions
Will AI wipe out humanity before the year 2030?
7% chance
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
50% chance
Will AI wipe out humanity before the year 2026?
2% chance
Will existential risk from AI be a topic during the 2024 US presidential debates?
14% chance
Will AI wipe out humanity before the year 2025?
1% chance
IF an existential crisis is caused as a result of AI misalignment, THEN will it be from an AI uprising? (Yes, really)
50% chance
Will AI cause an existential catastrophe (Bostrom or Ord definition) which doesn't result in human extinction?
25% chance
Will AI wipe out humanity before the year 2029?
2% chance
When (if ever) will AI cause human extinction?