Is AI a greater existential risk to humanity than a pandemic?
Resolved Feb 5 as 18%
I will resolve this market to the current probability (MKT) after trading closes next week. For example, if it closes at 90%, YES bettors will get 90% of the pool and NO bettors will get 10% of the pool, each distributed in proportion to each bettor's share of that side's pool. AI existential risk is typically framed as an out-of-control AI intentionally or incidentally taking actions that kill all humans. Pandemic risk includes both engineered and naturally occurring pathogens, including viruses and bacteria, that could wipe out humanity.

🏅 Top traders

#   Name   Total profit
1          Ṁ148
2          Ṁ132
bought Ṁ500 of YES
Let's see how this goes.
sold Ṁ0 of YES
Looks like a great time to sell out of the market.
bought Ṁ1 of YES
Oh, I see! I have a solution to that....
bought Ṁ1 of YES
Oh yes! My initial NO bet was more a comment on the nature of the market than a statement about my beliefs on the actual question.
bought Ṁ1 of YES
Doesn't existential risk mean pretty much the same as wiping out humanity?
sold Ṁ9 of NO
But I do believe AI risk has a higher chance of wiping out all of humanity. (Another variant of this kind of question would be: "At the end of a week, will James believe that AI is a greater existential risk than a pandemic?", and then traders can post arguments to persuade James one way or the other.)
bought Ṁ10 of NO
A market that always resolves to MKT seems particularly prone to market manipulation.