Is AI a greater existential risk to humanity than a pandemic?
Resolved Feb 5 as 18%
I will resolve this market to the current probability (MKT) after trading closes next week. For example, if it closes at 90%, YES bettors will get 90% of the pool and NO bettors will get 10% of the pool, distributed in proportion to each bettor's share of each pool.

AI existential risk is typically framed as an out-of-control AI intentionally or incidentally taking actions that kill all humans. Pandemic risk covers both engineered and naturally occurring pathogens, including viruses and bacteria, that could wipe out humanity.
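The payout rule above is simple enough to sketch in code. Below is a minimal Python illustration of splitting a pool at the closing probability; the function name, share dictionaries, and trader names are all hypothetical, and this is a sketch of the rule as described, not Manifold's actual implementation.

```python
# Minimal sketch of the MKT-resolution payout rule described above.
# Assumes a simple two-pool structure; not Manifold's actual code.

def resolve_to_mkt(pool, closing_prob, yes_shares, no_shares):
    """Split `pool` at the closing probability.

    YES bettors share closing_prob * pool and NO bettors share the
    remainder, each side pro rata by shares held. Assumes both sides
    hold at least one share.
    """
    yes_pot = pool * closing_prob
    no_pot = pool * (1 - closing_prob)
    total_yes = sum(yes_shares.values())
    total_no = sum(no_shares.values())
    payouts = {}
    for trader, shares in yes_shares.items():
        payouts[trader] = payouts.get(trader, 0.0) + yes_pot * shares / total_yes
    for trader, shares in no_shares.items():
        payouts[trader] = payouts.get(trader, 0.0) + no_pot * shares / total_no
    return payouts

# Example at this market's closing probability of 18%:
print(resolve_to_mkt(
    pool=100.0,
    closing_prob=0.18,
    yes_shares={"alice": 60, "bob": 40},  # alice holds 60% of YES shares
    no_shares={"carol": 100},             # carol holds all NO shares
))
# alice ≈ 10.8, bob ≈ 7.2, carol ≈ 82.0
```

Note that nothing in the payout depends on the underlying question, only on the closing probability, which is what makes this self-referential design debatable, as the comments below discuss.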
Let's see how this goes.
Looks like a great time to sell out of the market.
Oh, I see! I have a solution to that...
Oh yes! My initial NO bet was more a comment on the nature of the market than a statement about my beliefs on the actual question.
Doesn't existential risk mean pretty much the same as wiping out humanity?
But I do believe AI risk has a higher chance of wiping out all of humanity. (Another variant of this kind of question would be: "At the end of a week, will James believe that AI is a greater existential risk than a pandemic?" Traders could then post arguments to persuade James one way or the other.)
A market that always resolves to MKT seems particularly prone to market manipulation: a trader can place a large bet just before close to push the final probability toward their own position, and that closing probability then determines everyone's payout.