Does there exist a serious argument that extinction risk from AI is <1% over the next 50 years?
Resolved NO (Feb 5)

For an argument to qualify and resolve this market YES, I don't have to agree with it; it just needs to be a genuine attempt to argue in favor of that conclusion, without basic logical errors. A non-exhaustive list of such errors:

  • Assuming the probability you want without justification, e.g. "if we don't know a probability, it must be 0" or "obviously the probability is negligible".

  • Magical or non-physicalist arguments, e.g. "computers are incapable of intelligence" or "God would prevent that from occurring".

  • Disproving one specific attack vector and treating that as disproving all risk: "[20 pages explaining how an AI could not possibly use stuffed animals to exterminate humanity], and therefore AI isn't a risk!"

If I'm not aware of any such argument by market close, this resolves NO.

I won't bet.
