What will be the P(doom) of these individuals when Manifold thinks ASI is <1y away?
Eliezer Yudkowsky: 95%
Scott Alexander: 63%
Nick Bostrom: 57%
Toby Ord: 55%
Ilya Sutskever: 50%
Dario Amodei: 41%
Yoshua Bengio: 41%
Geoffrey Hinton: 41%
Sam Altman: 24%
Yann LeCun: 20%

Artificial superintelligence (ASI) here means any artificial intelligence able to carry out any cognitive task better than 100% of the unenhanced biological human population.

P(doom) here means the probability of humanity being wiped out by misaligned ASI.

Ideally, each individual will have publicly expressed their P(doom) within the past year, either directly or indirectly (e.g. “I think it's practically guaranteed we're all gonna die” = 99%, “I think it's a tossup whether we'll survive” = 50%, “there's a small but significant risk it'll kill us” = 10%, “the risks are negligible” = 1%, etc.); alternatively, they may be contacted and asked for their P(doom) as defined above.

If it is impossible to get a P(doom) (e.g. they are dead, or refuse to give their opinion), then their option may resolve n/a. 

When Manifold thinks ASI is <1y away here means the earliest point in time at which there is a Manifold market asking whether ASI will be created before a deadline less than one year away, with a definition of ASI at least as strict as the one in this market, 50 or more traders, and odds that have remained above 50% for the majority of the past month.
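For concreteness, here is a minimal sketch of the trigger condition above, assuming hypothetical inputs (a close date, a trader count, and a series of daily closing probabilities); the field names are illustrative and not Manifold's actual API schema:

```python
from datetime import date, timedelta

def trigger_met(close_date: date, num_traders: int,
                daily_probs: list[float], today: date) -> bool:
    """True if a qualifying ASI market meets the trigger described above."""
    deadline_under_a_year = close_date < today + timedelta(days=365)
    enough_traders = num_traders >= 50
    # "Majority of the past month": more than half of the last ~30 daily
    # closing probabilities are above 50%.
    last_month = daily_probs[-30:]
    mostly_above_half = sum(p > 0.50 for p in last_month) > len(last_month) / 2
    return deadline_under_a_year and enough_traders and mostly_above_half

# Example with made-up numbers: 25 of the last 30 days above 50%.
probs = [0.48] * 5 + [0.62] * 25
print(trigger_met(date(2027, 1, 1), 120, probs, date(2026, 6, 1)))  # True
```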
