At the beginning of 2035, will Eliezer Yudkowsky still believe that AI doom is coming soon with high probability?
54% chance

If Eliezer believes there is at least a 75% chance of an AI existential catastrophe occurring within the next 50 years, this resolves YES.

I'll resolve the market based on his public statements in the preceding and following few months. Eliezer doesn't like putting explicit probabilities on this, so I'll attempt to infer his beliefs from his more subjective statements.

Resolves N/A in the event that Eliezer is no longer alive/conscious or AI doom has already occurred.

  • Update 2025-05-17 (PST) (AI summary of creator comment): The creator clarifies that any stated probability of doom conditional on ASI (Artificial Superintelligence) from Eliezer Yudkowsky will be considered distinct from his general, unconditional probability of doom (p(doom)). The market's resolution is based on an inference of this general p(doom), which the creator notes would be lower than a stated conditional probability like p(doom|ASI).
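The distinction in the update above follows from the law of total probability: an unconditional p(doom) mixes the conditional p(doom|ASI) with the (presumably lower) probability of doom if ASI never arrives, weighted by how likely ASI is. A minimal sketch, using made-up illustrative numbers that are not attributed to Eliezer:

```python
# Illustration of why an unconditional p(doom) is lower than a
# conditional p(doom | ASI), via the law of total probability.
# All probabilities below are hypothetical, chosen only for illustration.

def unconditional_p_doom(p_asi, p_doom_given_asi, p_doom_given_no_asi):
    """p(doom) = p(doom|ASI) * p(ASI) + p(doom|no ASI) * (1 - p(ASI))."""
    return p_doom_given_asi * p_asi + p_doom_given_no_asi * (1 - p_asi)

# Even a 95% conditional p(doom|ASI) gives a lower unconditional
# p(doom) once p(ASI) is below 1.
p = unconditional_p_doom(p_asi=0.8, p_doom_given_asi=0.95, p_doom_given_no_asi=0.05)
print(round(p, 2))  # 0.77
```

With these hypothetical inputs, a stated conditional probability of 0.95 corresponds to an unconditional p(doom) of 0.77, which is why the market resolves on the inferred unconditional figure rather than a conditional one.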
