
If Eliezer believes there is at least a 75% chance of an AI-caused existential catastrophe occurring within the next 50 years, this market resolves YES.
I'll resolve the market based on his public statements in the few months before and after resolution. Eliezer doesn't like putting explicit probabilities on this, so I'll attempt to infer his beliefs from his more subjective statements.
Resolves N/A in the event that Eliezer is no longer alive or conscious, or AI doom has already occurred.
Update 2025-05-17 (PST) (AI summary of creator comment): The creator clarifies that any stated probability of doom conditional on ASI (Artificial Superintelligence) from Eliezer Yudkowsky will be considered distinct from his general, unconditional probability of doom (p(doom)). The market's resolution is based on an inference of this general p(doom), which the creator notes would be lower than a stated conditional probability like p(doom|ASI).
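To illustrate the distinction with purely hypothetical numbers (not figures Eliezer has stated): by the law of total probability, p(doom) = p(doom|ASI) × p(ASI within 50 years) + p(doom|no ASI) × (1 − p(ASI within 50 years)). If, say, p(doom|ASI) were 95%, p(ASI within 50 years) were 70%, and p(doom|no ASI) were 5%, the unconditional p(doom) would be about 0.95 × 0.70 + 0.05 × 0.30 ≈ 68%, which would fall below this market's 75% threshold even though the conditional figure is far above it.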