By default, continued AI development is assumed to lead to existential catastrophe. If we survive until 2035, this market asks: what kind of world are we in?
This is a retrospective, conditional question: imagine it is January 1st, 2035, and humanity is still alive. What explains our survival?
Resolution Criteria:
On December 30th, 2035, this market will resolve to the option that best describes the primary reason humanity avoided extinction by that date. Resolution will be based on public evidence and expert consensus to the extent possible.
If the world is unambiguously dead (e.g. a misaligned ASI kills everyone), the market resolves N/A.
Options:
Effective international moratorium
A moratorium on large training runs or capabilities research (e.g. via international agreement with enforcement mechanisms) was implemented and appears to be working. Further AI development is sharply curtailed worldwide. This includes regulatory pauses that halt frontier model development indefinitely.
AI winter
No moratorium was implemented, but AI capabilities hit technical or economic limits. Hype collapsed. Progress slowed substantially or reversed. There is no live threat of ASI.
Benign superintelligence
We built superintelligence, and it turned out fine. Humanity is still here, possibly transformed or uplifted, but not extinct. There may have been risks, but they were successfully navigated.
Other
Survival occurred for a reason not captured by the above options. This could include:
Major global catastrophes (e.g. nuclear war, collapse of civilization) that indirectly halted AI progress.
A stable coordination equilibrium not meeting the threshold for a "moratorium."
Successful containment of dangerous models by other means.
ASI exists but is boxed or constrained without becoming benign.
N/A: Humanity did not survive
If humanity is unambiguously extinct or effectively extinct (e.g. via a misaligned ASI or other terminal outcome), this market resolves N/A.
Notes:
Ambiguous or mixed worlds will be judged based on the most salient proximal cause of our survival.
If multiple causes apply, the one that seems most decisive in preventing extinction will be chosen.
If the question proves unresolvable due to lack of data or civilizational collapse, it will be resolved N/A or judged by the Manifold team.
I welcome suggestions for other options to add.
Update 2025-07-02 (PST) (AI summary of creator comment): The creator has clarified that if humanity is still alive and there is no significant political restraint on AI development, the market will resolve to AI winter. This is true even if ASI is still considered a live prospect.