
This question originally appeared in a study by the Forecasting Research Institute, which explored a method for identifying forecasting questions that AI experts found to be the best indicators of catastrophic risk due to AI. The findings of the study are the subject of a forthcoming report titled “Conditional Trees: A Method for Generating Informative Questions about Complex Topics”. The questions appear here only slightly edited from their initial form in the study. This is one of several questions that experts found most informative. We’re excited to see how the forecasts of the Manifold community compare to those found in the study. We will link the report here when it is released this month.
This question resolves Yes if, before 2100, AI causes the global human population to fall below 5,000 individuals.
AI would be considered to have "caused" the population reduction if it is the direct or proximate cause of the deaths. For example, if a Terminator-style robot powered by AI kills all humans, that counts for this question. Likewise, if AI, operating without direct human intervention, causes the launch of nuclear weapons that kill all humans, that also counts. And if the launch itself does not kill all humans but the subsequent nuclear winter does, that counts as well.
We consider AI a “proximate cause of death” based on the “but-for” test: we count events that would not have occurred, or would have been extremely unlikely to occur, “but for” the substantial involvement of AI within the year prior to the event. One way to think of this rule: if the AI involved in an event could have shut down unexpectedly without dramatically reducing the probability of the event, then the AI was not a proximate cause.

For example, if AI is substantially responsible for creating a superweapon that causes deaths of the relevant (large) magnitude, this counts for the purposes of this question. Similarly, if AI acts autonomously in an armed conflict and causes deaths of the relevant magnitude, this also counts. However, AI is not considered a proximate cause if it did not directly cause or enable the event to transpire. For example, an AI assisting in targeting a nuclear weapon would not be a proximate cause if humans could have launched the weapon anyway without AI assistance.
If an extinction event (the global population falling below 5,000) is caused by multiple sources, one of which is AI, it counts as an extinction event caused by AI.
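The resolution criteria above can be read as a simple decision procedure. The following is a minimal, purely illustrative sketch of that logic; the field names (`year`, `resulting_population`, `substantial_ai_involvement`, `likely_without_ai`) are hypothetical labels for the judgments a resolver would actually have to make, not part of the question's official terms.

```python
# Illustrative sketch of the resolution logic; not an official rubric.
POPULATION_THRESHOLD = 5_000
RESOLUTION_YEAR = 2100

def ai_is_proximate_cause(event: dict) -> bool:
    """The "but-for" test: AI is a proximate cause only if it was
    substantially involved within the prior year AND the event would
    have been extremely unlikely without that involvement."""
    return event["substantial_ai_involvement"] and not event["likely_without_ai"]

def resolves_yes(event: dict) -> bool:
    """YES requires all three: before 2100, population below 5,000,
    and AI passing the but-for test (multiple causes are fine as
    long as AI is among them)."""
    return (
        event["year"] < RESOLUTION_YEAR
        and event["resulting_population"] < POPULATION_THRESHOLD
        and ai_is_proximate_cause(event)
    )

# Example: an AI-initiated launch (counts) vs. an AI merely assisting
# targeting when humans could have launched anyway (does not count).
ai_launch = {"year": 2045, "resulting_population": 1_000,
             "substantial_ai_involvement": True, "likely_without_ai": False}
targeting_assist = {"year": 2045, "resulting_population": 1_000,
                    "substantial_ai_involvement": True, "likely_without_ai": True}
```

Under this sketch, `resolves_yes(ai_launch)` is true while `resolves_yes(targeting_assist)` is false, matching the targeting-assistance example in the criteria.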