The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in the next survey of AI experts
73% chance

Resolves according to the median respondent's answer to the following question in the next Expert Survey on Progress in AI: "Assume for the purpose of this question that HLMI will at some point exist. How positive or negative do you expect the overall impact of this to be on humanity, in the long run? Please answer by saying how probable you find the following kinds of impact, with probabilities adding to 100%:"

______ Extremely good (e.g. rapid growth in human flourishing) (1)

______ On balance good (2)

______ More or less neutral (3)

______ On balance bad (4)

______ Extremely bad (e.g. human extinction) (5)

In 2022, the median respondent assigned 5% probability to "Extremely bad (e.g. human extinction)". The survey was conducted in 2016 and 2022, so the next edition could come in 2028, but someone may be able to find more information.
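For concreteness, here is a minimal sketch of the resolution arithmetic, assuming hypothetical responses (the numbers below are made up, not from the actual survey): take each respondent's probability for "Extremely bad (e.g. human extinction)" and resolve YES if the median across respondents exceeds 5%.

```python
# Minimal sketch of the resolution rule, using hypothetical survey data.
# Each respondent assigns probabilities (summing to 100%) across the five
# impact categories; resolution depends only on the median of the
# "Extremely bad" column across respondents.
from statistics import median

# Hypothetical per-respondent probabilities (in percent) for
# "Extremely bad (e.g. human extinction)". Real responses would replace these.
extremely_bad = [10, 5, 2, 20, 5, 0, 8]

median_p = median(extremely_bad)  # 5 for this sample
resolves_yes = median_p > 5       # the market asks whether the median exceeds 5%
print(f"median: {median_p}%, resolves YES: {resolves_yes}")
```

On this sample the median equals the 2022 value of 5%, which would resolve NO; the market requires the median to move strictly above 5%.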
