What will be the average P(doom) of AI researchers in 2025?

Currently this number is around 5-10% (https://www.lesswrong.com/posts/H6hMugfY3tDQGfqYL/what-do-ml-researchers-think-about-ai-in-2022).

I will take the mean of all surveys conducted during 2025, weighted by sample size, excluding any survey with major methodological problems (as determined by discussion and/or voting in the comments of this market). Resolves to %.
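For concreteness, the resolution formula is a sample-size-weighted mean: sum over surveys of (reported P(doom) × sample size), divided by total sample size. A minimal sketch, using made-up survey numbers purely for illustration:

```python
# Hypothetical illustration of the resolution formula: a mean of survey
# results weighted by sample size. Both surveys below are invented examples.
surveys = [
    {"p_doom": 5.0, "n": 400},   # hypothetical survey A: 5% mean, 400 respondents
    {"p_doom": 10.0, "n": 100},  # hypothetical survey B: 10% mean, 100 respondents
]

weighted_mean = (
    sum(s["p_doom"] * s["n"] for s in surveys)
    / sum(s["n"] for s in surveys)
)
print(weighted_mean)  # 6.0
```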

The survey question has to specifically ask about the probability of AI-caused extinction.

Resolves N/A if no surveys happen during 2025, though if that looks likely I might run my own survey by asking random people at an ML conference (no promises).


Average or median? In the survey you linked, p(extremely bad outcome, e.g. human extinction) had a mean of 14% and a median of 5%, and the 5-10% figures cited were for the median p(human extinction).