
Related questions
At the beginning of 2035, will Eliezer Yudkowsky still believe that AI doom is coming soon with high probability? (56% chance)
The probability of "extremely bad outcomes e.g., human extinction" from AGI will be >5% in next survey of AI experts (76% chance)
What will be the average P(doom) of AI researchers in 2025? (19% chance)
Will Eliezer's AI doom market have a higher P(doom) in the third quarter of 2026 than today's (2023-09-27) 21%? (33% chance)
Does an AI disaster kill at least 10,000 people before 2029? (44% chance)
Will AI cause a global catastrophe killing at least 10% of humans before 2100? (30% chance)
Will a misaligned AI kill 1% of the world population within any 12 month period before 2035? (14% chance)
Does an AI disaster kill at least 10,000,000 people before 2029? (8% chance)
Will AI kill >20% of the human population before 2030? (5% chance)
Will an AI Doomer turn to violence by the end of 2026? (35% chance)