In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
78% chance
@jonsimon Neither matters. What this market cares about is "was the probability they placed on the world being destroyed by AI justified by the evidence they had at the time?"
@IsaacKing Whose probability or concern needs to be justified? Laypeople? Computer scientists? Computer scientists who responded to the AI Impacts survey? Existential safety advocates and the AI existential risk community? Eliezer Yudkowsky?
I mainly ask because I think the probabilities placed on, say, extinction would range from roughly 5% (laypeople and computer scientists) to 50% (the average existential safety advocate) to >99.9% (Yudkowsky).
Related questions
Will AI start a war before 2040?
19% chance
Will AI wipe out humanity before the year 2030?
3% chance
Will a reliable and general household robot be developed before January 1st, 2030?
36% chance
Will AI wipe out humanity before the year 2100?
12% chance