
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
75% chance
This question is managed and resolved by Manifold.
People are also trading
Contingent on AI being perceived as a threat, will humans deliberately cause an AI winter before 2030?
23% chance
Public opinion, late 2025: Out-of-control AI becoming a threat to humanity, a real threat?
If humanity survives to 2100, what will experts believe was the correct level of AI risk for us to assess in 2023?
38% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
30% chance
By end of 2028, will AI be considered a bigger x risk than climate change by the general US population?
65% chance
Will humanity wipe out AI x-risk before 2030?
10% chance
In 2050, what will be the most accurate statement about the control of AI?
Will there be a massive catastrophe caused by AI before 2030?
27% chance
Will AI take over the world by 2100?
49% chance
Will we have a sufficient level of international coordination to ensure that AI is no longer a threat before 2030?
22% chance
Comments
@jonsimon Neither matters. What this market cares about is "was the probability they placed on the world being destroyed by AI justified by the evidence they had at the time?"
@IsaacKing Whose probability/concern needs to be justified? Laypeople? Computer scientists? Computer scientists who responded to the AI Impacts survey? Existential safety advocates / the AI existential risk community? Eliezer Yudkowsky?
I mainly ask because I think the probability of, say, extinction would range from something like 5% (perhaps laypeople and computer scientists) to 50% (the average existential safety advocate) to >99.9% (Yudkowsky).