If humanity survives to 2100, what will experts believe was the correct level of AI risk for us to assess in 2023?
38% chance
In 2100, I will distribute a poll among philosophers, superforecasters, AI professionals, superintelligences, and whatever other entities could reasonably be called "experts" to determine what probability they believe was correct to assign to the following statement in 2023:
Before 2075, humanity will be extinct or have suffered a fate worse than extinction, due to the effects of smarter-than-human artificial intelligence.
("Humanity" includes any non-biological successors that biological humans voluntarily turned into.)
This question is managed and resolved by Manifold.
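The market text says only that a poll will be distributed; it does not specify how the responses would be combined into a single resolution value. A minimal sketch, assuming each polled expert reports a probability in [0, 1] and the resolution takes the median (the function name, the sample responses, and the median rule are all illustrative assumptions, not stated terms of the market):

from statistics import median

def resolve_market(expert_probabilities: list[float]) -> float:
    # Aggregate hypothetical 2100 poll responses (each a probability in
    # [0, 1]) into a single resolution probability. The median is an
    # assumed aggregation rule; the market does not commit to one.
    if not expert_probabilities:
        raise ValueError("no poll responses")
    return median(expert_probabilities)

# Example with five hypothetical poll responses:
print(resolve_market([0.05, 0.10, 0.20, 0.35, 0.60]))  # -> 0.2

A median is one plausible choice because it is robust to a few extreme answers; a mean or a pooled-odds rule would weight outlier respondents differently.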
Related questions
In 2050, will the general consensus among experts be that the concern over AI risk in the 2020s was justified?
78% chance
Will humanity wipe out AI x-risk before 2030?
10% chance
Will humanity wipe out AI before the year 2030?
11% chance
Will humanity wipe out AI before the year 2030?
7% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
30% chance
Will AI cause human extinction before 2100 (and how)?
Will humans wipe out AI by 2030?
6% chance
If humanity goes extinct by 2030, will AI unwipe out humanity by 2040?
34% chance
Will a sentient AI system have existed before 2030? [Resolves to 2100 expert consensus]
43% chance
Will anything other than AI wipe out humanity by 2100?
5% chance