
What do Manifold users consider the single greatest risk associated with development of AI?
αΉ725αΉ1.1kresolved Jan 1
Resolved: Human extinction

| Answer | % |
|---|---|
| Human extinction | 79% |
| Surveillance capability | 1.1% |
| Military applications | 1.9% |
| Creativity reverts to the mean -- innovation in music, storytelling and design becomes impossible | 1.1% |
| AI works everyone out of a job and UBI can't save us since wealth is no longer being generated | 1.1% |
| Falls into hands of bad actors | 1.1% |
| This is a secret too dangerous for me to share | 1.0% |
| AI will take everyone's mana by being too good at predicting stuff | |
| Other | 13% |
The options listed are the ones I came up with off the top of my head. Feel free to add more. This resolves to a single answer. I don't intend to bet in this market.
Updated close time to 12/31/2023. Thank you.
This question is managed and resolved by Manifold.
Top traders
| # | Trader | Total profit |
|---|---|---|
| 1 | | Ṁ88 |
| 2 | | Ṁ56 |
| 3 | | Ṁ51 |
| 4 | | Ṁ27 |
| 5 | | Ṁ22 |
People are also trading
At the beginning of 2027, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
69% chance
Will AI wipe out Manifold by 2030?
7% chance
At the beginning of 2028, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
68% chance
At the beginning of 2029, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
77% chance
At the beginning of 2030, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
73% chance
At the beginning of 2040, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
69% chance
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
30% chance
When will AI be at least as big a political issue as abortion on Manifold?
In 2040, who will Manifold vote as having had the most positive impact on the development of AI in the 21st century?
At the beginning of 2035, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
73% chance
@asmith I agree. I'm not expecting traders to calculate an expected value based on each risk, though; that would be very silly considering the broad ranges involved, and besides, what if someone thinks several of the possibilities are existential threats? I'm inclined to think resolving this ambiguity would be a distraction, but I'm open to persuasion.
