What do Manifold users consider the single greatest risk associated with development of AI?
resolved Jan 1
- Human extinction: 79%
- Surveillance capability: 1.1%
- Military applications: 1.9%
- Creativity reverts to the mean -- innovation in music, storytelling and design becomes impossible: 1.1%
- AI works everyone out of a job and UBI can't save us since wealth is no longer being generated: 1.1%
- Falls into hands of bad actors: 1.1%
- This is a secret too dangerous for me to share: 1.1%
- AI will take everyone's mana by being too good at predicting stuff: 1.0%
- Other: 13%
The options listed are the ones I came up with off the top of my head. Feel free to add more. Resolves to a single answer. I don't intend to bet in this market.
Updated close time to 12/31/2023. Thank you.
Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ88
2 | | Ṁ56
3 | | Ṁ51
4 | | Ṁ27
5 | | Ṁ22
@asmith I agree. I'm not expecting traders to calculate an expected value for each risk, though; that would be very silly given the broad ranges involved, plus what if someone considers several of the possibilities to be existential threats? I'm inclined to think resolving this ambiguity would be a distraction, but I'm open to persuasion.
Related questions
According to Manifold users, which possible cause of human extinction by AI needs the most urgent attention?
Will an AI model outperform 95% of Manifold users on accuracy before 2026?
67% chance
At the beginning of 2026, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
66% chance
Will there be a disaster caused by open source developers doing unsafe things with AI by 2028?
65% chance
At the beginning of 2025, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
66% chance
According to 20 AI safety experts, what is the biggest mistake the AI safety community has made in the past?
At the beginning of 2027, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
66% chance
At the beginning of 2040, what percentage of Manifold users will believe that an AI intelligence explosion is a significant concern before 2075?
67% chance
What will be the main constraint to AI development in 2028?
Is the nature of AI risk completely misunderstood today with respect to the state of the art in 2030?
64% chance