What do Manifold users consider the single greatest risk associated with development of AI?
αΉ€1111
resolved Jan 1
100%79%
Human extinction: 79%
Surveillance capability: 1.1%
Military applications: 1.9%
Creativity reverts to the mean -- innovation in music, storytelling and design becomes impossible: 1.1%
AI works everyone out of a job and UBI can't save us since wealth is no longer being generated: 1.1%
Falls into hands of bad actors: 1.1%
This is a secret too dangerous for me to share: 1.1%
AI will take everyone's mana by being too good at predicting stuff: 1.0%
Other: 13%

The options listed are the ones I came up with off the top of my head. Feel free to add more. Resolves to a single answer. I don't intend to bet in this market.

Updated close time to 12/31/2023. Thank you.


Market is poky, so I pushed out the close time to June. If you do not like this, please invite your friends to get the party started.

Jeez. I thought human extinction was a given. I already saw 2001: A Space Odyssey. Here is an improved version:

Why would human extinction be a greater risk than human existence becoming unbearable with no way to end it? I don't want to live in a world with no music or inside jokes or privacy.

Let the record show that there is an ambiguity in the question. "Greatest" can be interpreted to mean either "most likely" or "most impactful".

@asmith I agree. I'm not expecting traders to calculate an expected value based on each risk, though. That would be very silly considering the broad ranges involved, and besides, someone might consider several of the possibilities existential threats. I'm inclined to think resolving this ambiguity would be a distraction, but I'm open to persuasion.
