What do Manifold users consider the single greatest risk associated with development of AI?
Ṁ725 · resolved Jan 1
- Human extinction: 79% (resolved 100%)
- Surveillance capability: 1.1%
- Military applications: 1.9%
- Creativity reverts to the mean -- innovation in music, storytelling and design becomes impossible: 1.1%
- AI works everyone out of a job and UBI can't save us since wealth is no longer being generated: 1.1%
- Falls into hands of bad actors: 1.1%
- This is a secret too dangerous for me to share: 1.1%
- AI will take everyone's mana by being too good at predicting stuff: 1.0%
- Other: 13%

The options listed are the ones I came up with off the top of my head. Feel free to add more. Resolves to a single answer. I don't intend to bet in this market.

Updated close time to 12/31/2023. Thank you.


πŸ… Top traders

#NameTotal profit
1αΉ€88
2αΉ€56
3αΉ€51
4αΉ€27
5αΉ€22

Market is poky, so I pushed out the close time to June. If you do not like this, please invite your friends to get the party started.

Jeez. I thought human extinction was a given. I already saw 2001: A Space Odyssey. Here is an improved version:

Why would human extinction be a greater risk than human existence becoming unbearable with no way to end it? I don't want to live in a world with no music or inside jokes or privacy.

bought Ṁ8 of Human extinction YES

Let the record show that there is an ambiguity in the question. "Greatest" can be interpreted to mean either "most likely" or "most impactful".

@asmith I agree. I'm not expecting traders to calculate an expected value for each risk, though. That would be very silly considering the broad ranges involved, and what if someone thinks several of the possibilities are existential threats? I'm inclined to think resolving this ambiguity would be a distraction, but I'm open to persuasion.

This is a secret too dangerous for me to share
[GIF: Joker, "Why so serious?"]
