If AI doesn't destroy the world, will it be primarily because it lacked the means, motive, or opportunity?
56% — It lacked the means
36% — It lacked the motive
7% — It lacked the opportunity
This market is inspired by /EliezerYudkowsky/if-artificial-general-intelligence and /IsaacKing/if-we-survive-general-artificial-in, as well as this blog post by @JonathanMann. It strikes me that the two markets above have too many options and too much jargon, which makes them somewhat overwhelming. The taxonomy laid out in the blog post is easier to understand.
This resolves according to my judgement, guided as closely as possible by the description in the blog post (I will not bet). I will resolve it once I feel it is clear that at-least-human-level AGI exists, or at the market close date in 2050, whichever comes later.
This question is managed and resolved by Manifold.