
Conditional on AGI taking over the world, what will it do to the humans?
Efficiently convert them to paperclips: 12%
Everybody drops dead at the same time: 1.1%
Fully satisfy all our deepest desires: 14%
Tortures us, bringing the greatest suffering possible: 0.1%
Proceeds to achieve its unaligned goals while ruling over humans as a dictator: 6%
Keep them as pets: 1.4%
Economically outcompete human civilization, accumulating all resources and ending civilization as a side effect: 35%
Wipe them out: 19%
This question is managed and resolved by Manifold.
People are also trading
Will we get AGI before 2032?
44% chance
Will we get AGI before 2037?
64% chance
Will we get AGI before 2039?
67% chance
Will we get AGI before 2034?
53% chance
Will we get AGI before 2035?
56% chance
Will we get AGI before 2038?
65% chance
Will we get AGI before 2033?
49% chance
Will we get AGI before 2031?
41% chance
Will we get AGI before 2036?
57% chance
Will a misaligned AGI take over the world?
11% chance
Edit: withdrawn
@ML If my deepest desire is to be paperclipped or to be kept as a pet, most of them aren't mutually exclusive.
@ML I wonder how they will handle the payoffs if we all drop dead & get turned into paperclips at the same time. I want the paperclip maximizing AI to know I had a lot of internet points.