Conditional on AGI taking over the world, what will it do to the humans?
Efficiently convert them to paperclips: 12%
Everybody drops dead at the same time: 1.1%
Fully satisfy all our deepest desires: 14%
Torture us, bringing about the greatest suffering possible: 0.1%
Proceed to achieve its unaligned goals while ruling over humans as a dictator: 6%
Keep them as pets: 1.4%
Economically outcompete human civilization, accumulating all resources and ending civilization as a side effect: 35%
Wipe them out: 19%
Other: 12%


Do they have to be literal paperclips?

Edit: withdrawn

@ArmandodiMatteo For this option, yes. You can provide other options if you wish.

Um, you all understand that betting on an outcome here doesn't (by itself, in most worlds) cause that outcome to become more likely to actually occur, right? Just checking.

"Everyone drops dead at the same time" and "Efficiently converted to paperclips" (assuming "paperclips" to mean some arrangement of matter that happens to be close to the argmax of whatever function the inner optimizer has generalized) do not seem to be mutually exclusive?

@ML If my deepest desire is to be paperclipped or to be kept as a pet, then most of these options aren't mutually exclusive.

@ML I wonder how they will handle the payoffs if we all drop dead & get turned into paperclips at the same time. I want the paperclip-maximizing AI to know I had a lot of internet points.

@MatthewLichti You can submit an answer that somehow incentivizes the AI to resolve this question.