
Why will "If Artificial General Intelligence has an okay outcome, what will be the reason?" resolve N/A?
Too many existing humans suffer death: 15%
Too many existing humans suffer other awful fates: 0.6%
80% of currently attainable cosmopolitan value becomes unattainable: 8%
The concept of "maximum attainable cosmopolitan value" is not meaningful: 3%
As a demonstration of treacherous turns, trolling, or lulz: 40%
Some other reason: 3%
No reason given after 30 days: 9%
It will not resolve N/A: 21%
This Yudkowsky market (/EliezerYudkowsky/if-artificial-general-intelligence) will resolve N/A. But can you predict why?
Resolves to the reason given by Yudkowsky.
This question is managed and resolved by Manifold.
Related questions
If Artificial General Intelligence has an okay outcome, what will be the reason?
Will artificial general intelligence be achieved by the end of 2025?
7% chance
Will Eliezer's "If Artificial General Intelligence has an okay outcome, what will be the reason?" market resolve N/A?
29% chance
If Artificial General Intelligence has a poor outcome, what will be the reason?
If Artificial General Intelligence (AGI) has an okay outcome, which of these tags will make up the reason?
If we survive general artificial intelligence, what will be the reason?
If we survive artificial general intelligence, will Isaac King's success market resolve to "none of the above"?
59% chance
Will artificial superintelligence exist by 2030? [resolves N/A in 2027]
48% chance