Why will "If Artificial General Intelligence has an okay outcome, what will be the reason?" resolve N/A?
Too many existing humans suffer death: 1.3%
Too many existing humans suffer other awful fates: 1.3%
80% of currently attainable cosmopolitan value becomes unattainable: 1.3%
The concept of "maximum attainable cosmopolitan value" is not meaningful: 4%
As a demonstration of treacherous turns, trolling, or lulz: 74%
Some other reason: 3%
No reason given after 30 days: 7%
It will not resolve N/A: 7%

This Yudkowsky market will resolve N/A.

/EliezerYudkowsky/if-artificial-general-intelligence

But can you predict why?

Resolves to the reason given by Yudkowsky.


People are uncertain about AI doom but 75% confident that Yudkowsky will do it for the lulz.


ISTM it's likely we all die before EY gets the chance to log on to Manifold and resolve the market.

How does this market resolve if that one doesn't resolve N/A?

@IsaacKing Oh, never mind, I hadn't noticed that one of the responses covers that.

@IsaacKing I will take a brief break from luxuriating in 20% of max attainable value to realize that I'm in an impossible thought experiment set up to test my integrity, put down my ultra-chocolate, and carefully resolve this market to the correct answer to demonstrate my counterfactual integrity to the larger universe that is simulating me, thus slightly increasing my expected returns in the larger universe. And then I'll go back to the ultra-chocolate.
