Will Eliezer Yudkowsky use AGI to figure out why he was wrong?
Ṁ1585 · closes 2040 · 29% chance

Will Eliezer Yudkowsky use AGI/ASI/digital superintelligence to figure out why he was wrong in predicting that AI would destroy humanity?

Resolves YES if this happens by 2040.


Nice meme

In the unlikely event that AI destroys humanity but there's still someone and/or something with the ability and desire to resolve this market, how should it resolve?

@SonataGreen It'll resolve NO if there's no public statement by Eliezer saying he changed his mind.

I'm curious how Eliezer et al. interpret the anthropic issues with predictions like these. It's not as if he'll get the satisfaction of using AGI/ASI to prove why he was right, and there are more than zero branches (according to MWI adherents like himself) where we survive regardless of who had the better argument.
