If AGI has an okay outcome, will there be an AGI singleton?
28% chance · Ṁ638 · 2101

An okay outcome is defined in Eliezer Yudkowsky's market as:

An outcome is "okay" if it gets at least 20% of the maximum attainable cosmopolitan value that could've been attained by a positive Singularity (a la full Coherent Extrapolated Volition done correctly), and existing humans don't suffer death or any other awful fates.

This market resolves YES if I can easily point to the single AGI responsible for the okay outcome, and NO otherwise.


Do you have a neat general definition of "single AI"? (Note: it's fine to say no; I don't have one ready myself and wouldn't want you to waste time on this.)

More specific cases: Were all responses from GPT-4o in the OpenAI API part of the same AI? How about GPT-4 and GPT-4o together?
