If we survive general artificial intelligence, what will be the reason?
4%: There's a fundamental limit to intelligence that isn't much higher than human level.
31%: There was an alignment breakthrough allowing humanity to successfully build an aligned AI.
10%: At a sufficient level of intelligence, goals converge towards not wanting to harm other creatures/intelligences.
1.1%: Building GAI is impossible because human minds are special somehow.
6%: High intelligence isn't enough to take over the world on its own, so the AI needs to work with humanity in order to effectively pursue its own goals.
5%: Multiple competing AIs form a stable equilibrium keeping each other in check.
24%: Humanity coordinates to prevent the creation of potentially-unsafe AIs.
20%: One person (or a small group) takes over the world and acts as a benevolent dictator.
This market resolves once either of the following is true:
- AI seems about as intelligent as it's ever plausibly going to get.
- There appears to be no more significant danger from AI.
It resolves to the option that seems closest to the explanation of why we didn't all die. If multiple reasons seem like they all significantly contributed, I may resolve to a mix among them.
If you want to know what option a specific scenario would fall under, describe it to me and we'll figure out what it seems closest to. If you think this list of reasons isn't exhaustive, or is a bad way to partition the possibility space, feel free to suggest alternatives.
See also Eliezer's more fine-grained version of this question here.
This question is managed and resolved by Manifold.
Related questions
If we survive general artificial intelligence before 2100, what will be the reason?
If Artificial General Intelligence has a poor outcome, what will be the reason?
Why will "If Artificial General Intelligence has an okay outcome, what will be the reason?" resolve N/A?
If we survive artificial general intelligence, will Isaac King's success market resolve to "none of the above"? (59% chance)
If Artificial General Intelligence (AGI) has an okay outcome, which of these tags will make up the reason?
Will humanity wipe out AI? (10% chance)
When artificial general intelligence (AGI) exists, what will be true?
When (if ever) will AI cause human extinction?
Will artificial general intelligence be achieved by the end of 2025? (19% chance)
Which AI future will we get?