
There will be only one superintelligence for a long enough period that it becomes a singleton
64% chance
Full question:
There will be only one superintelligence, rather than multiple, for a long enough period that it becomes more powerful than any single government (i.e., a unipolar AI takeoff).
One of the questions from https://jacyanthis.com/big-questions.
Resolves according to my judgement of whether the criteria have been met, taking into account clarifications from @JacyAnthis, who made those predictions. (The goal is that they would feel comfortable betting at their credence in this market, so I want the resolution criteria to match their intention.)
This question is managed and resolved by Manifold.
Related questions
The superintelligence will brainwash society into declaring alignment researchers (with high p(doom)s) insane.
51% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2100?
90% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2050?
72% chance
Will there be less than a year between the first AGI and the first superintelligence?
38% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2040?
62% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2075?
85% chance
Conditional on no existential catastrophe, will there be a superintelligence by 2030?
40% chance
What organization will first create superintelligence?
Will six months or fewer elapse between when Manifold declares the achievement of AGI and when it declares superintelligence?
37% chance
What will be the name of the singleton AI that uplifts humanity?