
Conditional on no existential catastrophe, will there be a superintelligence by 2075?
84% chance
This question is managed and resolved by Manifold.
People are also trading
Will humanity wipe out superintelligent AI before 2040?
15% chance
IF artificial superintelligence exists by 2030, will AI wipe out humanity by 2030? [resolves N/A in 2027]
33% chance
Will superintelligent AI take over humanity by 2030?
18% chance
Will there be a highly risky or catastrophic AI agent proliferation event before 2035?
50% chance
By 2050, will there be a cure to aging conditional on no AGI by 2050?
20% chance
Will there be geopolitical instability over fears of other countries reaching superintelligence? (ACX, AI 2027 #2)
59% chance
Will there be a massive catastrophe caused by AI before 2030?
31% chance
By 2028, will I believe that contemporary AIs are aligned (posing no existential risk)?
33% chance
Conditional on humanity surviving to 2035, will a global AI pause have been enacted?
13% chance
Will artificial sentience be created by end of 2050?
58% chance
@CodeandSolder An existential catastrophe would likely prevent me from resolving this market anyway, so the conditional makes no functional difference; it just reminds people to take that scenario into account.
@IsaacKing In the real world, obviously yes, but I assumed we were ignoring boring practical aspects like that, much as with markets ending in 2075 vs. 2100.
@IsaacKing The question is whether it's (conditional on no existential catastrophe by 2075: will there be a superintelligence by 2075?) or (conditional on no existential catastrophe before there is a superintelligence: will there be a superintelligence by 2075?)