Conditional on no existential catastrophe, will there be a superintelligence by 2075?
68% chance

It's called Trump. Resolve YES.

predicts NO

Have you guys still not gotten rid of Mark Ingraham‽ I left for 4 months!

How do you define existential catastrophe? Does this market resolve at the first superintelligence even if it causes such an event soon after?

predicts NO

@CodeandSolder An existential catastrophe would likely prevent me from resolving this market anyway, so there's no functional difference from leaving that condition out; it just reminds people to take it into account.

@IsaacKing In the real world, obviously yes, but I assumed we were ignoring boring practical aspects like that, much like with markets ending in 2075 vs. 2100.

predicts NO

@CodeandSolder I don't understand the question then.

@IsaacKing The question is whether it's (conditional on no existential catastrophe by 2075) (will there be a superintelligence by 2075), or (conditional on no existential catastrophe before there is a superintelligence) (will there be a superintelligence by 2075).
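
In probability notation (a sketch of the distinction being drawn here, not wording from the market itself), the two readings are:

P(superintelligence by 2075 | no existential catastrophe by 2075)

versus

P(superintelligence by 2075 | no existential catastrophe before the first superintelligence)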