Conditional on no existential catastrophe, will there be a superintelligence by 2040?
46% chance
bought Ṁ127 of NO

M$5000 limit order up, come at me YESers.

bought Ṁ0 of NO

@Lovre Put up another 10k for you.

predicts YES

@IsaacKing Opened this page expecting it at 32%, not at 40%! :)

predicts NO

@Lovre I already gave you 5k at 32! How about we meet in the middle at 36?

predicts YES

@IsaacKing Make it so.

bought Ṁ100 of NO

Superintelligence in the normal person's meaning, and not in Isaac's meaning, will be achieved with chain of thought, which means there is essentially a 100% chance it will not "design and deploy a complicated website" in under a minute; the inference involved will take a lot longer than a minute.

predicts NO

@DavidBolin the first one might use chain of thought, but there could be additional iterations by 2040 that speed things up.

predicts NO

@ErickBall Not that much though.

"Can design and deploy a complicated website such as a Facebook clone from scratch in under a minute."

This resolves NO, and so does the one about 2050.

How will you know there isn't a superintelligence?

predicts NO

@ScroogeMcDuck I can't, so I'll resolve based on my best guess at the time.

bought Ṁ90 of NO

If this resolves YES then I don't expect my mana to have any value. It will be a record of my forecasting ability in a universe where humans aren't needed for forecasting. Or indeed anything.

So maybe I should "rationally" bet NO?

@MartinRandall You have bragging rights for being correct; that is part of the value.

predicts NO

@ShadowyZephyr Sure, but those bragging rights are worth less. It's like bragging rights for being a great horse trainer after the invention of the car.

If e.g. a nuclear exchange kills 90% of humanity but the remaining 10% still manage to build a superintelligence, does that resolve this market N/A because an existential risk was realized, even if it wasn't quite successful? Or YES?

Wondering if the conditional is a formality or if it would actually affect resolution in some edge case.

predicts NO

@Mira Mostly a formality. If society is still doing well enough after a nuclear exchange that people are still using Manifold, I wouldn't consider that an existential threat coming to pass.

Technically there exists some form of S-risk where our eternity of torture includes continued usage of Manifold, and in that case I would resolve this N/A. Seems unlikely though.

bought Ṁ50 of YES

What's your definition of superintelligence?

predicts NO

@BairAiushin It'll be very obvious if something qualifies.

predicts YES

@IsaacKing what might be obvious to me might not be obvious to you and vice versa. Give at least one criterion please.

predicts NO

- Can get a perfect score on any test designed for humans where such a score is theoretically achievable.
- Can solve any mathematical problem that we know to be in principle solvable with the amount of computing power it has available.
- Can pass as any human online if given that human's history of online communications and a chance to talk to them.
- Can consistently beat humans in all computer games.
- Can design and deploy a complicated website such as a Facebook clone from scratch in under a minute.
- Can answer any scientific question more accurately than any human once given a chance to read all of the internet.

predicts YES

@IsaacKing thanks, that seems about right
