Will Yudkowsky claim that he is more than 90% sure that AI will kill everyone no later than 1 year after the claim?

Criteria for Resolution:

  1. Claim of Probability: Eliezer Yudkowsky must make a public claim that he is more than 90% sure that AI will kill everyone no later than 1 year after the claim is made. The claim must be made in all seriousness, not as a joke or sarcastically.

  2. Types of Qualifying Statements: In addition to explicit quantitative claims, statements such as "certain", "almost sure", "pretty certain", "almost inevitable", or similar expressions indicating a probability higher than 90% will qualify.

  3. If the resolution criteria are not met, the market will resolve NO on January 1, 2036.

Resolution of Ambiguity:

- If a claim is ambiguous, an effort will be made to ask Yudkowsky directly whether his statement means that the probability is higher than 90%.

A related market (the same question but 50% instead of 90%): https://manifold.markets/IhorKendiukhov/will-yudkowsky-claim-that-he-is-mor
