Criteria for Resolution:
Claim of Probability: Eliezer Yudkowsky must publicly claim that he is more than 90% sure that AI will kill everyone within 1 year of the claim being made. The claim must be made in all seriousness, not as a joke or sarcasm.
Types of Qualifying Statements: In addition to explicit quantitative claims, statements such as "certain", "almost sure", "pretty certain", "almost inevitable", or any similar expression indicating a probability higher than 90% will qualify.
If resolution criteria are not met, the market will resolve NO on January 1, 2036.
Verification of Ambiguity:
- In cases where the claim is ambiguous, efforts will be made to ask Yudkowsky explicitly whether his statement means that the probability is higher than 90%.
A related market (the same question but 50% instead of 90%): https://manifold.markets/IhorKendiukhov/will-yudkowsky-claim-that-he-is-mor
For someone to make a statement like this, wouldn't it pretty much have to be true that a misaligned AGI/ASI was already visibly in the process of trying to kill everyone? If you accept the premise that it would be competent at it, that leaves a pretty narrow window in which to issue such a statement, even in worlds where Eliezer is 100% correct.
@nsokolsky I will add the part about the seriousness, but I am not sure about the reversal. Given the nature of this claim, there may be many reasons why it could be retracted, and I don't think that should affect the outcome of this market.
@GG It means "within 1 year after the claim is made". Please see the description of the market; the length of the market title is limited, so I could not fit this phrase in.
@IhorKendiukhov I'm sorry, I was just confused by the grammar. So if I understand you correctly, Eliezer would have to say "I'm more than 90% sure that AI will kill us all within one year of today." Is that right?