Would Eliezer Yudkowsky Press a Button that Ensured a 95% Probability of Doom?
Resolved N/A (Jan 2)

Eliezer Yudkowsky has (I believe) stated that he assigns a >95% likelihood that deploying transformative artificial intelligence results in doom. Would he then press a button that ensured a 95% probability of doom due to the deployment of transformative artificial intelligence?

This market resolves YES if Eliezer Yudkowsky places a YES bet on this market, and NO if he places a NO bet. It will resolve N/A if he has not bet on this market by Jan 01 2024.

I am going to avoid providing a definition of doom in case it creates loopholes for resolution, and just use "whatever Eliezer Yudkowsky thinks doom is" instead.


comment redacted

predicted YES

I weakly predict N/A because I weakly expect that Eliezer thinks it would be unhelpful to the world for him to declare one way or the other.

@EliTyre This is my prediction also.

predicted YES

@EliTyre This is true. However, it is pretty clear that he would in fact push a button like that, if he had it.

I'm not sure the result of pressing the button is clear.

Does it change the total probability of AI doom happening at any point in the future?

Or does it change the probability of AI doom happening, conditional on transformative AI being deployed?

predicted NO

I think the chance of transformative AI ever being deployed is <95%, so I would not press the button in the first case; but if transformative AI is deployed, I think the chance of it causing doom is >95%, so I would press the button in the second case.
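
To spell out the two cases in that comment, here is a minimal sketch using the law of total probability. The symbols D and T are introduced here for illustration, and it assumes that doom in this market's sense can only occur via deployment of transformative AI:

```latex
% D = "doom due to transformative AI", T = "transformative AI is ever deployed".
% Assuming doom requires deployment, the law of total probability gives:
\[
  P(D) \;=\; P(D \mid T)\,P(T) \;\le\; P(T) \;<\; 0.95,
  \qquad\text{while}\qquad
  P(D \mid T) \;>\; 0.95 .
\]
% Case 1: the button fixes P(D) at 0.95, above the current unconditional
%         probability, so pressing raises doom risk -- don't press.
% Case 2: the button fixes P(D | T) at 0.95, below the current conditional
%         probability, so pressing lowers doom risk -- press.
```

The inequality in the first case relies only on P(T) < 0.95; no exact numbers are needed for the commenter's conclusion to follow.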