Will I come to believe that Eliezer Yudkowsky is substantially wrong about AGI being an X-risk?
8% chance (closes Mar 31)
"Substantially wrong" does not include merely being wrong about timing. If I come to believe that, yes, AGI is still an X-risk, but we will have more time before we might all die than Eliezer thinks, this does not count as a substantive disagreement for the purposes of this market.
This is in the "Change My Mind" group - so feel free to debate me in the comments.
@harfe To me, a P(doom) of 10% isn't significantly different from a P(doom) of 90%. Both are unacceptably high.
Related questions
At the beginning of 2035, will Eliezer Yudkowsky still believe that AI doom is coming soon with high probability?
63% chance
Will Eliezer Yudkowsky use AGI to figure out why he was wrong?
30% chance
Will Avraham Eisenberg be freed from incarceration before the development of AGI?
49% chance
If Eliezer Yudkowsky loses his bet about UFOs not having a worldview-shattering origin, what is the reason why?
Will AGI be a problem before non-G AI?
29% chance
Will Eliezer Yudkowsky publicly claim to have a P(doom) of less than 50% at any point before 2040?
32% chance
If I submit this possible grammar error (in description) to Eliezer Yudkowsky, will he say it's not actually an error?
46% chance
Will Eliezer Yudkowsky lose his bet with Unknown?
72% chance