Is your P(Doom) currently between 10% and 90%? (By 2050)
resolved Jun 8
Yes
No, it's lower than 10%
No, it's higher than 90%

For this question, we'll define P(Doom) as the likelihood of artificial intelligence causing human extinction or an outcome you consider similarly undesirable before the year 2050.

This question is inspired by recent discourse around Jan Leike, who resigned from OpenAI over safety concerns and had previously said that his P(Doom) is 10%-90%:

This poll will be used to resolve this market:


So assuming the possibility that AI will doom us all is worth considering, why not talk about it in language a broader public will understand? “Branding,” “storytelling,” “actionable insights” are all overused, sure. But mathematical notation? Come on, man! Lucky me, I happened to take one graduate course in statistics, so I can decode it! It’s almost as if it’s an inside joke on the one hand and an existential threat on the other. I am being kind with the “almost.” If it’s really such a terrible, giant existential threat, then why not put some minimal effort into getting the word out?

Plus it’s competing with a half-dozen other existential threats. You can hardly go to your kitchen in the morning without bumping into one.

lmao apparently i am the biggest doomer on the site

@SaviorofPlant AI or nuclear war? I personally think nuclear war is an order of magnitude more likely than AI doom.

reposted

Sharing!
