For this question, we'll define P(Doom) as the likelihood of artificial intelligence causing human extinction or an outcome you consider similarly undesirable before the year 2050.
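For readers less familiar with the notation (a concern one commenter raises below), P(Doom) is just standard probability notation. A minimal formalization of the definition above, as an illustration rather than official poll wording, might read:

```latex
% P(Doom) is a probability, so it lies between 0 and 1 (i.e., 0% and 100%).
% The event inside Pr(...) paraphrases this poll's definition.
P(\mathrm{Doom}) = \Pr\bigl(\text{AI causes human extinction, or a comparably bad outcome, by 2050}\bigr) \in [0, 1]
```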
This question is inspired by recent discourse around Jan Leike, who resigned from OpenAI over safety concerns and had previously said that his P(Doom) is 10%-90%:
This poll will be used to resolve this market:
So let's assume the possibility that AI will doom us all is worth considering. Why not talk about it in language a broader public will understand? “Branding,” “storytelling,” “actionable insights”: all overused, sure. But mathematical notation? Come on, man! Lucky me, I happened to take one graduate course in statistics, so I can decode it! It's almost like an inside joke on the one hand, an existential threat on the other. I am being kind with the “almost.” If it's really such a terrible, giant existential threat, then why not put some minimal effort into getting the word out?
Plus it’s competing with a half-dozen other existential threats. You can hardly go to your kitchen in the morning without bumping into one.
@SaviorofPlant AI or nuclear war? I personally think nuclear war is an order of magnitude more likely than AI doom.