In a scenario @EliezerYudkowsky uses as an illustrative example of how a superintelligent AI could cause human extinction, everyone on Earth falls over dead in the same second. The scenario goes as follows (slightly edited from this transcript: https://www.lesswrong.com/posts/Aq82XqYhgqdPdPrBA/full-transcript-eliezer-yudkowsky-on-the-bankless-podcast):
"If [the AI] is better than you at everything, it's better than you at building AIs. That snowballs. The AI gets an immense technological advantage. If it's smart, it doesn't announce itself. It doesn't tell you that there's a fight going on. It emails out some instructions to one of those labs that'll synthesize DNA and synthesize proteins from the DNA and get some proteins mailed to a hapless human somewhere who gets paid a bunch of money to mix together some stuff they got in the mail in a vial (smart people will not do this for any sum of money. Many people are not smart). [The AI, through the hapless human] builds the ribosome, but the ribosome that builds things out of covalently bonded diamondoid instead of proteins folding up and held together by Van der Waals forces. It builds tiny diamondoid bacteria. The diamondoid bacteria replicate using atmospheric carbon, hydrogen, oxygen, nitrogen, and sunlight. And a couple of days later, everybody on earth falls over dead in the same second."
This is a terribly sad scenario to think about. We wouldn't even be able to say goodbye to each other. That's why I will use this prediction market to weight my sadness with a number between 0 and 1.