
I'll resolve this question to my P(doom) at the start of 2025.
P(doom) is the probability I assign to humanity going extinct.
EDIT 2024-01-03: I count an outcome as doom if humans don't get to have long, fun lives and our death doesn't lead to other beings getting long, fun lives instead. Scenarios 4 and 5 (listed below) are roughly the lower bound on what Not-Doom looks like.
Example outcomes I consider possible:
1. An engineered pathogen kills everyone. -> Doom
2. An unaligned superintelligence kills everyone, then goes off to do things we don't find valuable even by very cosmopolitan standards. -> Doom
3. We build an aligned superintelligence, which then does lots of nice things for us. -> Not Doom
4. The superintelligence still kills us, but then fills the universe with excitement, wonder, and happiness regardless. -> Not Doom
5. The superintelligence sells us to charitable, benevolent aliens. -> Not Doom
I currently expect P(S1)=2%, P(S2)=87%, P(S3)=1%, P(S4)=0.1%, P(S5)=0.1%, and P(other)=9.8%.
A priori, I don't believe I'm calibrated for tiny probabilities.
At the start of 2024 my P(doom) was about 98%.
Update 2025-01-01 (PST): Resolution updated to ~95% all-things-considered probability of a "doom" outcome. (AI summary of creator comment)
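For illustration, here's a minimal sketch (not from the market creator) of how the scenario probabilities above could combine into an all-things-considered P(doom). The split of P(other) into doom vs. not-doom is an assumed free parameter, chosen here so the result matches the updated ~95% figure.

```python
# Scenario probabilities as stated in the description.
doom = {"S1": 0.02, "S2": 0.87}                    # pathogen; unaligned ASI
not_doom = {"S3": 0.01, "S4": 0.001, "S5": 0.001}  # aligned ASI; value anyway; aliens
p_other = 0.098

def p_doom(doom_share_of_other: float) -> float:
    """All-things-considered P(doom), given what fraction of 'other' is doom.

    The fraction is an assumption, not something the creator specified.
    """
    return sum(doom.values()) + doom_share_of_other * p_other

# Roughly 61% of 'other' being doom reproduces the updated ~95% figure:
# 0.89 + 0.61 * 0.098 ≈ 0.95.
print(round(p_doom(0.61), 3))  # -> 0.95
```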