This includes accidental or engineered pandemics, out-of-control gene drives, etc.
@L if you can give an actual mechanical reason, I'd sell. but if nobody can find a reason why, I'm gonna start buying up to 80% and just adding more until it stays there. I'm pretty fuckin sure this is a fairly straightforward type of threat to the world; most of AI risk is from the anticipated threat of bioweapons. why would we be safe?
@L tl;dr nothing smart
This looks like a “permanent stock” beauty contest again.
Some people may default to a pure EV perspective: I'm unlikely to see any payout at all, and especially unlikely to see a "NO" payout.
Others, however, will want to turn a profit from volatility and front-run the crowd. Humans seem to be optimists, and Manifold users are too: there's an "AI wipes out humanity before 2100" market here at 23%, so this one will likely still move toward it.
Speaking of optimism, and more to your actual point, still others may (more or less vaguely) think that humanity has so far handled M.A.D.-type risks not horribly, and in general tends to do not horribly in repeated coordination and anti-coordination games. The risk of death to you and yours is a strong incentive.