Is risk of extinction from AI >1% in the next century / Should we be spending >1% of our resources to prevent this?
resolved Jan 1
P(Extinction) < 1%, Spending should be < 1%
P(Extinction) < 1%, Spending should be > 1%
P(Extinction) > 1%, Spending should be < 1%
P(Extinction) > 1%, Spending should be > 1%

Many forecasts on this site estimate a >1% risk of human extinction in the next century from artificial superintelligence.

If that's true, it seems to me we should be dedicating far more resources to preventing this! At least 1%, if the risk of all of us dying is greater than 1%.

But what does Manifold think?

This isn't about how money should be spent on preventing AI Doom; it's about how much should be spent, in your best-case scenario.

Just imagine we're doing all the things that you think should be done, every nation is working together, and the funding is coming from wherever you think is best. The spending can be any mix of public and private you think is best.

To give a simple sense of scope: 1% of the US government's budget would be about 60 billion dollars a year spent on the problem.
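The arithmetic behind that figure can be sketched in a couple of lines, assuming a US federal budget of roughly $6 trillion per year (an approximate recent level, not stated in the question):

```python
# Assumption: US federal budget of roughly $6 trillion per year.
us_federal_budget = 6_000_000_000_000  # dollars per year

share = 0.01  # the 1% threshold the question asks about
spending = us_federal_budget * share

print(f"${spending / 1e9:.0f} billion per year")  # -> $60 billion per year
```

The same proportion applied to a larger base, like gross world product (roughly $100 trillion), would put the 1% figure around $1 trillion a year.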
