Do you believe that a rapid AI intelligence explosion poses a significant existential risk to humanity before 2075?
Resolved Jan 12 · Yes / No · 138 / 349 votes


No, because of all the AI safety investment/concern!

I'm not particularly convinced, but I can't rule it out either, and this non-negligible possibility is enough to make me vote "yes".

Any probability threshold for "significant"?

@GustavoMafra Nope, just whatever threshold you'd use for other things you do or don't consider significant risks. (Note that for most people that involves not just the probability of occurrence but also the severity of the outcome.)

@GustavoMafra I usually would ignore probabilities under 1 in ten thousand since that's about the probability I will not make it to the next year based on actuarial tables. But in this case even 1 in 10,000 could be significant, given that there are about 10^10 people around: that's a million deaths in expectation. The reasonable thing is probably to compare p(doom) from AI with other x-risk probabilities, and I think those are still higher even on this timescale, so I voted no. For instance, the base rate for nuclear war is about one per century (an order-of-magnitude estimate), and an all-out nuclear war could easily kill 1/100 of the population, which works out to an expected toll of 1 in ten thousand of the population every year.

@mariopasquato "I usually would ignore probabilities under 1 in ten thousand since that’s about the probability I will not make it to the next year based on actuarial tables."

I answered Yes, but that's because the question says "significant existential risk to humanity", not "significant existential risk to me". In the "actuarial tables of humanity", an AI explosion probably accounts for >10% of each year's total probability of death, but it doesn't in mine.

btw are your tables segmented by health state and habits or something? The one I'm looking at shows males in their 20s-30s with a 1/500 yearly death chance.

@GustavoMafra Nah I was feeling young with my oom estimate I guess :-D Prepubescent even
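A quick sanity check on the expected-value arithmetic in the thread above; a minimal sketch, where the population figure, the 1-in-10,000 threshold, and the nuclear-war base rate are all order-of-magnitude assumptions taken from the comments, not established figures:

```python
# Back-of-the-envelope arithmetic from the thread above. All inputs are
# order-of-magnitude assumptions quoted from the comments.

population = 1e10  # "about 10^10 people around" (actual is closer to 8e9)

# "Even 1 in 10,000 could be significant": expected deaths from an
# extinction-level event with a 1-in-10,000 chance of occurring.
p_ai_doom = 1e-4
print(f"AI x-risk: {p_ai_doom * population:.0e} expected deaths")  # 1e+06

# Nuclear-war comparison: ~1 all-out war per century, killing ~1/100
# of the population, gives the expected yearly toll as a fraction.
p_war_per_year = 1e-2
fraction_killed = 1e-2
yearly_fraction = p_war_per_year * fraction_killed
print(f"Nuclear war: {yearly_fraction:.0e} of the population per year")  # 1e-04
```

Under these assumptions both risks land on the same expected toll (a million deaths per year-equivalent for 10^10 people), so the comparison comes down to the assumed base rates rather than the arithmetic.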

Are you asking whether, if it happens, it would be dangerous? Or whether the whole scenario is a threat?

@ItsMe Think of things that you'd consider a significant risk. Then think about whether a rapid intelligence explosion is included in that category. :)

(I'm leaving the exact interpretation up to people's individual judgement.)

I voted as if the question were asking whether P(doom) = 1 by 2075.

Well that's obviously not what the poll is about, but thanks for letting us know you voted dishonestly I guess.
