Will humanity go extinct before 2030?

If at least one human is alive on Jan 1, 2030, then this question resolves to NO.

For this question, we'll define humans the same as this Metaculus question:

as biological creatures who have as their ancestors – via a chain of live births from mothers – circa 2000 humans OR who could mate with circa 2000 humans to produce viable offspring. (So AIs, ems, genetically engineered beings of a different species brought up in artificial wombs, etc. would not count). Any living humans living anywhere in the observable universe (or multiverse) (who are known to the entities operating Metaculus) on that date will be sufficient to resolve the question negatively.

bought Ṁ1 YES

ha! can't wait to get lots of mux on this one

Hi! The following market might be of interest to you: https://manifold.markets/jgould1090Gould/how-many-times-before-january-1st-w

14% seems absurdly high.

predicts YES

@RossTaylor I think there's wide disagreement between forecasts on AI risk, and I actually don't think 14% is absurd, although it's certainly higher than most forecasts.

I'm going to cite a chain of reasoning with Metaculus forecasts here to paint the picture for why it's not absurd:

Multiplying all that out and assuming everything is uncorrelated, we get 3.6%. In actuality, I think the correlations between these questions make the risk much higher than that. Also to do the math properly you'd want to combine the distributions, not just the point estimates we used above, which I think also would end up giving a higher number.
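As a sketch of that arithmetic: under the independence assumption, the probabilities in the chain simply multiply. The individual values below are hypothetical placeholders (the actual Metaculus point estimates aren't reproduced here), chosen only so that the product matches the 3.6% figure:

```python
import math

# Hypothetical stand-ins for the chain of Metaculus point estimates;
# chosen so the product matches the 3.6% quoted above.
chain = [0.5, 0.3, 0.4, 0.6]

# Assuming the events are uncorrelated, the joint probability
# is just the product of the point estimates.
p_joint = math.prod(chain)
print(f"{p_joint:.1%}")  # 3.6%
```

(The caveats in the comment still apply: correlations between the steps would push the true number higher, and combining full distributions rather than point estimates would change it further.)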

Anyway, don't take the specific number too seriously - we can also compare to https://www.metaculus.com/questions/578/human-extinction-by-2100/ which says there's a 1.4% chance of extinction by 2100, which obviously disagrees with the above chain of reasoning. Or you can look at the Existential Risk Persuasion Tournament where we get a range of 1% - 6% probability of extinction by 2100. I just wanted to show that there's a plausible picture for assigning a probability of more than a few percentage points to AI extinction in this decade alone.

predicts NO

@jack I look at it from the perspective of plausible mechanisms that could make humanity extinct.

  • Is it a nuclear event? Even in a nuclear winter, total extinction seems unlikely? I believe most simulations show this?

  • Is it AI eliminating humanity? What mechanism is this? Bioweapons? Nuclear? Some unknown consequence of superintelligence?

I would weigh Knightian factors arising from superintelligence higher in a few decades, but not in the next seven years.

TLDR: Catastrophic events are in the low single-digit % range in my mind, but total extinction seems an order of magnitude lower?

FYI I am not at all surprised that this is lower than https://manifold.markets/MartinRandall/will-ai-wipe-out-humanity-before-th-d8733b2114a8. The conjunction fallacy is especially strong in x-risk forecasting, and the superpower of prediction markets - the incentives for making accurate predictions - turns into a negative for x-risk predictions, where those incentives are misaligned.

I actually think AI x-risk in this decade is not unlikely, and I think it's a much bigger concern than any other x-risk.

@jack Inclusion of the multiverse could theoretically send this to zero, but for the "observable" qualifier.

predicts YES

@ShitakiIntaki Yeah, I'm kind of confused by the multiverse part of that - isn't the observable part of the multiverse just the observable universe?

predicts NO

@jack I suppose it is a placeholder for future developments in what the agent resolving this market might consider observable. This distinction does not matter if humans are still around in this universe.

In the here and now, I would agree that the observable multiverse is just the observable universe.

@jack you forgot to specify what happens if no humans are alive.

predicts NO

@MartinRandall That was intentional :)

predicts YES

@jack Well that seems misleading, given the title one might expect it to resolve yes in that case.

@MartinRandall Yes, that is certainly the intent of the question, I just can't promise that we'll be able to go about implementing that resolution.

In all seriousness though, the question certainly should be read as resolving YES if no humans are alive. And there are possible scenarios in which humanity goes extinct but we still manage to collect the mana payout - e.g. if all humans were uploaded, it would resolve YES (the resolution criteria specify biological humans) and the YES bettors would collect their payouts (assuming Manifold still exists, of course).

predicts YES

@jack Sure, but you also can't guarantee a NO resolution, many things could happen between now and 2030.

@MartinRandall Yo if someone tells me I have 72 hours to live, I'm maxing out my credit cards.

@MartinRandall Yeah, the omission was just for irony. And yes, like most questions there's an implicit conditional on Manifold existing, etc etc.

How is Metaculus at 3% by 2100?

predicts NO

@MartinRandall I also think that's too low. Metaculus's extinction forecasts vary widely and are mutually inconsistent (just like Manifold's). For example, https://www.metaculus.com/questions/17735/conditional-human-extinction-by-2100/ has literally the same resolution criteria but is higher after they added a conditional. And https://www.metaculus.com/questions/4118/will-there-be-a-positive-transition-to-a-world-with-radically-smarter-than-human-artificial-intelligence/ assigns a very high chance to a negative outcome from AGI (in theory you might allow for negative non-extinction outcomes, but I think there are other Metaculus forecasts that put AI x-risk higher).