Will humans go extinct before 2100? [Resolves to Metaculus in 2027]
4% chance

This resolves to the community prediction (median) on https://www.metaculus.com/questions/578/human-extinction-by-2100/ as of January 1, 2027.
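
For reference, the resolution value can be read programmatically from Metaculus's public API. Here is a minimal Python sketch; the api2 endpoint is real, but the exact JSON field path for the community median is an assumption and may differ between API versions.

```python
# Minimal sketch: read the community prediction this market resolves to.
# The /api2/questions/<id>/ endpoint is Metaculus's public API; the field
# path community_prediction.full.q2 (the median, i.e. 50th percentile) is
# an assumption and may differ across API versions.
import requests

QUESTION_URL = "https://www.metaculus.com/api2/questions/578/"

def community_median() -> float:
    data = requests.get(QUESTION_URL, timeout=10).json()
    return data["community_prediction"]["full"]["q2"]  # assumed field path

if __name__ == "__main__":
    print(f"Community median: {community_median():.1%}")
```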

Note: predictions on extinction aren't incentive-compatible under any scoring system (a correct YES prediction can never be rewarded, since no one is around to collect), and the same is true on Metaculus, but at least the incentive misalignment is not as bad there as on a prediction market.
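
To spell out the incentive problem with a toy calculation (all numbers made up): a YES share on extinction can only pay out in worlds where no one is left to collect, so its expected value to the buyer is negative at any positive price.

```python
# Toy expected-value calculation (made-up numbers) for a YES bet on extinction.
p_extinction = 0.04      # bettor's honest credence
price_yes = 0.04         # cost of a share paying 1 if extinction occurs

# Naive EV, pretending the payout could actually be collected:
ev_naive = p_extinction * 1.0 - price_yes    # 0.0 at a "fair" price

# Actual EV: the win branch pays nothing, because collection requires survival.
ev_actual = p_extinction * 0.0 - price_yes   # always -price_yes

print(ev_naive, ev_actual)  # 0.0 vs -0.04: YES is a guaranteed loss
```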

@jack Do you think this is a bad idea in terms of increasing incentive misalignment? https://manifold.markets/kenakofer/will-metaculus-predict-human-extinc

predicts NO

@kenakofer I think Metaculus's prediction on this specific question is already bogus for a variety of reasons (low-information predictions and bad incentives), and Manifold derivatives have minimal impact on that.

"at least the incentive misalignment is not as bad there as on a prediction market."

What's the argument for this?

predicts NO

@MartinRandall I think the main reason is that on Metaculus each question is worth the same amount, so the impact of any individual question is small. But on Manifold I can bet a ton on a market if someone is willing to bet against me, and therefore make a lot of profit from betting against extinction. So it's an argument about the size of the incentives.
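
To put rough numbers on the size difference (a sketch with a made-up stake, ignoring price impact and Metaculus's actual scoring formula): Manifold profit scales linearly with how much you wager, while a Metaculus question contributes a bounded number of points however confident you are.

```python
# Sketch of incentive size on Manifold (made-up stake, ignoring price impact).
# Buying NO at YES-probability p costs (1 - p) per share; each share pays 1
# mana if the market resolves NO.
stake = 10_000                  # mana wagered on NO
p = 0.04                        # current YES probability
shares = stake / (1 - p)        # ~10,417 NO shares
profit_if_no = shares - stake   # ~417 mana, scaling linearly with stake

print(f"Profit if NO: {profit_if_no:.0f} mana")
# On Metaculus, by contrast, points per question are bounded regardless of
# conviction, so there is no analogous "bet a ton" lever.
```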

@jack Huh. So, toy model: suppose the true risk is 50%, half of predictors honestly predict 50%, and half predict 0.1% due to anthropic bias.

On both Manifold and Metaculus, the anthropic predictors pick up mana/points from the honest predictors in the case of NO. In the case of YES, there's a vacuum collapse.

I think your argument is that on Metaculus the honest and anthropic predictors will give an average of ??% risk (I think this ends up being a "community prediction" of about 1-5%?), whereas on Manifold anthropic predictors will out-spend honest predictors and drive the market probability to 0.1%.

That doesn't match what we observe, so I think I have misunderstood you somewhere and I don't know where.
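
One way to make the disagreement concrete is a minimal simulation of this toy model (all numbers made up; the stake-weighted average below is only a crude stand-in for how a real market clears):

```python
# Minimal simulation of the toy model: half the predictors honestly report
# 50%, half report 0.1% due to anthropic bias (all numbers made up).
import statistics

predictions = [0.50] * 50 + [0.001] * 50

# Metaculus-style aggregation: one vote per predictor, so the community
# median lands between the camps regardless of conviction.
metaculus_estimate = statistics.median(predictions)  # ~25%

# Manifold-style aggregation (crude stand-in): stake-weighted average,
# assuming anthropic bettors size up 10x because NO looks like free money.
stakes = [10] * 50 + [100] * 50
manifold_estimate = (
    sum(p * s for p, s in zip(predictions, stakes)) / sum(stakes)
)  # ~4.6%

print(f"Metaculus-style median: {metaculus_estimate:.1%}")
print(f"Manifold-style price:   {manifold_estimate:.1%}")
```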

predicts NO

@MartinRandall My argument is more that (using completely made-up numbers) perhaps on Metaculus it's 50/50 while on Manifold it's 60/40 anthropic-based vs honest predictions, because incentives are stronger on Manifold, pushing more people to follow them.

However, as we have observed, there's more going on than just (a) honest beliefs and (b) anthropic bias. There's also betting to signal (both to raise awareness of risk, and to signal your beliefs for any number of other reasons). Empirically, the incentives to bet for signalling purposes also seem stronger on Manifold than on Metaculus, probably because on Manifold you can make large bets to signal more strongly.

predicts NO

Looking back at this thread, I want to add that I think the Metaculus prediction here is very poor, but I also think incentive misalignment isn't the main reason for it - low-information predictions are the more likely explanation.

@jack Did you have a theory on why these Metaculus questions have low-information predictions when Metaculus overall is competitive with Manifold?
