What's the p(doom) of monkeys?
Ṁ476 · 2041
13% chance

Whenever I hear apocalyptic talk about the future of AI, people usually mention p(doom), the probability that AI will exterminate humanity.

I think it's an anthropocentric view that AI would want to exterminate us and only us, even if it had the capability.

Let's make it less biased and assign a p(doom) to other animals too. Let's start with monkeys.

This market resolves True if AI exterminates monkeys by 2040.


I have to read this as satire, because nobody can possibly believe that anyone thinks AI will exterminate humans and leave monkeys alone... Right? Right???

Humans are a bigger threat to AI, so they are more likely to go extinct by 2040.

@SimoneRomeo if AIs exist, but monkeys don't, is that sufficient to say that AI has exterminated monkeys, or do you have additional criteria, such as AIs being in control of the world?

There should be direct causation

“Whenever I hear apocalyptic talks about the future of AI, people usually talk about p(doom), the probability that AI will exterminate humanity.

I think it's an anthropocentric view that AI should want to exterminate us and only us, even if it had the capabilities.“

This is a misunderstanding of the AI extinction risk arguments (at least most of them, or the ones people find most convincing). Everybody I know who expects "apocalyptic things" regarding future AI does not think AI would "want to exterminate us and only us".

The phrase "AI might exterminate humanity" gets used for rhetorical reasons. It's a good summary for a general audience of what might matter most to them. (Like how my claiming "climate change will greatly impact farmers in sub-Saharan Africa" doesn't imply that it won't impact people in Bangladesh. Or fir trees in Oregon.)

“AI might exterminate X“ is simply not an exclusive statement. It doesn't imply anything in terms of “AI might exterminate X and only X, and not Y“. That would be a separate statement.

TLDR, expressed in logic: A ≠ A ∧ ¬B
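That TL;DR can be checked with a quick truth table (a throwaway illustration, not part of the original comment), reading A as "AI exterminates humans" and B as "AI exterminates monkeys":

```python
from itertools import product

# Truth table comparing the proposition A ("AI exterminates humans")
# with A ∧ ¬B ("AI exterminates humans and not monkeys").
for A, B in product([False, True], repeat=2):
    print(f"A={A!s:5} B={B!s:5}  A∧¬B={A and not B}")

# The columns for A and A∧¬B disagree exactly when A and B are both
# true: a scenario that exterminates both satisfies A but not A∧¬B.
```

So asserting A says nothing either way about B, which is the whole point of the comment.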

@JonathanMannhart do share the market with the people you know who expect apocalyptic things, so they may vote Yes in this market too

@JonathanMannhart it would indeed be interesting to compare the probability of this market to other p(doom) markets once enough users bet

I think most doomers expect the AI to kill humans as a side effect, not as a deliberate action (and at least that's what I expect). Similar to how humans have caused mass extinctions in all sorts of species due to our energy needs, but more optimized and therefore more extreme (e.g. using the oceans as a heat sink, causing them to boil off). Presumably, anything like this that kills humans would kill monkeys too.

@adele this market resolves True if AI kills monkeys either on purpose or by mistake

to be clear, if non-AI circumstances cause all monkeys to go extinct (climate, anthropogenic, etc.), how does this resolve?

@Stralor if it's not caused by AI, it resolves N/A

This should be submitted to the monkey market bounty :)
(needs to be in the monke tag group)

Unfortunately this will likely be so strongly correlated with human extinction that the same user death biases will probably distort the odds :(

@TheAllMemeingEye pardon? What do you mean by "the same user death biases will probably distort the odds"?

@SimoneRomeo I mean that, with the human p(doom) markets, even if the actual p(doom) is very high, there is no self-interested market incentive to bet it up, because a YES resolution means the user dies, so the price gets distorted to very low values
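The incentive argument amounts to a small expected-value calculation, sketched here with a hypothetical belief (the 0.5 is made up for illustration):

```python
# Toy sketch of the "user death bias": a trader only gets to enjoy a
# YES payout in worlds where they are still alive to spend it.

p_doom = 0.5  # trader's private belief that the market resolves YES (hypothetical)

# Naive expected value of one YES share that pays 1 on a YES resolution:
ev_naive = p_doom * 1.0 + (1 - p_doom) * 0.0

# Survival-weighted value: the payout arrives only in doom worlds,
# which are exactly the worlds the trader isn't around for, so the
# enjoyable payoff in surviving worlds is zero.
ev_survival_weighted = (1 - p_doom) * 0.0

print(ev_naive, ev_survival_weighted)
```

Under that accounting, even a trader who believes p(doom) is high has no selfish reason to buy YES, so the market price drifts below the traders' actual beliefs.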

@TheAllMemeingEye yeah, true. Still, there are also markets that speculate on human extinction, and those have even less chance of resolving True
