Whenever I hear apocalyptic talk about the future of AI, people usually bring up p(doom), the probability that AI will exterminate humanity.
I think it's an anthropocentric view that AI would want to exterminate us and only us, even if it had the capability.
Let's make it less biased and assign other animals their own p(doom). Let's start with monkeys.
This market resolves True if AI exterminates monkeys by 2040.
@SimoneRomeo if AIs exist, but monkeys don't, is that sufficient to say that AI has exterminated monkeys, or do you have additional criteria, such as AIs being in control of the world?
“Whenever I hear apocalyptic talk about the future of AI, people usually bring up p(doom), the probability that AI will exterminate humanity.
I think it's an anthropocentric view that AI would want to exterminate us and only us, even if it had the capability.”
This is a misunderstanding of the AI extinction risk arguments (at least most of them, or the ones people find most convincing). Nobody I know who expects “apocalyptic things” from future AI thinks AI would “want to exterminate us and only us”.
The phrase “AI might exterminate humanity” gets used for rhetorical reasons: it's a good summary for a general audience of what might matter most to them. (Just as my claiming “climate change will greatly impact farmers in sub-Saharan Africa” doesn't imply that I'm saying it won't impact people in Bangladesh. Or fir trees in Oregon.)
“AI might exterminate X” is simply not an exclusive statement. It doesn't imply anything like “AI might exterminate X and only X, and not Y”. That would be a separate statement.
TL;DR, expressed in logic: asserting A is not the same as asserting A ∧ ¬B, i.e. A ≢ (A ∧ ¬B).
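To spell the non-equivalence out with a concrete counter-model (my own illustration; A and B are just placeholders):
A = true, B = true satisfies A but falsifies A ∧ ¬B, hence A ⊭ (A ∧ ¬B).
So with A = “AI exterminates humans” and B = “AI exterminates monkeys”, predicting A says nothing either way about B.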
@JonathanMannhart do share the market with the people you know who expect apocalyptic things, so they can bet Yes on this market too
@JonathanMannhart it would indeed be interesting to compare this market's probability with that of other p(doom) markets once enough users have bet
I think most doomers expect AI to kill humans as a side effect, not as a deliberate action (that's what I expect, at least). It would be similar to how humans have caused mass extinctions across all sorts of species through our energy needs, but more optimized and therefore more extreme (e.g. using the oceans as a heat sink and boiling them off). Presumably, anything like this that kills humans would kill monkeys too.
@TheAllMemeingEye pardon? What do you mean by "the same user death biases will probably distort the odds"?
@SimoneRomeo I mean that, with the human P(doom) markets, even if the actual P(doom) is very high, there is no self-interested incentive to bet it up, because a Yes resolution means the bettor dies, so the price gets distorted to very low values
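A toy illustration of that incentive, with made-up numbers: suppose the true P(doom) is 0.6 and the market sits at 0.10. On paper a Yes share bought at 0.10 is worth 0.60 in expectation, but every world in which Yes resolves is a world where the bettor can't collect, so the personally realizable value of Yes is roughly 0, while No still pays in the 0.4 of worlds where they survive. The price therefore drifts well below the true probability.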
@TheAllMemeingEye yeah, true. Still, there are also markets that speculate on human extinction, and those have even less chance of resolving True