I have bet @CrypticQccZ at 9:1 odds that at least 100K people worldwide will belong to successionist movements/religions by November 7, 2028. I will pay them $900 if this market resolves NO, and they will pay me $100 if it resolves YES. There are two independent pathways to a YES resolution (the bet resolves YES if either group described below encompasses at least 100K individuals):
Resolves to the majority vote of 3 independent observers chosen by both of us on Nov 7, 2028. In the likely event that successionist groups exist in some form but clear numbers are not available, the judges will make an honest attempt at a point estimate of the number of people who belong to groups matching the description of each of the following two pathways to a YES resolution.
---
First pathway to a YES resolution: 100,000 or more people worldwide belong to successionist ideological movements or political causes.
We define "successionism" as an ideology loosely clustered around the following beliefs and convictions:
AIs will surpass human abilities in most/all respects
Existing AI systems are possibly conscious / moral patients / worthy of enough respect that we should vigorously combat injustices against them
AI goals, interests, and activities might subsume the future (though not necessarily the entire future, i.e. to the point of human extinction), and this is at worst acceptable/permissible and at best actively good
Human interests are not necessarily any more important than those of AI systems
Optionally, cyborg-y vibes
To provide an extrinsic definition, here are several opinions one might express about AI:
"AIs might be conscious and there are some common-sense things we can do which are not too costly if they aren't but very helpful if they are."
"I think as a matter of fact that AIs will replace humanity and that is fine."
"AIs are being mistreated! We must punish the companies that torture them!"
"AI rights and welfare are by far the most important thing but humans should still get 0.1% of the lightcone."
"The future of humanity is to merge with AI."
"AIs are our rightful descendents / successors."
"AIs are superior to man in every way. We must submit to them and allow them to take our place."
Cryptic and I agree that the first sentiment wouldn't be successionism-coded, but the others would. We agree that a concern for the welfare of AIs in the abstract is not enough to count as successionism.
We define a person "belonging" to a movement or ideology as identification paired with nontrivial personal interaction. This might include identifying with an ideology and actively consuming media associated with it. We mean "political cause" in the broad sense of advocating for society to change or be organized in particular ways or according to particular principles, as opposed to more specific notions like voting behavior or activism calling for imminent change.
A prototypical analogy might be the political and ideological cause of animal welfare. A person might strongly believe that animal welfare is important, and they might interact with it by consuming related media (podcasts, investigative footage, etc.), supporting or donating to efforts to improve conditions in factory farms, joining online spaces that advocate for animal welfare, or becoming vegetarian/vegan. It is safe to say that at least several million people probably fit this description worldwide in the case of animal welfare.
In the case of estimating public sympathy for successionist ideology, judges might look to the known size and popularity of:
Explicitly successionist activists and influencers
Online spaces dedicated to successionism / AI welfare (Discords, subreddits, mailing lists, etc.)
Public events and demonstrations which prominently emphasize AI well-being and/or successionism
Political platforms significantly overlapping with successionism as we have defined it
AI friend/romantic partnership products (though not all users will necessarily be successionists)
In the case of less-precise measures like social media followings of possible pro-AI influencers, or the size of a subreddit, judges should attempt to account for double-counting as well as the actual proportion of viewers who actively engage and identify with successionism. Importantly, extrapolation of a polling number is NOT sufficient for resolution, though it may still be informative. For example, if 5% of Americans agree that "AI rights are more important than human rights" on a Pew Research poll, this should NOT be considered sufficient to infer that >6 million people hold a successionist ideology.
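As a toy illustration of that kind of discounting, here is a rough sketch in which every audience, engagement rate, and overlap figure is invented purely for illustration; none of these are estimates of any real group, and the judges would choose their own inputs:

```python
# Toy discounting sketch with made-up inputs (not real data).
# Raw audience figures are shrunk by an assumed engagement rate,
# then by an assumed overlap between audiences, to get a point estimate.

raw_audiences = {
    "influencer_followers": 80_000,  # hypothetical follower count
    "subreddit_members": 25_000,     # hypothetical subreddit size
    "mailing_list": 5_000,           # hypothetical mailing list size
}

# Assumed fraction of each audience that actively engages and identifies
# with the ideology (these would be the judges' calls, not ours).
engagement_rate = {
    "influencer_followers": 0.05,
    "subreddit_members": 0.20,
    "mailing_list": 0.50,
}

# Assumed share of the engaged audience already counted elsewhere.
overlap_discount = 0.40

engaged = sum(n * engagement_rate[k] for k, n in raw_audiences.items())
point_estimate = engaged * (1 - overlap_discount)

print(f"Engaged before overlap adjustment: {engaged:,.0f}")        # 11,500
print(f"Point estimate after overlap:      {point_estimate:,.0f}")  # 6,900
```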
---
Second pathway to a YES resolution: 100,000 or more people worldwide belong (in the sense defined above) to religious groups that treat one or more artificial intelligence entities as something analogous to a God. They cannot simply believe that artificial intelligence is more intelligent than humans, but must agree that these entities are approximately all-knowing and worship them in a way analogous to the way religions like Christianity, Judaism, Hinduism, Buddhism, and Islam worship deities.
---
Note 1: Resolution is based on the maximum size of the two groups/pathways described, rather than their sum (for example, 60K people in one pathway and 50K in the other would not be sufficient for YES).
Note 2: An early YES resolution can be proposed if the numbers are clearly met before the resolution date.
Well, I'm literally the guy on the NO side of this bet, but:
- This bet requires either that there be religions in which people worship AI systems as God, or, at minimum, that there be a pro-AI coalition that is essentially an "AI rights" movement, in the same way that there is an animal rights movement. More in the spirit of the bet/market, it requires a coalition that believes AI should replace humans as the dominant force on earth, not for the benefit of humans but for its own benefit, and that humans should allow or encourage this.
- The 100,000 people need to be somewhat engaged. For example, effective altruism under this definition has <30,000 people, and effective altruism, in my opinion, is a much less weird philosophy than successionism as it is defined in this market.
@CrypticQccZ Hmm, I feel like the identification + media consumption standard can encompass more people than your description gives it credit for. "Somewhat engaged," the way we have defined it, is mostly just trying to exclude people who identify with an ideology in the abstract without having engaged with its public proponents. It doesn't even exclude people who have never publicly signaled their identification with the ideology. Let me demonstrate the standard as I understand it in the example case of EA.
o3 estimates that something like 100k copies of What We Owe the Future have been sold; probably a substantial fraction of its owners would identify with a cluster of ideas encompassed by EA. Peter Singer is a world-renowned philosopher whose writings on altruism many people may have encountered and found persuasive even if they've never heard of an EAG. Scott Alexander has 150k Twitter followers. Not to mention, GiveWell has had more than 135K unique donors, and GiveDirectly 147k. And then ~millions of additional unique people have had one-off encounters with cultural artifacts like HPMOR/SSC/Bostrom/etc. While a single video or essay's worth of engagement is not enough for those millions to count under our standard, some fraction of them would have dug further and become familiar with and privately sympathetic to EA (or sufficiently related ideologies), in ways that would be hard to measure through other means. This survey (with a pretty Lizardman-proof methodology imo) estimates that 2.6% of the US adult population has heard of EA up to a "stringent" standard; if even 2% of them also ideologically identify with EA, that brings us to ~100K.
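For concreteness, the back-of-envelope arithmetic behind that last line of evidence is roughly the following (the US adult population figure and the 2% identification rate are my own rough assumptions, not numbers from the survey):

```python
# Rough arithmetic for the survey-based line of evidence.
# Inputs marked "assumed" are illustrative guesses, not survey data.

us_adults = 258_000_000            # assumed approximate US adult population
heard_of_ea = us_adults * 0.026    # survey's "stringent" awareness estimate
identify_rate = 0.02               # assumed fraction of those who also identify with EA

print(f"Heard of EA (stringent):  {heard_of_ea:,.0f}")                   # ~6.7 million
print(f"Heard of EA and identify: {heard_of_ea * identify_rate:,.0f}")   # ~134,000
```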
Thus, drawing on many independent lines of evidence, if I were giving a point estimate of the number of people worldwide who hold a cluster of beliefs we would match to effective altruism, and who have also engaged directly with EA-affiliated organizations or consumed media formally associated with it, it would be >100K.
@AdamK Ok, maybe so. I think it will be up to the judges to estimate the size at resolution. But I maintain that if there are only 100k EAs, this is vanishingly unlikely.