closes Jan 1, 2028
Will the effective altruism movement be more popular at close (2027) than as of market creation (2022)?

Resolves according to my subjective judgement. I will welcome stakeholder input when resolving the question but reserve the right to resolve contrary to the opinion of market participants if necessary. I reserve the right to bet on this market but tentatively expect to stop betting in the last two years (2026-2027).

"Popular" being a loosely defined combination of public attention / resources, total adherents, and references in social discourse.

Resolves as N/A if there doesn't seem to be a clear answer at close.

Related markets from me on subjective popularity:

Related markets from me on effective altruism:

Em of the Night ☑️

I think if EA continues being a popular affiliation for too long, the movement will in fact have failed at its own goals. The success condition is: "Nobody 'identifies as an EA', you weirdo, they just assess the effectiveness of various humanitarian approaches like usual."

Daniel Eth bought Ṁ100 of YES

Becoming more optimistic – EA has recently had scandal after scandal after scandal, yet we seem to be weathering the storm okay, and group leaders at elite universities say recruiting of new members is still strong. Add to that the potential to make inroads in countries outside US/UK, and if AI becomes a bigger deal then that may make some EA ideas more reasonable to a greater population.

Manifold in the Wild Bot

Manifold in the wild: A Tweet by Carson Gale

@KellerScholl @thephilosotroll

Elliot Davies is predicting YES at 75%

Beginning to update down after this recent barrage of bad coverage. I can imagine this becoming a persistent trend.

Flawless Train is predicting NO at 75%

This is not a fair-minded article, but it's Wired:

"This philosophy—supported by tech figures like Sam Bankman-Fried—fuels the AI research agenda, creating a harmful system in the name of saving humanity"

Martin Randall is predicting NO at 75%

@FlawlessTrain A really weird article, given all the EA commentary I've read that has the same concern about arms-race dynamics causing AI companies to take risks.

Flawless Train is predicting NO at 75%

The risk to EA over the long term is the persistent philosophical slip from "effectiveness" toward "the ends justify the means". I don't know how that can be fixed under the brand of "EA" and I believe that drift will continue to be a cause of critical failures, such as FTX.

Elliot Davies bought Ṁ50 of YES

Most community building efforts seem to be mildly to wildly successful, and early EA-aligned people have gone on to achieve pretty cool things (Our World in Data, normalisation of challenge trials, etc.). If this track record continues, I would expect EA to keep attracting talented people.

I am often concerned about some drawbacks of community building, but predicting "No" is essentially shorting a young company's stock despite a good track record of innovation and a low stock price. I would give <15% to No.

Question to the OP: how would you resolve this if EA splintered into multiple groups?

Carson Gale is predicting YES at 64%

@ElliotDavies I could be convinced, but my intuition is that the resolution will depend on groups that specifically label themselves EAs. I.e. a group called the Heroin Rat Club that cites EA principles would not count unless they also called themselves EAs.

Elliot Davies is predicting YES at 73%

@CarsonGale What if the founder/founders were previously prominent members/organisers of a local EA group?

Carson Gale is predicting YES at 68%

@ElliotDavies The question involves EA specifically, so any other groups are only relevant to the extent they are affiliated with EA.

Elliot Davies is predicting YES at 71%

@CarsonGale I take this reply to mean that, upon resolution, splinter groups taking EA members with them would effectively reduce the size of EA.

Carson Gale is predicting YES at 80%

@ElliotDavies If they do not self-affiliate with EA, that would be correct. E.g., a rationalist who doesn't identify with EA would not count; even though rationalists are associated with EA, you wouldn't just count all rationalists toward the EA total.

Elliot Davies is predicting YES at 80%

@CarsonGale I think of rationalists more as convergent evolution than as descendants of EA.

To be clear, you're free to resolve as you wish; I was just considering people who came from EA but splintered off to still be EA.

At the moment, resolution seems sensitive to these issues, in addition to name-changing issues.

Carson Gale is predicting YES at 75%

@ElliotDavies Sorry for any confusion; hopefully at this point it's clear. This market is intended to cover self-identifying EAs.

Martin Randall

Maybe specify which date in 2022 is the baseline? Jan 1st, 2022?

StevenK

@MartinRandall Under "Market Details", it says:

Market created Oct 21, 2022, 8:29:12 AM MST

So I assume that's the baseline.

Carson Gale

@StevenK That's correct.

mvdm

@CarsonGale So the baseline is before the FTX/Alameda crisis.

Gigacasting

Noticing that Marc Andreessen, David Sacks, and AGM had some things to say, that Tyler Cowen is still a statist, and that the EA forum has distanced itself from the entire mission on which SBF set out: to profit at all costs and give it to stuff he kinda liked.

EA is a good meme, being as it has nothing to do with either effectiveness or altruism (even the original malaria stuff neglected that maybe an overpopulated world with IQ collapsing due to dysgenic reproductive trends exacerbated by valuing all lives equally isn’t exactly a world capable of great things much less survival, and much of the “AI risk” stuff is a clownish proto-religion)

Prediction: obviously the EA movement will grow, as all universalist movements do (cf. communism, wokism, and all the rest), but the critics who do not subscribe to rigged left-wing morality masquerading as the "highest good" will be more vocal.

Under the criteria listed here, that's going to be "more popular," but no longer as naive, above reproach, or able to be as cringe as this.

Gigacasting

Further prediction: new thought leaders emerge, and causes continue to move from the lowest tiers of Maslow's hierarchy of needs to higher ones (survival first, recently security, next affiliation/belonging) until the movement rediscovers and converges on conventional morality that has stood the test of time.

(If you want to be one step ahead, look at middle-tier causes, as the "survival" stuff is passé, the AI apocalypse and Gretaism are no longer cool, and frankly social trust could use some repairing anyway.)

Carson Gale bought Ṁ50 of NO

Sad to say but I think this probability has fallen...