Will Effective Altruism noticeably segregate into multiple movements within this decade?

Resolves subjectively.

What this might look like: a noticeable split into, e.g., longtermist and neartermist camps, with a large (>10%) minority of members of either group ceasing to identify as part of the broader EA movement.

Open to hashing out more objective criteria.


I hope so! More fracturing means more spread over idea space, which means the unambiguously good parts of EA can spread into more movements, and those movements can debate the things that need debating!

As far as I can tell, EA is already somewhat fractured into different sub-movements based on what they consider to be effective cause areas. Most notably, there's a lot of disagreement on how strongly to weigh animal suffering against that of humans, and also on how seriously to rate the existential threat posed by AI. There are probably other less obvious splits too. They're all still 'EA', but concerns downstream of the central idea pull them apart, and in the future this may intensify to the point of open tribal hostility. Would that count?

@AngolaMaldives Yeah, I think this is spot on; I even remember a post on the EA Forum not long ago addressing what someone perceived as an unfairly dismissive attitude towards nonhuman animal welfare coming from more longtermist-adjacent folks.

@AngolaMaldives This seems like it'd count. (Also, I'm hoping that EA's overlap with the rationality-community could lower the propensity to tribalism-related problems. Then again, if/when EA recruits more "mainstream"-ish people at universities, I could see conflicts cropping up. I'm not sure e.g. what % of EA groups have a high "having read the Sequences" penetration.)

@CadeMataya Of course the long-term/near-term distinction is actually somewhat orthogonal to the human/nonhuman animal distinction. You could prioritize reducing animal suffering over the very long-term future. Few people seem to have that combination of views, but I think that is mostly because suffering-focused longtermists tend to think most future sentience will be digital.

@NicholasKross "Rationalism" is a tribe! A strong one as far as I can tell. They have group houses!