Will an anti-AI cult exist by 2027?
2027 · 77% chance

The cult must be specifically anti-AI. Anti-tech is not specific enough to resolve YES.



I’ve seen techbros on here suggest we create an AI Jesus to start a “meta-religion”… so I’ll bet yes

@Qvex Oops, wrong thread haha

What does this mean? Like a cult that abstains from using AI tech, a la the Amish, or Richard Stallman not using Javascript-enabled websites? Or a cult that prays for the end of AI? Or a QAnon type thing that makes irrational, unsupported assertions about AI already having taken control in institutions?

@RobinGreen any of these

what do you mean, LessWrong clearly exists

@CodeSolder

Me:

Make the case that LessWrong qualifies as a cult.

GPT-4:

While the term "cult" is often used to describe religious groups with extreme beliefs and manipulative practices, it can also be applied to non-religious groups that exhibit certain characteristics. It is important to note that the term can be controversial, and not everyone may agree with the following assessment. However, one could argue that the online rationality community LessWrong qualifies as a cult based on the following points:

Charismatic leader: LessWrong has a central figure, Eliezer Yudkowsky, who is considered an intellectual authority and has a dedicated following. His influence on the community has led to the development of a specific jargon and worldview that can be difficult for outsiders to penetrate.

Exclusivity and elitism: LessWrong members often view themselves as part of an elite group of rational thinkers, which can lead to an "us vs. them" mentality. This exclusivity can create an environment where members feel superior to those outside the community and view dissenting opinions with suspicion.

Ideological indoctrination: The LessWrong community has a strong focus on a specific set of ideas and beliefs, such as Bayesian reasoning, artificial intelligence, and effective altruism. These ideas are often presented as absolute truths, and members are encouraged to internalize them as part of their identity.

Social isolation: Members of the LessWrong community often interact primarily with other like-minded individuals, which can lead to social isolation from those outside the group. This insular environment can reinforce the group's beliefs and practices, making it difficult for members to question or critically evaluate the group's ideas.

Control of information: The LessWrong community has a strong focus on rationality and skepticism, which can create an environment where dissenting opinions are dismissed as irrational or misguided. This can lead to a controlled flow of information, where members are only exposed to ideas that support the group's core beliefs.

High demand for commitment: The LessWrong community often demands a high level of commitment from its members, such as attending workshops, participating in online discussions, and adopting specific behaviors and practices. This can make it difficult for members to disengage from the group, even if they begin to question its teachings.

It is important to note that these points are merely an argument for considering LessWrong as a cult and do not necessarily represent an absolute truth. Additionally, these characteristics can vary in degree and may not be present in every member's experience within the community.

LW is obviously not a cult any more than "the trans movement" or "Christianity" or "the government" or "4chan" are cults. In general I have a hard time believing that these sorts of "anyone who disagrees with me is brainwashed" takes are made in good faith, but if there are any serious questions about this, I'm happy to elaborate further.

@IsaacKing my comment was intended to be an obvious joke, sorry if it wasn't

@CodeSolder Ah, apologies. I probably should have picked up on that from context.
