Is LessWrong a cult?
Resolved Dec 17 as 47%

Pro: eccentric charismatic leader, poorly-hidden agenda, religious texts.

Con: membership too globally dispersed to show up to White Nights.

Resolves as market % at EoY (I think that's the right way to get a weather vane of user opinion without doing a zero-stakes survey?)

Update 2023-09-18: Resolves as market % at a random date and time in December 2023.

† see discussion @ https://manifold.markets/alexkropivny/is-lesswrong-a-cult#apSvbPyWyDRlV2VH3A0v



I look forward to spending all of 2024 proving this resolution wrong.

[GIF: "Angry Forgiveness" by Disney Channel]

@Ophiuchus Sadly, per earlier discussion, self-resolving % markets and small polls are way too vulnerable to manipulation. I suspect even my final resolution criterion of ending on a random date just encouraged a strategy of betting towards 50%.

A long-term non-resolving "stock" market would serve as an opinion weather vane and not have this problem, but having a prediction market work on longer timelines than the ever-changing community it maps also seems wrong.

tl;dr question was funny but ineffective.


@alexkropivny Do you want to create the stock market or shall I?

@Ophiuchus Go for it!

First google result for cult checklist: https://cultrecovery101.com/cult-recovery-readings/checklist-of-cult-characteristics/

I'd say the majority clearly don't apply. E.g. no preoccupation with recruitment or making money, no mind-numbing techniques, no lack of accountability or justification of normally immoral things.

The rest I'd say also basically don't apply (except for maybe elitism?), but are a bit more subjective.


@DanielFilan Not even elitism applies. The way it defines elitism isn't just "group members think they're smarter than other people" or something like that, which would arguably apply to LessWrong. It says that they claim special, exalted status for themselves, like the leader being the Messiah or having some ordained mission to save humanity. I guess you could argue that LessWrong users do think they have a mission to save humanity, but even that is clearly different from the example given: LessWrongers don't think there's anything special about them that gives them a mission to save humanity. They just think that humanity isn't doing enough to prevent existential risks.

@JosephNoonan I guess it depends how high a bar you have for "special/exalted"


LessWrong almost certainly has some of the elements observed in more traditional cults…

I’m inclined to say that it is a borderline case for the time being. The uncritical worship of the leader (i.e. Yudkowsky) is present, as well as the presence of millenarian or apocalyptic beliefs; on the other hand, there is scant evidence of the imposition of extreme isolation from the rest of society on the part of its members.

One could possibly classify it as a political movement. It clearly aspires to influence public policy on a global scale focusing on AI Risk, for instance…

Resolution changed from Dec 31 to a random time between Dec 1 and Dec 31.

Depends on how you define a "cult" - I'd argue that Yudkowsky definitely has a bit of a cult of personality surrounding him, with some elements of a doomsday cult as well.


@evergreenemily "What does that song have to do with this?" Nothing, it just slaps.

@evergreenemily There is also the distinction between the LessWrong community in a specific geographical area (which has been accused of being a cult) and the one on the internet as a whole. Also, a lot of groups look like cults at a glance from the weirdness factor alone: take the Order of the Dolphin (which later turned into SETI); anyone reading about it would say it sounded pretty weird and cultish.

Even if it is not a cult of personality, it has always sounded to me like a cult of rationality (I have spent very little time on the website though) with some related ideals: https://en.wikipedia.org/wiki/Cult_of_Reason

Resolving as % is an incentive for someone with a lot of mana to come in at the last minute and buy the market to 0 or 100, and an incentive for everyone without a lot of mana to stay out of the market because they don't know which way that will be.

You could instead resolve to a poll, which you can make as a separate question for 10 mana.
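To illustrate the whale threat above, here is a minimal sketch of how a large last-minute bet moves the displayed probability in a simple constant-product market maker. This is a toy model, not Manifold's exact mechanism, and all pool sizes and amounts are hypothetical:

```python
def buy_yes(yes_pool: float, no_pool: float, mana: float):
    """Toy constant-product AMM (y * n = k); not Manifold's exact mechanism.

    The implied YES probability is the NO pool's share of the total pool,
    so buying YES (removing YES shares) pushes the probability up.
    """
    k = yes_pool * no_pool
    yes_pool += mana                 # mana mints equal YES and NO shares
    no_pool += mana
    shares = yes_pool - k / no_pool  # YES shares paid out to restore y * n = k
    yes_pool -= shares
    prob = no_pool / (yes_pool + no_pool)
    return prob, shares

print(buy_yes(100, 100, 10))      # ~55%: a small bet nudges the price
print(buy_yes(100, 100, 10_000))  # ~99.99%: a whale pins it to an extreme
```

In this toy model, pinning a resolves-as-% market at an extreme means the whale's own shares pay out near their purchase price, so the manipulation costs almost nothing.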

@Joshua Thank you for the explanation!

Seems like low-trade-volume %-markets and polls are pretty exposed to manipulation (shout out to @deepfates), which just trades off the mana-whale threat for a clout-whale threat.

Presumably pre-declaring a resolution of N/A if manipulation is detected would just make manipulation more reliable. Hmm. :)

Resolving to the side with more people on it at close is also one way to do it. Still manipulatable, but anything is better than resolving to % imo

@Joshua Does a random resolution date (manually enforced by the creator, or via convoluted automated schemes looking for BTC hashes ending with pre-committed values) solve the problem, or just add back uncertainty through complexity?

The category of polls that read the room without converging on a winning solution right before close seems interesting.
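For concreteness, a minimal sketch of the hash-based scheme hinted at above: pre-commit to a future Bitcoin block height, then derive the resolution day from that block's hash once it is mined. The function name and the placeholder hash are hypothetical:

```python
import hashlib

def resolution_day(block_hash: str, days: int = 31) -> int:
    """Derive a day of the month (1..days) from a pre-committed block hash."""
    # Re-hash so the draw doesn't inherit structural quirks of block hashes,
    # such as their run of leading zeros.
    digest = hashlib.sha256(block_hash.encode()).hexdigest()
    return int(digest, 16) % days + 1

# The creator announces a future block height in advance; after that block
# is mined, anyone can fetch its hash and verify the draw.
committed_block_hash = "placeholder-hash-of-committed-block"  # hypothetical
print(resolution_day(committed_block_hash))
```

The draw is unpredictable in advance and publicly verifiable afterwards, though as noted elsewhere in the thread, an unpredictable snapshot date may still just encourage parking the price near 50%.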

I think that's certainly better!

The other approach for users without a lot of mana is to buy towards 50%. If the last-minute whale is completely unpredictable, then buying towards 50% is positive expected value. In fact, it's better than that, because the last-minute whale can normally make more mana by buying against the "tide", i.e. through 50%.

Either way, there's virtually no incentive for anybody to bet with their true beliefs.
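The arithmetic behind buying towards 50%, as a small sketch. It assumes a YES share in a resolves-as-% market pays out the final percentage, and models the whale as pushing the market to 0 or 100 with equal probability:

```python
def ev_per_yes_share(price: float, p_whale_goes_to_100: float = 0.5) -> float:
    """Expected profit per YES share under a random last-minute whale.

    A YES share pays the final resolution percentage, so if the whale pins
    the market at 1.0 with probability p and at 0.0 otherwise, the expected
    payout per share is simply p.
    """
    return p_whale_goes_to_100 - price

print(ev_per_yes_share(0.35))  # +0.15: buying YES below 50% is +EV
print(ev_per_yes_share(0.65))  # -0.15: above 50%, buy NO instead
```

Under this model, every price other than 50% offers an edge toward 50%, which matches the observation that no one is incentivized to bet their true beliefs.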
