
Please add more beliefs.
Once the market ends, I will create a poll for each belief with six options: "agree and rationalist", "disagree and rationalist", "agree and not rationalist", "disagree and not rationalist", "neither agree nor disagree and rationalist", and "neither agree nor disagree and not rationalist". For each belief I will then find the difference between the percentage of rationalists who agree and the percentage of non-rationalists who agree. Finally, I will normalize the differences so that an answer with no difference resolves to 50%, while the answer with the largest difference resolves to YES (or NO, if the largest difference is somehow negative).
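The normalization above can be sketched in a few lines of Python. The beliefs and percentages here are made-up placeholders to show the arithmetic, not actual poll results:

```python
# Hypothetical agreement percentages per belief:
# answer -> (% of rationalists who agree, % of non-rationalists who agree)
agree_pct = {
    "belief A": (80.0, 40.0),
    "belief B": (55.0, 55.0),
    "belief C": (30.0, 50.0),
}

# Signed difference: rationalist agreement minus non-rationalist agreement.
diffs = {k: r - n for k, (r, n) in agree_pct.items()}

# Normalize so that a zero difference resolves to 50% and the
# largest-magnitude difference resolves to 100% (YES), or to 0% (NO)
# if that extreme difference happens to be negative.
scale = max(abs(d) for d in diffs.values())
resolution = {k: 50 + 50 * d / scale for k, d in diffs.items()}
```

Under these placeholder numbers, "belief A" (the largest gap) resolves to 100%, the no-difference "belief B" to 50%, and "belief C" to 25%, since its gap runs the opposite way at half the magnitude.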
This market is meant to be about what makes rationalists and non-rationalists different, not just any differences between them. For example, rationalists and non-rationalists seem to hold different beliefs about AI, and answers about AI are allowed, but I don't think those differences are essential to the categories, so I don't think they are the point of the market. Please keep that in mind.
🏅 Top traders
# | Name | Total profit
---|---|---
1 | | Ṁ418
2 | | Ṁ380
3 | | Ṁ162
4 | | Ṁ63
5 | | Ṁ36
It's worth noting that very few non-rationalists took part, usually 1-5 per poll. For quite a few polls I was the only data point, including the ones that yielded the most extreme positive and negative results.
In addition, both I and the other non-rationalist Manifold users are disproportionately rationalist-adjacent relative to the general population, which may further obscure the findings.
In your opinion, should I class myself as a rationalist? I do support the idea that people should use insights from formal logic, statistics, economics, cognitive science, etc. to improve their individual reasoning, but so far I have only had a chance to read a handful of the dozens of LessWrong posts that would actually explain how to do so myself, so my understanding is fairly poor. In my own poll I classed this as "maybe rationalist", but that isn't an option in yours.
I think for this market it's more important whether you would consider yourself a rationalist than whether I (or anyone else) would consider you one. The main purpose of this market is to collect data on the differences between rationalists and non-rationalists so that I can form a more complete concept of what the category means; if I have to define it in order to collect that data, that kind of defeats the whole point.
https://manifold.markets/bluerat/the-sequences-are-worth-reading-in?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/an-intuitive-understanding-of-bayes?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/the-probability-that-ai-destroys-hu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/rationality-is-the-art-of-winning-i?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-would-be-very-happy-if-scott-alex?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/learning-about-common-biases-and-tr?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/beliefs-which-do-not-influence-futu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/you-should-never-be-100-confident-o?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/studying-economics-can-help-one-to?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-would-be-very-happy-if-eliezer-yu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/we-are-prone-to-become-attached-to?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/it-is-important-to-vet-beliefs-such?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/studying-game-theory-can-help-one-t?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-personally-try-to-do-whatever-i-c?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/everyone-should-do-what-they-can-in?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-am-smarter-than-most-people?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/most-people-are-too-often-overconfi?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/emotions-are-bad-and-dumb-smart-peo?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/one-should-change-their-beliefs-whe?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/consequentialism-and-deontology-are?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/its-important-to-distinguish-betwee?r=Ymx1ZXJhdA
@traders please vote
@traders I was hoping to get at least 5 rationalists and 5 non-rationalists voting on every poll. Currently there's only 1 poll without at least 5 rationalist votes, but almost all of them still lack 5 non-rationalist votes. If you bet on this market and aren't a rationalist, please vote on all the polls.
"Emotions are bad and dumb. Smart people try to ignore them." being at 7% is basically the same as "Emotions are not bad or dumb. Smart people don't try to ignore them." being at 93%, isn't it? Do people actually think this is going to be the biggest difference between rationalists and non-rationalists? I assume people are actually still confused by the resolution criteria; would anyone be willing to try explaining it better than I can?
The belief "I am smarter than other people" isn't a good belief for a rationalist to hold, IMO; it implies that one can gauge one's own intelligence in some kind of objective way, which is an extraordinary claim. What about smart snap decisions, and especially decisions in a domain you don't have experience in? I think it is certainly interesting how many people are sharing tips for personal growth and finding their own limits; this is wonderful! Maybe it does push you above average, at least marginally. But would it be a good idea to hold the belief "I am smarter than other people"? Absolutely not. Intelligence is clearly a "spiky" profile: if you think you're smarter, you're just looking at the spikes. Someone with that view can delude themselves into thinking their real intelligence is more general than it really is. I'd argue that resisting strong beliefs about one's relative intelligence is the more rational approach, and that one should only compare oneself to oneself.
It seems like you're objecting more to the vagueness of the statement than the truth of the statement. But although it is a vague statement, it's still definitely true that some people are smarter than others, and it's often easy to realize when this is the case.
People usually don't want to affirm a statement like "I am smarter than most people" because it's seen as arrogant, even when it's obviously true (it has to be true for roughly half of people, after all). But rationalists care much more about having true beliefs than about the social acceptability of those beliefs, so they are more likely to affirm it when it is actually true.
That is a good point about the spiky profile. Any rationalist should take this into account when assessing their own intelligence. But I don't think that should lead to a blanket prescription against believing you're smarter than others. If you've accounted for the bias and still think you're smarter, it can be a rational thing to believe.
@KarlK I do think people sometimes go too far in their criticisms of therapy culture. Good submission, Karl.
Thanks. I was just trying to think of negative examples since too many things here are >50%!
A lot of rationalist beliefs are things that most people agree with in principle, but I think this belief is much more common among rationalists than the general public due to Eliezer Yudkowsky's influence.
@MaxKammerman this isn't a belief, it's an action. I don't think a poll about whether people agree with it would make much sense grammatically, do you have any preference on how it should be rephrased?
EDIT: This answer was changed from "changing ones beliefs based on new information" to "one should change their beliefs when presented with new information" after discussion with @MaxKammerman.