Which beliefs are the most rationalist?
resolved Aug 6
Resolved YES: we are prone to "become attached to beliefs we may not want", but we can take explicit reasoning steps to mitigate this problem.
Resolved 82%: most people are too often overconfident that they know what someone else is trying to say. It can be helpful in discussions to take a step back and check whether you are actually
Resolved 82%: "The Sequences" are worth reading in their entirety
Resolved 76%: beliefs which do not influence future predictions (i.e. ones that can only explain things in retrospect, or ones that don't explain anything) are useless
Resolved 76%: you should never be 100% confident of any belief
Resolved 75%: Rationality is the art of winning; if there's a behavior you think is rational, but it tends to lose, you're not actually being rational. This includes properly modeling other people who are not rational.
Resolved 69%: I would be very happy if Scott Alexander was president
Resolved 60%: Emotions are bad and dumb. Smart people try to ignore them.
Resolved 59%: The probability that AI destroys humanity or causes incredible suffering before 2100 is >5%.
Resolved 56%: I personally try to do whatever I can in order to make better decisions
Resolved 50%: learning about common biases and trying to account for them in yourself can help one to make better decisions
Resolved 50%: studying game theory can help one to make better decisions
Resolved 50%: it's important to distinguish between your model of reality and reality itself
Resolved 50%: Consequentialism and deontology are not at odds. Use deontology because it's easier to work with on a daily basis; use consequentialism so you have the ability to reason about, question, and adjust your deontology, and handle exceptions.
Resolved 50%: It is important to vet beliefs such that your beliefs will be true and you will not hold beliefs that are false.
Resolved 44%: studying economics can help one to make better decisions
Resolved 44%: I am smarter than most people.
Resolved 40%: one should change their beliefs when presented with new information.
Resolved 36%: everyone should do what they can in order to make better decisions
Resolved 35%: I would be very happy if Eliezer Yudkowsky was president

Please add more beliefs.

Once the market ends I will create a poll for each answer, with the options "agree and rationalist", "disagree and rationalist", "agree and not rationalist", "disagree and not rationalist", "neither disagree nor agree and rationalist", and "neither disagree nor agree and not rationalist". For each answer I will take the difference between the percentage of rationalists who agree and the percentage of non-rationalists who agree. I will then normalize these differences so that an answer with no difference resolves to 50%, while the answer with the largest difference resolves to YES (or NO, if the largest difference is negative for some reason).
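
In case the normalization step is unclear, here is a minimal sketch of the rule as I've described it (Python; the helper name and the example numbers are illustrative, not actual resolution code):

```python
# Sketch of the resolution rule described above. Input: for each answer,
# the gap between the percentage of rationalists who agree and the
# percentage of non-rationalists who agree, as fractions in [0, 1].
def resolution_values(gaps: dict[str, float]) -> dict[str, float]:
    scale = max(abs(g) for g in gaps.values())
    if scale == 0:
        # No answer separates the groups at all: everything resolves to 50%.
        return {answer: 0.5 for answer in gaps}
    # A zero gap -> 50%; the largest absolute gap -> YES (1.0) or NO (0.0).
    return {answer: 0.5 + 0.5 * g / scale for answer, g in gaps.items()}

# Hypothetical example:
gaps = {"reads the Sequences": 0.40, "likes pizza": 0.00, "trusts vibes": -0.20}
print(resolution_values(gaps))
# {'reads the Sequences': 1.0, 'likes pizza': 0.5, 'trusts vibes': 0.25}
```

Note that under this rule only the sign and relative size of the gaps matter, not the raw percentages themselves.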

This market is meant to be about what makes rationalists and non-rationalists different, not just about their incidental differences. For example, rationalists and non-rationalists seem to have different beliefs about AI, and answers about AI are allowed, but I don't think those differences are essential to the category, so I don't think they are the point of the market. Please keep that in mind.


Worth noting that there were very few non-rationalists taking part, usually 1-5 per poll, with me being the only data point for quite a few, including both of the ones yielding the most extreme positive and negative results.

In addition, I and the other non-rationalist Manifold users are disproportionately rationalist-adjacent relative to the general population, which may further obscure the findings.

In your opinion, should I class myself as a rationalist or not? I support the movement's idea that people should use insights from formal logic, statistics, economics, cognitive science, etc. to improve their individual reasoning, but I have so far only had a chance to read a handful of the dozens of LessWrong posts that would actually explain how to do so, so my understanding is fairly poor. In my poll I classed this as "maybe rationalist", but that isn't an option in yours.

I think for this market it's more important whether you would consider yourself a rationalist than whether I (or anyone else) would consider you one. The main purpose of this market is to collect data on the differences between rationalists and non-rationalists so that I can have a more complete concept of what the category means; if I have to define the category in order to collect that data, that kind of defeats the whole point.

fair enough

https://manifold.markets/bluerat/the-sequences-are-worth-reading-in?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/an-intuitive-understanding-of-bayes?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/the-probability-that-ai-destroys-hu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/rationality-is-the-art-of-winning-i?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-would-be-very-happy-if-scott-alex?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/learning-about-common-biases-and-tr?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/beliefs-which-do-not-influence-futu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/you-should-never-be-100-confident-o?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/studying-economics-can-help-one-to?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-would-be-very-happy-if-eliezer-yu?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/we-are-prone-to-become-attached-to?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/it-is-important-to-vet-beliefs-such?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/studying-game-theory-can-help-one-t?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-personally-try-to-do-whatever-i-c?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/everyone-should-do-what-they-can-in?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/i-am-smarter-than-most-people?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/most-people-are-too-often-overconfi?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/emotions-are-bad-and-dumb-smart-peo?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/one-should-change-their-beliefs-whe?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/consequentialism-and-deontology-are?r=Ymx1ZXJhdA
https://manifold.markets/bluerat/its-important-to-distinguish-betwee?r=Ymx1ZXJhdA
@traders please vote

Also, there's a high chance I made a typo or some other similar error in at least one of these polls; please let me know if you find one so I can correct it before it becomes a bigger problem.

For long items, the titles are truncated and the market descriptions don't contain the full text. Please add the full text to the market descriptions.

@traders I was hoping to get at least 5 rationalists and 5 non-rationalists to vote on every poll. Currently only one poll lacks at least 5 rationalist votes, but almost all of them still lack 5 non-rationalist votes. If you bet on this market and aren't a rationalist, please vote on all the polls.

Emotions are bad and dumb. Smart people try to ignore them.

"Emotions are bad and dumb. Smart people try to ignore them." being at 7% is basically the same as "Emotions are not bad or dumb. Smart people don't try to ignore them." being at 93%, isn't it? Do people actually think this is going to be the biggest difference between rationalists and non-rationalists? I am assuming it's actually that people are still confused by the resolution criteria, would anyone be willing to try explaining it better than I can?

bought Ṁ10 Emotions are bad and... YES

I think you're right that it's just people being confused by the resolution criteria. I expect this one to have near-universal disagreement among both groups, so it should be close to 50%.

The belief "I am smarter than other people" isn't a good belief for a rationalist to hold IMO, it implies that one can gauge their own intelligence in some kind of objective way which is an extraordinary claim. What about smart snap decisions, and especially decisions in a domain you don't have experience? I think it is certainly interesting how many people are sharing tips for personal growth and finding their own limits - this is wonderful! Maybe it does push you above average, at least marginally - now, would it be a good idea to hold the belief "I am smarter than other people"? Absolutely not. Intelligence is clearly a "spiky" profile. If you think you're smarter, you're just looking at the spikes. Someone with that view can delude themselves into thinking their real intelligence is more general than it really is. I'd argue that resisting having strong beliefs about one's relative intelligence is a more rational idea, only comparing yourself to yourself.

It seems like you're objecting more to the vagueness of the statement than the truth of the statement. But although it is a vague statement, it's still definitely true that some people are smarter than others, and it's often easy to realize when this is the case.

People usually don't want to affirm a statement like "I am smarter than most people," because it's seen as arrogant, even when it's obviously true (it has to be true for about half of people, after all). But rationalists care much more about having true beliefs than about the social acceptability of those beliefs, so they are more likely to affirm it when it is actually true.

That is a good point about the spiky profile. Any rationalist should take this into account when assessing their own intelligence. But I don't think that should lead to a blanket prescription against believing you're smarter than others. If you've accounted for the bias and still think you're smarter, it can be a rational thing to believe.

Rationality is the art of winning; if there's a behavior you think is rational, but it tends to lose, you're not actually being rational. This includes properly modeling other people who are not rational.
bought Ṁ1 Rationality is the a... NO

This is misguided. Rationality is a tool; it is objective-agnostic.

But this statement is objective-agnostic. "Winning" means "achieving your objectives, whatever they are."

There exist objectives that don't require winning as an agent.

If you claim "winning" here doesn't mean winning as an agent, then the whole statement is vague.

Seems correctly priced to me, since this is a quote from the sequences, and I expect most rationalists to interpret it as saying just “you should actually achieve your goals, not just strive to act rationally”, as that is the context in which the saying was originally introduced.

I have no idea what you mean by "winning as an agent", but "winning" in that statement is synonymous with "achieving your goals". It comes from the sequences, where it's clear that that's what is meant by "winning".

Emotions are bad and dumb. Smart people try to ignore them.
bought Ṁ40 Emotions are bad and... YES

This is actually explicitly the opposite of the rationalist belief (I see you have already linked the Straw Vulcan, so I don't have to), but I also don't expect many non-rationalists to believe it, so it should be around 50%.

Emotions are bad and dumb. Smart people try to ignore them.

@KarlK I do think people sometimes go too far in their criticisms of therapy culture. Good submission, Karl.

Thanks. I was just trying to think of negative examples since too many things here are >50%!


https://www.lesswrong.com/tag/straw-vulcan

The probability that AI destroys humanity or causes incredible suffering before 2100 is >5%.
bought Ṁ50 The probability that... YES

A lot of rationalist beliefs are things that most people agree with in principle, but I think this belief is much more common among rationalists than among the general public due to Eliezer Yudkowsky's influence.

bought Ṁ15 studying game theory... YES

The one caveat is that rationalists understand probability much better than the general public, so when a rationalist says >5% they actually mean it, whereas a random person might agree just because they think <5% means something close to "impossible".

one should change their beliefs when presented with new information.

@MaxKammerman this isn't a belief; it's an action. I don't think a poll asking whether people agree with it would make much sense grammatically. Do you have any preference on how it should be rephrased?

EDIT: This answer was changed from "changing ones beliefs based on new information" to "one should change their beliefs when presented with new information" after discussion with @MaxKammerman.
