Which beliefs are the most rationalist?
84% · "The Sequences" are worth reading in their entirety
82% · an intuitive understanding of bayes' theorem can help one to make better decisions
78% · The probability that AI destroys humanity or causes incredible suffering before 2100 is >5%.
77% · Rationality is the art of winning; if there's a behavior you think is rational, but it tends to lose, you're not actually being rational. This includes properly modeling other people who are not rational.
75% · I would be very happy if Scott Alexander was president
73% · learning about common biases and trying to account for them in yourself can help one to make better decisions
73% · beliefs which do not influence future predictions (ie. ones that can only explain things in retrospect or ones that don't explain anything) are useless
69% · studying economics can help one to make better decisions
68% · you should never be 100% confident of any belief
66% · we are prone to "become attached to beliefs we may not want", but we can take explicit reasoning steps to mitigate this problem.
63% · studying game theory can help one to make better decisions
63% · I personally try to do whatever I can in order to make better decisions
62% · It is important to vet beliefs such that your beliefs will be true and you will not hold beliefs that are false.
62% · I would be very happy if Eliezer Yudkowsky was president
62% · everyone should do what they can in order to make better decisions
61% · one should change their beliefs when presented with new information.
61% · it's important to distinguish between your model of reality and reality itself
54% · I am smarter than most people.
52% · most people are too often overconfident that they know what someone else is trying to say. It can be helpful in discussions to take a step back and check whether you are actually
37% · Consequentialism and deontology are not at odds. Use deontology because it's easier to work with on a daily basis; use consequentialism so you have the ability to reason about, question, and adjust your deontology, and handle exceptions.

Please add more beliefs.

Once the market ends I will create a poll for each answer, with the options "agree and rationalist", "disagree and rationalist", "agree and not rationalist", "disagree and not rationalist", "neither agree nor disagree and rationalist", and "neither agree nor disagree and not rationalist". For each answer I will find the difference between the percentage of rationalists who agree and the percentage of non-rationalists who agree. I will then normalize these differences such that answers with no difference resolve to 50%, while the answer with the largest difference resolves to YES (or NO if the largest difference is negative for some reason).

This market is meant to be about what makes rationalists and non-rationalists different, not just about any difference between them. For example, rationalists and non-rationalists seem to have different beliefs about AI, and answers about AI are allowed, but I don't think those differences are essential, so I don't think they are the point of the market. Please keep that in mind.


"Emotions are bad and dumb. Smart people try to ignore them." being at 7% is basically the same as "Emotions are not bad or dumb. Smart people don't try to ignore them." being at 93%, isn't it? Do people actually think this is going to be the biggest difference between rationalists and non-rationalists? I am assuming it's actually that people are still confused by the resolution criteria, would anyone be willing to try explaining it better than I can?

bought Ṁ10 Answer #6xyt8m2twf YES

I think you're right that it's just people being confused by the resolution criteria. I expect this one to have near-universal disagreement among both groups, so it should be close to 50%.

The belief "I am smarter than other people" isn't a good belief for a rationalist to hold IMO, it implies that one can gauge their own intelligence in some kind of objective way which is an extraordinary claim. What about smart snap decisions, and especially decisions in a domain you don't have experience? I think it is certainly interesting how many people are sharing tips for personal growth and finding their own limits - this is wonderful! Maybe it does push you above average, at least marginally - now, would it be a good idea to hold the belief "I am smarter than other people"? Absolutely not. Intelligence is clearly a "spiky" profile. If you think you're smarter, you're just looking at the spikes. Someone with that view can delude themselves into thinking their real intelligence is more general than it really is. I'd argue that resisting having strong beliefs about one's relative intelligence is a more rational idea, only comparing yourself to yourself.

It seems like you're objecting more to the vagueness of the statement than the truth of the statement. But although it is a vague statement, it's still definitely true that some people are smarter than others, and it's often easy to realize when this is the case.

People usually don't want to affirm a statement like "I am smarter than most people" because it's seen as arrogant, even when it's obviously true (it has to be true for roughly 50% of people, after all). But rationalists care much more about having true beliefs than about the social acceptability of those beliefs, so they are more likely to affirm it when it is actually true.

That is a good point about the spiky profile. Any rationalist should take this into account when assessing their own intelligence. But I don't think that should lead to a blanket prescription against believing you're smarter than others. If you've accounted for the bias and still think you're smarter, it can be a rational thing to believe.

Rationality is the art of winning; if there's a behavior you think is rational, but it tends to lose, you're not actually being rational. This includes properly modeling other people who are not rational.
bought Ṁ1 Rationality is the a... NO

This is misguided. Rationality is a tool. It is objective-agnostic.

But this statement is objective-agnostic. "Winning" means "achieving your objectives, whatever they are."

There exist objectives that don't require winning as an agent.

If you claim that "winning" here doesn't mean winning as an agent, then the whole statement is vague.

Seems correctly priced to me, since this is a quote from the Sequences, and I expect most rationalists to interpret it as saying just "you should actually achieve your goals, not just strive to act rationally", as that is the context in which the saying was originally introduced.

I have no idea what you mean by "winning as an agent", but "winning" in that statement is synonymous with "achieving your goals". It comes from the Sequences, where it's clear that that's what is meant by "winning".

bought Ṁ40 Answer #6xyt8m2twf YES

This is actually explicitly the opposite of the rationalist belief (I see you have already linked the Straw Vulcan, so I don't have to), but I also don't expect many non-rationalists to believe it, so it should be around 50%.

@KarlK I do think people sometimes go too far in their criticisms of therapy culture. Good submission, Karl.

Thanks. I was just trying to think of negative examples since too many things here are >50%!


https://www.lesswrong.com/tag/straw-vulcan

The probability that AI destroys humanity or causes incredible suffering before 2100 is >5%.
bought Ṁ50 The probability that... YES

A lot of rationalist beliefs are things that most people agree with in principle, but I think this belief is much more common among rationalists than among the general public, due to Eliezer Yudkowsky's influence.

bought Ṁ15 studying game theory... YES

The one caveat is that rationalists understand probability much better than the general public, so when a rationalist says >5% they actually mean it, whereas a random person might agree just because they think <5% means something close to "impossible".

one should change their beliefs when presented with new information.

@MaxKammerman this isn't a belief, it's an action. I don't think a poll asking whether people agree with it would make much sense grammatically; do you have any preference on how it should be rephrased?

EDIT: This answer was changed from "changing ones beliefs based on new information" to "one should change their beliefs when presented with new information" after discussion with @MaxKammerman.

bought Ṁ1 an intuitive underst... NO

Remember that based on the resolution criteria, only one answer will resolve fully YES. So you don't want to bet everything up to a really high percentage.

Technically multiple could fully resolve to YES if there is a tie, and even if there's not a tie it's also possible that many will resolve pretty high if I can get a large enough sample size. I think you made a good point and good bets, but I also think many of the people who bet after you made this comment may have misunderstood what you were saying.

bought Ṁ50 studying economics c... YES

Yeah, looks like people saw my comment but still didn't actually read the resolution criteria

"answers with no difference will resolve to 50% while the answer with the most difference will resolve to YES"

And how will the answers resolve where there is a difference, but it's not the largest one?

So it's going to be normalized using the equation xnorm = (x - xmin)/(xmax - xmin), except that, to remove any incentive for intentionally suggesting bad answers to drive the minimum down, I will assume xmin = -xmax. So you can rewrite the equation as xnorm = (x + xmax)/(2*xmax), or xnorm = x/(2*xmax) + 1/2, where x is the difference between the percentage of rationalists who agree and the percentage of non-rationalists who agree, and xnorm is what I will resolve the answer to. You can see that if x = 0 then xnorm = .5, if x = xmax then xnorm = 1, if x = -xmax then xnorm = 0, if x = xmax/2 then xnorm = .75, if x = xmax*2/3 then xnorm ≈ .83, etc.
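
For anyone who prefers code to formulas, here is a minimal sketch of that normalization; the function name, the fractional units, and the example value of xmax are my own choices, not part of the market:

```python
def resolution_percent(x: float, x_max: float) -> float:
    """Normalize the difference x (fraction of rationalists who agree
    minus fraction of non-rationalists who agree) into a resolution
    percentage, taking x_min to be -x_max."""
    return x / (2 * x_max) + 0.5

# Reproduces the worked examples above, using x_max = 0.3 as a stand-in:
print(resolution_percent(0.0, 0.3))          # 0.5
print(resolution_percent(0.3, 0.3))          # 1.0
print(resolution_percent(-0.3, 0.3))         # 0.0
print(resolution_percent(0.15, 0.3))         # 0.75
print(resolution_percent(0.3 * 2 / 3, 0.3))  # ~0.83
```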

bought Ṁ50 one should change th... NO

What does this mean?

So, I'm going to have a poll where I ask rationalists and non-rationalists whether they agree with each statement. Each statement will then have a value that is the percentage of rationalists who agree with it minus the percentage of non-rationalists who agree with it. In theory this could range from 100% (100% of rationalists agree, 0% of non-rationalists agree) to -100% (0% of rationalists agree, 100% of non-rationalists agree), but I can only resolve between 0% and 100%. One way I could fix that is to take the difference, divide it by 2, and add 50%, which would map all the values into the 0% to 100% range I can resolve to. However, in practice I do not expect any of the statements to get 100% agreement from one group and 0% from the other; I expect the percentage of rationalists who agree with any given statement to be close to the percentage of non-rationalists who agree with it. So rather than dividing by 2, which would make it unlikely that any statement resolves to 1 (or 0), I will choose a different number to divide by so that at least one statement resolves to 1 (or 0). Does that make sense?

sold Ṁ39 one should change th... YES

Let me make sure that I understand you. If the most pro-rationalist statement was 70% rationalist and 30% gen pop, giving a 40% difference, you would multiply it (and all other results) by 2.5 to get the final value. Is that correct?

Short answer: no, that's not correct; you are off by a factor of 2. Longer answer: it depends on what the largest difference is. I think your question assumes that the 40% difference is the largest. Even then, multiplying by 2.5 brings the answers with a 40% difference to a resolution of 100%, which is correct, but it keeps the answers with a 0% difference at a resolution of 0%, which is incorrect, because some answers will probably have a negative difference (for example: "Emotions are bad and dumb. Smart people try to ignore them."). So instead I would multiply by 1.25, bringing it to 50%, and then add another 50%, bringing it to 100%.
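
To put numbers on that, here is the same arithmetic written out; the values are purely illustrative, taken from the 40% example above:

```python
x_max = 0.40                # the largest observed difference in your example
scale = 1 / (2 * x_max)     # = 1.25, not 2.5
print(0.40 * scale + 0.5)   # 1.0   -> the 40%-difference answer resolves YES
print(0.00 * scale + 0.5)   # 0.5   -> a zero-difference answer resolves to 50%
print(-0.10 * scale + 0.5)  # 0.375 -> a negative difference resolves below 50%
```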