Is it ever rational to assert that someone is wrong despite not being personally able to explain exactly why/how?
Yes
No
Other [Explain in comments]

E.g. you wander into a debate with a flat Earther, and they present you with a series of arguments that you personally don't know how to debunk. Are you required by principles of rationality to return to a temporary state of agnosticism until you can determine the correct rebuttal, or might there be rational justifications to nevertheless assert that they are wrong?


Two people's unsupported claims can have different weights of evidence. You'd be correct to update harder on people who often guess correctly based on vibes, or who have more domain knowledge, or who seem generally more sane.
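To put rough numbers on that (toy numbers of my own, nothing rigorous), the "weight of evidence" a bare assertion carries is just the log of the likelihood ratio between the speaker asserting it when it's true and asserting it when it's false:

```python
import math

def weight_of_evidence_db(p_assert_if_true, p_assert_if_false):
    """Weight of evidence in decibels: 10 * log10 of the likelihood ratio
    P(they assert the claim | it's true) / P(they assert it | it's false)."""
    return 10 * math.log10(p_assert_if_true / p_assert_if_false)

# Someone who tends to be right ~90% of the time on claims like this:
print(weight_of_evidence_db(0.9, 0.1))    # ~9.5 dB toward the claim

# Someone barely better than a coin flip:
print(weight_of_evidence_db(0.55, 0.45))  # ~0.9 dB, i.e. almost nothing
```

Same sentence out of both mouths, very different update.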

I'm getting a vague memory of a recent-ish Scott Alexander Astral Codex Ten blog post about epistemology that made an argument like that about two people saying simply "I don't know" about the existence of God.

Actually, despite all past experience, I'm going to try to find what I'm referring to on that website.

Aha! Success.

Likewise, is there a God? Maybe you ask the world’s top philosopher of religion, who has spent his entire life thinking about this question, and he says “I’m not sure”. Then you ask a random teenager who has given it two seconds’ thought, and she also says “I’m not sure”. Neither of these people has done anything wrong. Their identical answers conceal a vastly different amount of thought that’s gone into the question. But it’s your job to ask each person how much thought they put in, not the job of the English language to design a way of saying the words “I’m not sure” that communicates level of effort and expertise.

https://www.astralcodexten.com/p/in-continued-defense-of-non-frequentist

This isn't exactly the subject of this question; for that, I'll have to find that one time Eliezer talked about shoelaces and police detectives and Bayesian evidence...

We need some kinda service that takes "I vaguely remember this person saying something about X, Y and Z" and returns "here's the link/page number."

If your eyes and brain work correctly, your beliefs will end up entangled with the facts. Rational thought produces beliefs which are themselves evidence.

If your tongue speaks truly, your rational beliefs, which are themselves evidence, can act as evidence for someone else. Entanglement can be transmitted through chains of cause and effect—and if you speak, and another hears, that too is cause and effect. When you say “My shoelaces are untied” over a cellphone, you’re sharing your entanglement with your shoelaces with a friend.

Therefore rational beliefs are contagious, among honest folk who believe each other to be honest. And it’s why a claim that your beliefs are not contagious—that you believe for private reasons which are not transmissible—is so suspicious. If your beliefs are entangled with reality, they should be contagious among honest folk.

https://www.lesswrong.com/posts/6s3xABaXKPdFwA3FS/what-is-evidence

And from the next post in the sequence:

Suppose that your good friend, the police commissioner, tells you in strictest confidence that the crime kingpin of your city is Wulky Wilkinsen. As a rationalist, are you licensed to believe this statement? Put it this way: if you go ahead and insult Wulky, I’d call you foolhardy. Since it is prudent to act as if Wulky has a substantially higher-than-default probability of being a crime boss, the police commissioner’s statement must have been strong Bayesian evidence.

Our legal system will not imprison Wulky on the basis of the police commissioner’s statement. It is not admissible as legal evidence. Maybe if you locked up every person accused of being a crime boss by a police commissioner, you’d initially catch a lot of crime bosses, and relatively few people the commissioner just didn’t like. But unrestrained power attracts corruption like honey attracts flies: over time, you’d catch fewer and fewer real crime bosses (who would go to greater lengths to ensure anonymity), and more and more innocent victims.

This does not mean that the police commissioner’s statement is not rational evidence. It still has a lopsided likelihood ratio, and you’d still be a fool to insult Wulky. But on a social level, in pursuit of a social goal, we deliberately define “legal evidence” to include only particular kinds of evidence, such as the police commissioner’s own observations on the night of April 4th. All legal evidence should ideally be rational evidence, but not the other way around. We impose special, strong, additional standards before we anoint rational evidence as “legal evidence.”

https://www.lesswrong.com/s/zpCiuR4T343j9WkcK/p/fhojYBGGiYAFcryHZ
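To unpack "lopsided likelihood ratio" with some made-up numbers (mine, not Eliezer's): in odds form, the posterior odds are just the prior odds times the likelihood ratio.

```python
# Made-up numbers, just to illustrate the odds-form update.
prior_odds = 1 / 10_000        # almost nobody is the city's crime kingpin

# The commissioner is far more likely to name Wulky if Wulky really is
# the kingpin than if he isn't: a lopsided likelihood ratio.
p_named_if_kingpin = 0.5
p_named_if_innocent = 0.001
likelihood_ratio = p_named_if_kingpin / p_named_if_innocent   # 500

posterior_odds = prior_odds * likelihood_ratio                # 0.05
posterior_prob = posterior_odds / (1 + posterior_odds)
print(round(posterior_prob, 3))   # ~0.048
```

Still probably not the kingpin, but hundreds of times more likely than the base rate: plenty of reason not to insult him, and nowhere near enough to convict him.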

"Rational" to me is not something humans DO. It's a way of describing the ideal, abstract, mathematical shape of how beliefs shift along with evidence.

The question I think you're getting at is "should we give 'ah yes, you are very rational' social credit to people who can't share any specific details of why they believe something?"

And I say: "I don't care about the praise or social credit! If someone says 'I've got a feeling this is true' and that statement surprises me, will I do better in my predictions if I update on the noises that person just made - and how heavily should I weight that evidence?"

If they just flip a coin before saying whether they do or don't believe something, I would put earplugs in whenever talking to them. If they're literally always wrong on binary propositions like truth/falsehood, I'll listen as closely as if they were an Oracle, since that's either a hugely unlikely coincidence or they have a powerful engine whose transmission they've hooked up backwards.
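Quick sanity check on those two cases, with toy numbers of my own:

```python
def update(prior, p_assert_if_true, p_assert_if_false):
    """P(claim is true | they asserted it), via Bayes' rule."""
    a = p_assert_if_true * prior
    b = p_assert_if_false * (1 - prior)
    return a / (a + b)

prior = 0.5

# Coin-flipper: asserts the claim with the same probability either way,
# so the assertion carries zero information.
print(update(prior, 0.5, 0.5))    # 0.5, no update; earplugs lose you nothing

# Reliably wrong: they almost only assert claims that are false.
# Flip the sign of whatever they say and they're as useful as an oracle.
print(update(prior, 0.01, 0.99))  # 0.01, so believe the opposite
```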

To make the word "rational" feel more intuitive, there's an excellent more recent piece by Eliezer about Solomonoff Induction... I'll just go find that as well...

https://www.lesswrong.com/posts/EL4HNa92Z95FKL9R2/a-semitechnical-introductory-dialogue-on-solomonoff-1

The format of this is something that tickles my brain, and it's a good way to kinda get into the mindset of "what is all this talk about rationality and ideal agents and abstract mathematics for which we have internally only distorted reflections of scattered shards?"

And, in the service of dissolving this question, think not of whether to praise the speaker for their reasoning skills, but how much you should change your mind based on what noises that specific part of reality makes under what conditions.

The question in the title and the question in the text are opposites. To be clear, I answered according to the title. So my 'No' means: No, it's never rational to assert that someone is wrong despite not being able to explain why.

That doesn't mean I can't make predictions about what I would come to believe if I examined evidence or arguments I haven't yet examined. I can't say that the flat earther is wrong before I can rebut them, but I can predict that I will be able to do so, and I can make decisions based on that prediction, so I don't have to act as if the earth could just as well be flat as round every time a flat earther presents me with a new argument. I think it would be very intellectually dishonest to call the other person wrong based on such a prediction, however.
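To put toy numbers of my own on why the prediction itself does real work: by conservation of expected evidence, my current credence already has to equal my expected credence after I go looking for the rebuttal.

```python
# Toy numbers: before digging into the flat earther's new argument,
# I expect that with probability 0.999 I'll find the rebuttal (and my
# credence that the earth is round stays ~0.99999), and with probability
# 0.001 I won't (and it drops to, say, 0.9).
p_find_rebuttal = 0.999
credence_if_rebutted = 0.99999
credence_if_not = 0.9

expected_credence = (p_find_rebuttal * credence_if_rebutted
                     + (1 - p_find_rebuttal) * credence_if_not)
print(round(expected_credence, 5))   # ~0.99989

# My credence right now should already sit at ~0.99989, so I can act on
# the prediction without having the rebuttal in hand, which is still not
# the same as declaring the specific argument refuted.
```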

@J89502 Yeah, I noticed the discrepancy between the title question and description question after you voted, and I've edited the description question so that it is no longer a simple Yes/No.