Wednesday 16 March 2022

The Nameless Fallacy

    I haven't seen this particular fallacy described elsewhere. It happens often enough that I was thinking it ought to have a name, and I was thinking of calling it the fallacy fallacy, but that refers to a different fallacy, and it seems to me it's better to leave it unnamed, in light of its nature.

    So what is it? It's the belief that calling out the name of a fallacy is an argument. "That's a straw man" or "That's a No True Scotsman" are things you'll hear all the time in debates, but it's almost never a good idea to use the name of a fallacy in an argument, for several reasons.

    First, it's very often misused. Knowing the names of fallacies is no guarantee that you actually know what makes them invalid. Many people seem to think that ad hominem just refers to name-calling, for example, or will call out "No True Scotsman" if you try to define a term in a way they don't like. 

    Second, it's lazy. Even if you do happen to properly understand what the fallacy is and why it's a fallacy, simply naming it is seldom an efficient shortcut to making your opponent's error clear, in part because (remember the first reason) there's a good chance your opponent either doesn't know the fallacy or misunderstands it, in which case you're wasting words.

    Third, it very often leads to completely unnecessary side-arguments about the definition of the fallacy itself. "What? That wasn't an ad hominem! Ad hominem is when I say you're stupid, therefore your argument is wrong." "No, ad hominem means a personal attack!"

    Fourth, it's likely to be seen (often correctly) as showing off, an attempt to telegraph that you know something about the technical aspects of argumentation and therefore are not to be messed with, you master of logic you. 

    Finally, it's almost always completely unnecessary. Remember that a fallacy is a flawed argument, an error in reasoning that leaves the argument vulnerable. You don't need to name the flaw in order to attack it. For example, if you recognize that an ad hominem is a form of non sequitur where the premise ("You're stupid") does not lead to the conclusion ("therefore, you're wrong"), you can see that you do not even need to refute the premise. "I may or may not be stupid, but stupid people can be right and smart people can be wrong. Show that my argument is wrong."

    That is, by the way, why it's definitely useful to have names for the various types of fallacies, so we can discuss and analyze them and learn how to recognize and counter them, and to avoid committing them ourselves. Shop talk about rhetoric benefits greatly from having terms of art like these. But naming them in the middle of an actual argument is rarely a good move. 

    

Friday 4 March 2022

Understanding over Believing

  I've been thinking about this authoritarian epistemology idea for some time now, and I think I have identified a potential remedy. The idea is to stop thinking in terms of knowing (or believing-what-is-true), and focus instead on understanding.

We have long put a great deal of emphasis on the value of knowledge. For most purposes this is a perfectly serviceable value, and the pursuit of knowledge is certainly a noble calling.

But the problem arises when we take the authoritarian mindset into account. As I argued before, the authoritarian tends to see such things in terms of power, and in particular construes the act of telling someone something and being believed as an exercise of power. That's not a completely irrational model; the ability to persuade someone to do something is very much a kind of power. And as Voltaire said, anyone who can make you believe absurdities can make you commit atrocities. So it's not at all unreasonable to be wary of anyone trying to assert that kind of control over you, even if it's something as innocent as informing you what time it is.

The problem is this: when you regard the transmission of information as an attempted exercise of power, it becomes very tempting to resist being informed of anything, because being told something and believing it becomes a kind of surrender of autonomy, a failure of will. You feel like someone may be laughing at you behind your back, as if it's April Fool's Day and you're not in on the joke. And so disbelieving whatever you are told becomes an act of defiance and a demonstration of personal strength. What matters isn't so much whether you're right or wrong in any objective sense, but how bravely you stand against a more powerful opponent, which is why this sort of conspiracy theory tends to target The Government, The Mainstream Media, The Academic Elite, or whoever else might represent the dominant establishment view. That's all well and good, because the establishment really ought to be challenged regularly, but when you adopt your beliefs just to be contrary, you're very likely to be wrong most of the time. Worse, you deprive yourself of the capacity to be right, because the more evidence is compiled in favour of the opposing view, the more your clinging to your wrongness feels like a grand display of heroic resolve. 


    That sense of heroic resolve isn't nothing. It's a legitimate emotional motivation, that need to feel some pride in one's accomplishments or worthiness, so maybe we can find a more constructive way to satisfy it.


    That's why I propose to focus on understanding instead of believing/knowing. Understanding is not a failure of will, but an accomplishment of intellect. If I explain my view to you, and you succeed in making sense of it and how my claims fit into your model of the world, that is your triumph; you own that understanding, and while I may have helped you attain it by making my explanation as clear and accessible to you as possible, I cannot command you to understand me. It is something you can only ever do for yourself.

    Moreover, once you understand a proposition, it's up to you to decide for yourself how likely it is to be true, given all the other things you understand about the world. Deciding that what I say is probably true is no longer a surrender of your autonomy to mine, but your own independent conclusion that you can take ownership of and can revise as you see fit, and thus not a sign of weakness. You can and should feel proud that you have successfully made sense of what someone else has to say, especially if it's something you might have been initially inclined to dismiss as obviously wrong.


    None of this is to say that you shouldn't be aware of the possibility of people lying. Quite the opposite: you should always be alert to the unreliability of anyone and everyone's testimony, whether they be lying, mistaken, confused or otherwise fallibly human. This is itself an important part of the process of understanding. It's just that you shouldn't fall into the laziness trap of dismissing everyone who disagrees with you as either dishonest or stupid, because that's generally just an excuse not to bother trying to actually understand them. They might well be dishonest or stupid, but it's dangerous to underestimate an opponent, or to see them as an opponent when they might not actually be one. 


    Nor should you completely abandon the possibility that the person might be mistaken in some way. You should try to start from the presumption that they're at least as smart as you are, and that if something seems wrong it's probably because you haven't properly understood it yet, but they might well have made some error in reasoning they haven't noticed yet. If you can identify and articulate exactly what this error is, that's a particularly glorious victory, but don't be too tempted to take shortcuts to it; you still have to really understand what they're trying to say to be able to pull this off effectively. 


    Once you shift from knowing to understanding, your real opponent is no longer the person you're arguing with, but the problem you're arguing about. If you can defeat the problem, you'll never lose an argument.