Chapter 5: Common Fallacies in Argumentation
“Forget the politicians. The politicians are put there to give you the idea you have freedom of choice. You don’t. You have no choice. You have owners. They own you. They own everything. They own all the important land; they own and control the corporations that’ve long since bought and paid for the senate, the congress, the state houses, the city halls; they got the judges in their back pocket, and they own all the big media companies, so they control just about all of the news and the information you get to hear. They’ve got you by the balls. They spend billions of dollars every year lobbying to get what they want. Well, we know what they want. They want more for themselves and less for everybody else. But I’ll tell you what they don’t want. They don’t want a population of citizens capable of critical thinking. They don’t want well-informed, well-educated people capable of critical thinking. They’re not interested in that. That doesn’t help them.”
–from George Carlin’s stand-up routine “Owners”
Common Fallacies
A key skill for any critical thinker is bullshit detection. (1) Comedian George Carlin was particularly good at seeing through bullshit, and we should take seriously his warning that some interests in society would prefer to see less rather than more critical thinking. An important component of bullshit detection is the ability to spot common fallacies. Whether you are making arguments or evaluating them in public discourse, you should be familiar with the most common ones. A fallacy is an argument that is faulty, logically invalid, or deceptive. Fallacies can be quite persuasive, but you should not use them. As philosopher Michael Boylan puts it, “When we consciously choose to use fallacy to persuade others, we degrade ourselves. When we negligently allow ourselves to be duped by logical fallacy, we similarly degrade ourselves.” (2) There are many more fallacies than are listed here, but make sure you know the following:
Ad hominem—Although ad hominem literally means “against the man,” it applies to attacks on any person. We commit the ad hominem fallacy when we attack the person who made an argument in an attempt to discredit what they said or wrote, instead of addressing the argument on its merits. If convicted former Enron CEO Jeffrey Skilling makes an argument about how the United States could be more energy efficient by relying on ethanol, I would commit the ad hominem fallacy if I were to say, “Why should we believe anything this ethically challenged guy says?” I haven’t addressed his argument on its merits; I’ve merely tried to discredit it for my audience by casting aspersions on Skilling. Ad hominem attacks can be directed at individuals, religious sects, races, ethnic groups, and even nations. (3)
Reductive fallacy—We commit this fallacy when we try to address complex issues with simple solutions. Someone might say that the key to solving poverty is to “make lazy people work.” Or, to end terrorism, we should “bomb the Middle East back to the Stone Age.” Or, to have a world-class public education system, we just need to “get back to the basics.” Issues like poverty, terrorism, and educational achievement gaps are tremendously complicated, and we are not likely to solve them with the public policy equivalents of bumper sticker slogans.
Post hoc ergo propter hoc—This fallacy literally means “after this, therefore because of this.” We make this error when we assert that A caused B simply because A preceded B. If the Ajax Corporation makes a donation to Senator Jones’ campaign fund, and a week later Senator Jones votes for a tax code bill that benefits the Ajax Corporation, we might be tempted to conclude that the campaign donation caused Jones to vote for the bill. If we were to do so, we would be guilty of committing the post hoc fallacy. The two actions may indeed be related in some way, but there is no way to know for certain without other evidence. Here’s a more basic example: You accidentally drop a large book on the floor, and an earthquake starts immediately afterwards. It’s certainly an odd coincidence, but you did not cause the earthquake by dropping your book.
Non sequitur—This means “it does not follow,” and refers to an argument whose conclusion does not follow from its evidence or premises. For example, let’s say a defendant has been accused of child abuse, and the prosecutor’s case can be boiled down to the following points: child abuse is an awful crime; the victim in this case has suffered horribly; therefore, the defendant is guilty of child abuse. (4) Such a tactic might appeal to the emotions of the jurors, but the prosecutor has not actually presented any evidence of the defendant’s guilt.
Appeal to majority—It is a fallacy to treat the fact that many people believe a claim as evidence of its truth. An idea’s popularity says nothing about its validity; otherwise, we might still believe that the earth is the center of the solar system and the universe. The majority can be wrong. Indeed, it often is.
Straw man—The straw man tactic involves distorting an opponent’s position by stating it in an oversimplified or extreme form, and then refuting the distorted position instead of the real one. Unfortunately, this happens frequently in political debate. For example, critics of President Bush’s No Child Left Behind Act derided it as simply “No Child Left Untested” because it emphasized standardized testing. Of course, the legislation was more complicated than that, but it was easier to boil the initiative down to one of its high-profile parts.
Red herring—Similar to the straw man, the red herring involves bringing up an unrelated issue to divert attention from the argument at hand. For example, if Jack is arguing that gay marriage should be constitutional under the Equal Protection Clause of the Fourteenth Amendment, Jane might try to counter by saying that we must preserve eroding family values. Jane’s counterargument does not address Jack’s constitutional point but diverts the discussion to a separate issue. Apparently, the term red herring comes from the fact that herring turns red when it rots, and a rotten fish has such a strong smell that dragging it across a game trail can throw the dogs off the scent. (5)
Begging the question—When we beg the question, we use evidence that is essentially a restatement of the claim itself. We also refer to begging the question as a circular argument. As philosopher David Kelley reminds us, “The point of reasoning is to throw light on the truth or falsity of a proposition…by relating it to other propositions…that we already have some basis for believing to be true. If our reasoning does nothing more than relate [the proposition] to itself, then it hasn’t gained us anything.”
For example, says Kelley, a circular argument might run like this: “Society has an obligation to support the needy, because people who cannot provide for themselves have a right to the resources of the community.” Everything in that sentence after the “because” is simply a restatement—in different form—of the proposition that society has an obligation to support the needy. (6) Another example: I say God exists. Why? Because the Bible says so. Why should you believe that source? I say that you should believe it because the Bible is the word of God. A convenient, anti-intellectual circle.
Black/white thinking—Black/white thinking goes by many names, the most common of which are false dichotomy and false dilemma. When we commit this fallacy, we shrink the world of possibilities down to two choices—one of which we favor—and insist that everyone must choose between them. Invariably, the other choice is some extreme or disastrous possibility that no one in their right mind would choose. Some examples: You either support our policies or you support the terrorists; America—love it or leave it; if we don’t execute murderers, they’ll be running free and killing more people. Such thinking is fallacious because it eliminates reasonable alternatives, distorts reality, and tries to stampede people into choosing a course without fully examining all possibilities.
Ecological fallacy—This is an easy fallacy to commit. It refers to drawing conclusions about an individual based on aggregate data about a group to which that individual belongs. Aggregate data is information compiled into summaries for public reporting and cannot be used to make definitive statements about any single individual. Here’s a great example: “Imagine two small towns, each with only one hundred people. Town A has ninety-nine people earning $80,000 a year and one super-wealthy person who struck oil on their property, earning $5,000,000 a year. Town B has fifty people earning $100,000 a year and fifty people earning $140,000. The mean income of Town A is $129,200, and the mean income of Town B is $120,000. Although Town A has a higher mean income, in ninety-nine out of one hundred cases, any individual you select randomly from Town B will have a higher income than an individual selected randomly from Town A. The ecological fallacy is thinking that if you select someone at random from the group with the higher mean, that individual is likely to have a higher income.” (7)
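If it helps to see the arithmetic behind Levitin’s two-town example, here is a minimal sketch in Python; the town sizes and incomes are simply the hypothetical figures quoted above, not real data. It computes each town’s mean income and the share of Town A residents who earn less than everyone in Town B.

```python
# A quick check of the arithmetic in the ecological fallacy example above.
# The town sizes and incomes are the hypothetical figures from the quoted passage.

town_a = [80_000] * 99 + [5_000_000]      # 99 residents at $80,000, one at $5,000,000
town_b = [100_000] * 50 + [140_000] * 50  # 50 residents at $100,000, 50 at $140,000

mean_a = sum(town_a) / len(town_a)        # (99 * 80,000 + 5,000,000) / 100 = 129,200
mean_b = sum(town_b) / len(town_b)        # (50 * 100,000 + 50 * 140,000) / 100 = 120,000

# Fraction of Town A residents who earn less than the lowest income in Town B
below_b = sum(income < min(town_b) for income in town_a) / len(town_a)

print(f"Town A mean income: ${mean_a:,.0f}")   # $129,200
print(f"Town B mean income: ${mean_b:,.0f}")   # $120,000
print(f"Share of Town A earning less than anyone in Town B: {below_b:.0%}")  # 99%
```

Run as written, the sketch reports the higher mean for Town A ($129,200 versus $120,000) even though ninety-nine of Town A’s one hundred residents earn less than anyone in Town B, which is exactly the gap between a group-level summary and a claim about an individual.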
References
- Harry G. Frankfurt, On Bullshit. Princeton, NJ: Princeton University Press, 2005.
- Michael Boylan, Critical Inquiry: The Process of Argument. Boulder, CO: Westview Press, 2010. Page 38.
- Loren J. Thompson, Habits of the Mind: Critical Thinking in the Classroom. New York: University Press of America, 1995. Pages 112-114.
- David Kelley, The Art of Reasoning. New York: W. W. Norton, 1988. Page 131.
- Kelley, The Art of Reasoning. Page 132.
- Kelley, The Art of Reasoning. Page 123.
- Daniel J. Levitin, A Field Guide to Lies: Critical Thinking in the Information Age. New York: Dutton, 2016. Pages 18-19.