Logical Fallacy

The document outlines 19 types of logical fallacies, including ad hominem, appeal to force, appeal to pity, appeal to tradition, begging the question, cause and effect, circular argument, equivocation, false dilemma, genetic fallacy, guilt by association, non sequitur, poisoning the well, red herring, special pleading, straw man argument, and category mistake. It also discusses argumentum ad ignorantiam, argumentum ad logicam, argumentum ad misericordiam, and related debate fallacies, providing examples of each fallacy type.

1. Ad Hominem - Attacking the individual instead of the argument.
   A. Example: You are so stupid your argument couldn't possibly be true.
   B. Example: I figured that you couldn't possibly get it right, so I ignored your comment.
2. Appeal to Force - Telling the hearer that something bad will happen to him if he does not accept the argument.
   A. Example: If you don't want to get beaten up, you will agree with what I say.
   B. Example: Convert or die.
3. Appeal to Pity - Urging the hearer to accept the argument based upon an appeal to emotions, sympathy, etc.
   A. Example: You owe me big time because I really stuck my neck out for you.
   B. Example: Oh come on, I've been sick. That's why I missed the deadline.
4. Appeal to the Popular - Urging the hearer to accept a position because a majority of people hold to it.
   A. Example: The majority of people like soda. Therefore, soda is good.
   B. Example: Everyone else is doing it. Why shouldn't you?
5. Appeal to Tradition - Trying to get someone to accept something because it has been done or believed for a long time.
   A. Example: This is the way we've always done it. Therefore, it is the right way.
   B. Example: The Catholic church's tradition demonstrates that this doctrine is true.
6. Begging the Question - Assuming the thing to be true that you are trying to prove. It is circular.
   A. Example: God exists because the Bible says so. The Bible is inspired. Therefore, we know that God exists.
   B. Example: I am a good worker because Frank says so. How can we trust Frank? Simple: I will vouch for him.
7. Cause and Effect - Assuming that the effect is related to a cause because the events occur together.
   A. Example: When the rooster crows, the sun rises. Therefore, the rooster causes the sun to rise.
   B. Example: When the fuel light goes on in my car, I soon run out of gas. Therefore, the fuel light causes my car to run out of gas.
8. Circular Argument - See Begging the Question.
9. Fallacy of Division - Assuming that what is true of the whole is true for the parts.
   A. Example: That car is blue. Therefore, its engine is blue.
   B. Example: Your family is weird. That means that you are weird too.
10. Fallacy of Equivocation - Using the same term in an argument in different places but the word has different meanings.
   A. Example: A bird in the hand is worth two in the bush. Therefore, a bird is worth more than President Bush.
   B. Example: Evolution states that one species can change into another. We see that cars have evolved into different styles. Therefore, since evolution is a fact in cars, it is true in species.
11. False Dilemma - Giving two choices when in actuality there could be more choices possible.
   A. Example: You either did knock the glass over or you did not. Which is it? (Someone else could have knocked the glass over.)
   B. Example: Do you still beat your wife?
12. Genetic Fallacy - Attempting to endorse or disqualify a claim because of the origin or irrelevant history of the claim.
   A. Example: The Nazi regime developed the Volkswagen Beetle. Therefore, you should not buy a VW Beetle because of who started it.
   B. Example: Frank just got out of jail last year; since it was his idea to start the hardware store, I can't trust him.
13. Guilt by Association - Rejecting an argument or claim because the person proposing it likes someone who is disliked by another.
   A. Example: Hitler liked dogs. Therefore, dogs are bad.
   B. Example: Your friend is a thief. Therefore, I cannot trust you.
14. Non Sequitur - Comments or information that do not logically follow from a premise or the conclusion.
   A. Example: We know why it rained today: because I washed my car.
   B. Example: I don't care what you say. We don't need any more bookshelves. As long as the carpet is clean, we are fine.
15. Poisoning the Well - Presenting negative information about a person before he/she speaks so as to discredit the person's argument.
   A. Example: Frank is pompous, arrogant, and thinks he knows everything. So, let's hear what Frank has to say about the subject.
   B. Example: Don't listen to him because he is a loser.
16. Red Herring - Introducing a topic not related to the subject at hand.
   A. Example: I know your car isn't working right. But, if you had gone to the store one day earlier, you'd not be having problems.
   B. Example: I know I forgot to deposit the check into the bank yesterday. But, nothing I do pleases you.
17. Special Pleading (double standard) - Applying a standard to another that is different from a standard applied to oneself.
   A. Example: You can't possibly understand menopause because you are a man.
   B. Example: Those rules don't apply to me since I am older than you.
18. Straw Man Argument - Producing an argument about a weaker representation of the truth and attacking it.
   A. Example: The government doesn't take care of the poor because it doesn't have a tax specifically to support the poor.
   B. Example: We know that evolution is false because we did not evolve from monkeys.
19. Category Mistake - Attributing a property to something that could not possibly have that property; attributing facts of one kind to another kind; attributing to one category that which can only be properly attributed to another.
   A. Example: Blue sleeps faster than Wednesday.
   B. Example: Saying logic is transcendental is like saying cars would exist if matter didn't.

Whether or not an argumentum ad ignorantiam is really fallacious depends crucially upon the burden of proof. In an American courtroom, where the burden of proof rests with the prosecution, it would be fallacious for the prosecution to argue, "The defendant has no alibi, therefore he must have committed the crime." But it would be perfectly valid for the defense to argue, "The prosecution has not proven the defendant committed the crime, therefore you should declare him not guilty." Both statements have the form of an argumentum ad ignorantiam; the difference is the burden of proof. In debate, the proposing team in a debate round is usually (but not always) assumed to have the burden of proof, which means that if the team fails to prove the proposition to the satisfaction of the judge, the opposition wins. In a sense, the opposition team's case is assumed true until proven false. But the burden of proof can sometimes be shifted; for example, in some forms of debate, the proposing team can shift the burden of proof to the opposing team by presenting a prima facie case that would, in the absence of refutation, be sufficient to affirm the proposition. Still, the higher burden generally rests with the proposing team, which means that only the opposition is in a position to make an accusation of argumentum ad ignorantiam with respect to proving the proposition.

Argumentum ad logicam (argument to logic). This is the fallacy of assuming that something is false simply because a proof or argument that someone has offered for it is invalid; this reasoning is fallacious because there may be another proof or argument that successfully supports the proposition. This fallacy often appears in the context of a straw man argument. This is another case in which the burden of proof determines whether it is actually a fallacy or not. If a proposing team fails to provide sufficient support for its case, the burden of proof dictates they should lose the debate, even if there exist other arguments (not presented by the proposing team) that could have supported the case successfully. Moreover, it is common practice in debate for judges to give no weight to a point supported by an argument that has been proven invalid by the other team, even if there might be a valid argument the team failed to make that would have supported the same point; this is because the implicit burden of proof rests with the team that brought up the argument. For further commentary on burdens of proof, see argumentum ad ignorantiam, above.

Argumentum ad misericordiam (argument or appeal to pity). The English translation pretty much says it all. Example: "Think of all the poor, starving Ethiopian children! How could we be so cruel as not to help them?" The problem with such an argument is that no amount of special pleading can make the impossible possible, the false true, the expensive costless, etc. It is, of course, perfectly legitimate to point out the severity of a problem as part of the justification for adopting a proposed solution. The fallacy comes in when other aspects of the proposed solution (such as whether it is possible, how much it costs, who else might be harmed by adopting the policy) are ignored or responded to only with more impassioned pleas. You should not call your opposition down for committing this fallacy unless they rely on appeals to pity to the exclusion of the other necessary arguments. It is perfectly acceptable to use appeal to pity in order to argue that the benefits of the proposed policy are greater than they might at first appear (and hence capable of justifying larger costs).

Argumentum ad nauseam (argument to the point of disgust; i.e., by repetition). This is the fallacy of trying to prove something by saying it again and again. But no matter how many times you repeat something, it will not become any more or less true than it was in the first place. Of course, it is not a fallacy to state the truth again and again; what is fallacious is to expect the repetition alone to substitute for real arguments. Nonetheless, this is a very popular fallacy in debate, and with good reason: the more times you say something, the more likely it is that the judge will remember it. The first thing they'll teach you in any public speaking course is that you should "Tell 'em what you're gonna tell 'em, then tell 'em, and then tell 'em what you told 'em." Unfortunately, some debaters think that's all there is to it, with no substantiation necessary! The appropriate time to mention argumentum ad nauseam in a debate round is when the other team has made some assertion, failed to justify it, and then stated it again and again. The Latin wording is particularly nice here, since it is evocative of what the opposition's assertions make you want to do: retch. "Sir, our opponents tell us drugs are wrong, drugs are wrong, drugs are wrong, again and again and again. But this argumentum ad nauseam can't and won't win this debate for them, because they've given us no justification for their bald assertions!"

Argumentum ad numerum (argument or appeal to numbers). This fallacy is the attempt to prove something by showing how many people think that it's true. But no matter how many people believe something, that doesn't necessarily make it true or right. Example: "At least 70% of all Americans support restrictions on access to abortions." Well, maybe 70% of Americans are wrong! This fallacy is very similar to argumentum ad populum, the appeal to the people or to popularity. When a distinction is made between the two, ad populum is construed narrowly to designate an appeal to the opinions of people in the immediate vicinity, perhaps in hope of getting others (such as judges) to jump on the bandwagon, whereas ad numerum is used to designate appeals based purely on the number of people who hold a particular belief. The distinction is a fine one, and in general the terms can be used interchangeably in debate rounds. (I've found that ad populum has better rhetorical effect.)

Argumentum ad populum (argument or appeal to the public). This is the fallacy of trying to prove something by showing that the public agrees with you. For an example, see above. This fallacy is nearly identical to argumentum ad numerum, which you should see for more details.

Argumentum ad verecundiam (argument or appeal to authority). This fallacy occurs when someone tries to demonstrate the truth of a proposition by citing some person who agrees, even though that person may have no expertise in the given area. For instance, some people like to quote Einstein's opinions about politics (he tended to have fairly left-wing views), as though Einstein were a political philosopher rather than a physicist. Of course, it is not a fallacy at all to rely on authorities whose expertise relates to the question at hand, especially with regard to questions of fact that could not easily be answered by a layman -- for instance, it makes perfect sense to quote Stephen Hawking on the subject of black holes. At least in some forms of debate, quoting various sources to support one's position is not just acceptable but mandatory. In general, there is nothing wrong with doing so. Even if the person quoted has no particular expertise in the area, he may have had a particularly eloquent way of saying something that makes for a more persuasive speech. In general, debaters should be called down for committing argumentum ad verecundiam only when (a) they rely on an unqualified source for information about facts without other (qualified) sources of verification, or (b) they imply that some policy must be right simply because so-and-so thought so.

Circulus in demonstrando (circular argument). Circular argumentation occurs when someone uses what they are trying to prove as part of the proof of that thing. Here is one of my favorite examples (in pared down form): "Marijuana is illegal in every state in the nation. And we all know that you shouldn't violate the law. Since smoking pot is illegal, you shouldn't smoke pot. And since you shouldn't smoke pot, it is the duty of the government to stop people from smoking it, which is why marijuana is illegal!" Circular arguments appear a lot in debate, but they are not always so easy to spot as the example above. They are always illegitimate, though, and pointing them out in a debate round looks really good if you can do it. The best strategy for pointing out a circular argument is to make sure you can state clearly the proposition being proven, and then pinpoint where that proposition appears in the proof. A good summing up statement is, "In other words, they are trying to tell us that X is true because X is true! But they have yet to tell us why it's true."

Complex question. A complex question is a question that implicitly assumes something to be true by its construction, such as "Have you stopped beating your wife?" A question like this is fallacious only if the thing presumed true (in this case, that you beat your wife) has not been established. Complex questions are a well-established and time-honored practice in debate, although they are rarely so bald-faced as the example just given. Complex questions usually appear in cross-examination or points of information when the questioner wants the questionee to inadvertently admit something that she might not admit if asked directly. For instance, one might say, "Inasmuch as the majority of black Americans live in poverty, do you really think that self-help within the black community is sufficient to address their problems?" Of course, the introductory clause about the majority of black Americans living in poverty may not be true (in fact, it is false), but an unwary debater might not think quickly enough to notice that the stowaway statement is questionable. This is a sneaky tactic, but debate is sometimes a sneaky business. You wouldn't want to put a question like that in your master's thesis, but it might work in a debate. But be careful -- if you try to pull a fast one on someone who is alert enough to catch you, you'll look stupid. "The assumption behind your question is simply false. The majority of blacks do not live in poverty. Get your facts straight before you interrupt me again!"

Cum hoc ergo propter hoc (with this, therefore because of this). This is the familiar fallacy of mistaking correlation for causation -- i.e., thinking that because two things occur simultaneously, one must be a cause of the other. A popular example of this fallacy is the argument that "President Clinton has great economic policies; just look at how well the economy is doing while he's in office!" The problem here is that two things may happen at the same time merely by coincidence (e.g., the President may have a negligible effect on the economy, and the real driving force is technological growth), or the causative link between one thing and another may be lagged in time (e.g., the current economy's health is determined by the actions of previous presidents), or the two things may be unconnected to each other but related to a common cause (e.g., downsizing upset a lot of voters, causing them to elect a new president just before the economy began to benefit from the downsizing). It is always fallacious to suppose that there is a causative link between two things simply because they coexist. But a correlation is usually considered acceptable supporting evidence for theories that argue for a causative link between two things. For instance, some economic theories suggest that substantially reducing the federal budget deficit should cause the economy to do better (loosely speaking), so the coincidence of deficit reductions under Clinton and the economy's relative health might be taken as evidence in favor of those economic theories. In debate rounds, what this means is that it is acceptable to demonstrate a correlation between two phenomena and to say one caused the other if you can also come up with convincing reasons why the correlation is no accident.

Cum hoc ergo propter hoc is very similar to post hoc ergo propter hoc, below. The two terms can be used almost interchangeably, post hoc (as it is affectionately called) being the preferred term.

Dicto simpliciter (spoken simply, i.e., sweeping generalization). This is the fallacy of making a sweeping statement and expecting it to be true of every specific case -- in other words, stereotyping. Example: "Women are on average not as strong as men and less able to carry a gun. Therefore women can't pull their weight in a military unit." The problem is that the sweeping statement may be true (on average, women are indeed weaker than men), but it is not necessarily true for every member of the group in question (there are some women who are much stronger than the average). As the example indicates, dicto simpliciter is fairly common in debate rounds. Most of the time, it is not necessary to call an opposing debater down for making this fallacy -- it is enough to point out why the sweeping generalization they have made fails to prove their point. Since everybody knows what a sweeping generalization is, using the Latin in this case will usually sound condescending. It is also important to note that some generalizations are perfectly valid and apply directly to all individual cases, and therefore do not commit the fallacy of dicto simpliciter (for example, "All human males have a Y chromosome" is, to my knowledge, absolutely correct).

Nature, appeal to. This is the fallacy of assuming that whatever is "natural" or consistent with "nature" (somehow defined) is good, or that whatever conflicts with nature is bad. For example, "Sodomy is unnatural; anal sex is not the evolutionary function of a penis or an anus. Therefore sodomy is wrong." But aside from the difficulty of defining what "natural" even means, there is no particular reason to suppose that unnatural and wrong are the same thing. After all, wearing clothes, tilling the soil, and using fire might be considered unnatural since no other animals do so, but humans do these things all the time and to great benefit. The appeal to nature appears occasionally in debate, often in the form of naive environmentalist arguments for preserving pristine wilderness or resources. The argument is very weak and should always be shot down. It can, however, be made stronger by showing why at least in specific cases, there may be a (possibly unspecifiable) benefit to preserving nature as it is. A typical ecological argument along these lines is that human beings are part of a complex biological system that is highly sensitive to shocks, and therefore it is dangerous for humans to engage in activities that might damage the system in ways we cannot predict. Note, however, that this approach no longer appeals to nature itself, but to the value of human survival. For further comment on this subject, see the naturalistic fallacy.

Naturalistic fallacy. This is the fallacy of trying to derive conclusions about what is right or good (that is, about values) from statements of fact alone. This is invalid because no matter how many statements of fact you assemble, any logical inference from them will be another statement of fact, not a statement of value. If you wish to reach conclusions about values, then you must include amongst your assumptions (or axioms, or premises) a statement of value. Once you have an axiomatic statement of value, then you may use it in conjunction with statements of fact to reach value-laden conclusions. For example, someone might argue that the premise, "This medicine will prevent you from dying" immediately leads to the conclusion, "You should take this medicine." But this reasoning is invalid, because the former statement is a statement of fact, while the latter is a statement of value. To reach the conclusion that you ought to take the medicine, you would need at least one more premise: "You ought to try to preserve your life whenever possible." The naturalistic fallacy appears in many forms. Two examples are argumentum ad antiquitatem (saying something's right because it's always been done that way) and the appeal to nature (saying something's right because it's natural). In both of these fallacies, the speaker is trying to reach a conclusion about what we ought to do or ought to value based solely on what is the case. David Hume called this trying to bridge the "is-ought gap," which is a nice phrase to use in debate rounds where your opponent is committing the naturalistic fallacy. One unsettling implication of taking the naturalistic fallacy seriously is that, in order to reach any conclusions of value, one must be willing to posit some initial statement or statements of value that will be treated as axioms, and which cannot themselves be justified on purely logical grounds. Fortunately, debate does not restrict itself to purely logical grounds of argumentation. For example, suppose your opponent has stated axiomatically that "whatever is natural is good." Inasmuch as this statement is an axiom rather than the conclusion of a logical proof, there can be no purely logical argument against it. But some nonetheless appropriate responses to such an absolute statement of value include: (a) questioning whether anyone -- you, your judge, or even your opponent himself -- really believes that "whatever is natural is good"; (b) stating a competing axiomatic value statement, like "whatever enhances human life is good," and forcing the judge to choose between them; and (c) pointing out logical implications of the statement "whatever is natural is good" that conflict with our most basic intuitions about right and wrong.

Non Sequitur ("It does not follow"). This is the simple fallacy of stating, as a conclusion, something that does not strictly follow from the premises. For example, "Racism is wrong. Therefore, we need affirmative action." Obviously, there is at least one missing step in this argument, because the wrongness of racism does not imply a need for affirmative action without some additional support (such as, "Racism is common," "Affirmative action would reduce racism," "There are no superior alternatives to affirmative action," etc.). Not surprisingly, debate rounds are rife with non sequitur. But that is partly just a result of having to work within the time constraints of a debate round, and partly a result of using good strategy. A debate team arguing for affirmative action would be foolish to say in their first speech, "We also believe that affirmative action does not lead to a racist backlash," because doing so might give the other side a hint about a good argument to make. A better strategy (usually) is to wait for the other team to bring up an argument, and then refute it; that way, you don't end up wasting your time by refuting arguments that the opposition has never made in the first place. (This strategy is not always preferable, though, because some counterarguments are so obvious and important that it makes sense to address them early and nip them in the bud.) For these reasons, it is generally bad form to scream "non sequitur" just because your opposition has failed to anticipate every counterargument you might make. The best time to point out a non sequitur is when your opposition is trying to construct a chain of causation (A leads to B leads to C, etc.) without justifying each step in the chain. For each step in the chain they fail to justify, point out the non sequitur, so that it is obvious by the end that the alleged chain of causation is tenuous and implausible.

Petitio principii (begging the question). This is the fallacy of assuming, when trying to prove something, what it is that you are trying to prove. For all practical purposes, this fallacy is indistinguishable from circular argumentation. The main thing to remember about this fallacy is that the term "begging the question" has a very specific meaning. It is common to hear debaters saying things like, "They say pornography should be legal because it is a form of free expression. But this begs the question of what free expression means." This is a misuse of terminology. Something may inspire or motivate us to ask a particular question without begging the question. A question has been begged only if the question has been asked before in the same discussion, and then a conclusion is reached on a related matter without the question having been answered. If somebody said, "The fact that we believe pornography should be legal means that it is a valid form of free expression. And since it's free expression, it shouldn't be banned," that would be begging the question.

Post hoc ergo propter hoc (after this, therefore because of this). This is the fallacy of assuming that A caused B simply because A happened prior to B. A favorite example: "Most rapists read pornography when they were teenagers; obviously, pornography causes violence toward women." The conclusion is invalid, because there can be a correlation between two phenomena without one causing the other. Often, this is because both phenomena may be linked to the same cause. In the example given, it is possible that some psychological factor -- say, a frustrated sex drive -- might cause both a tendency toward sexual violence and a desire for pornographic material, in which case the pornography would not be the true cause of the violence. Post hoc ergo propter hoc is nearly identical to cum hoc ergo propter hoc, which you should see for further details.

Red herring. This means exactly what you think it means: introducing irrelevant facts or arguments to distract from the question at hand. For example, "The opposition claims that welfare dependency leads to higher crime rates -- but how are poor people supposed to keep a roof over their heads without our help?" It is perfectly valid to ask this question as part of the broader debate, but to pose it as a response to the argument about welfare leading to crime is fallacious. (There is also an element of ad misericordiam in this example.) It is not fallacious, however, to argue that benefits of one kind may justify incurring costs of another kind. In the example given, concern about providing shelter for the poor would not refute concerns about crime, but one could plausibly argue that a somewhat higher level of crime is a justifiable price given the need to alleviate poverty. This is a debatable point of view, but it is no longer a fallacious one. The term red herring is sometimes used loosely to refer to any kind of diversionary tactic, such as presenting relatively unimportant arguments that will use up the other debaters' speaking time and distract them from more important issues. This kind of a red herring is a wonderful strategic maneuver with which every debater should be familiar.

Slippery slope. A slippery slope argument is not always a fallacy. A slippery slope fallacy is an argument that says adopting one policy or taking one action will lead to a series of other policies or actions also being taken, without showing a causal connection between the advocated policy and the consequent policies. A popular example of the slippery slope fallacy is, "If we legalize marijuana, the next thing you know we'll legalize heroin, LSD, and crack cocaine." This slippery slope is a form of non sequitur, because no reason has been provided for why legalization of one thing leads to legalization of another. Tobacco and alcohol are currently legal, and yet other drugs have somehow remained illegal.

There are a variety of ways to turn a slippery slope fallacy into a valid (or at least plausible) argument. All you need to do is provide some reason why the adoption of one policy will lead to the adoption of another. For example, you could argue that legalizing marijuana would cause more people to consider the use of mind-altering drugs acceptable, and those people will support more permissive drug policies across the board. An alternative to the slippery slope argument is simply to point out that the principles espoused by your opposition imply the acceptability of certain other policies, so if we don't like those other policies, we should question whether we really buy those principles. For instance, if the proposing team argued for legalizing marijuana by saying, "individuals should be able to do whatever they want with their own bodies," the opposition could point out that that principle would also justify legalizing a variety of other drugs -- so if we don't support legalizing other drugs, then maybe we don't really believe in that principle.

Straw man. This is the fallacy of refuting a caricatured or extreme version of somebody's argument, rather than the actual argument they've made. Often this fallacy involves putting words into somebody's mouth by saying they've made arguments they haven't actually made, in which case the straw man argument is a veiled version of argumentum ad logicam. One example of a straw man argument would be to say, "Mr. Jones thinks that capitalism is good because everybody earns whatever wealth they have, but this is clearly false because many people just inherit their fortunes," when in fact Mr. Jones had not made the "earnings" argument and had instead argued, say, that capitalism gives most people an incentive to work and save. The fact that some arguments made for a policy are wrong does not imply that the policy itself is wrong. In debate, strategic use of a straw man can be very effective. A carefully constructed straw man can sometimes entice an unsuspecting opponent into defending a silly argument that he would not have tried to defend otherwise. But this strategy only works if the straw man is not too different from the arguments your opponent has actually made, because a really outrageous straw man will be recognized as just that. The best straw man is not, in fact, a fallacy at all, but simply a logical extension or amplification of an argument your opponent has made.

Tu quoque ("you too"). This is the fallacy of defending an error in one's reasoning by pointing out that one's opponent has made the same error. An error is still an error, regardless of how many people make it. For example, "They accuse us of making unjustified assertions. But they asserted a lot of things, too!" Although clearly fallacious, tu quoque arguments play an important role in debate because they may help establish who has done a better job of debating (setting aside the issue of whether the proposition is true or not). If both teams have engaged in ad hominem attacks, or both teams have made a few appeals to pity, then it would hardly be fair to penalize one team for it but not the other. In addition, it is not fallacious at all to point out that certain advantages or disadvantages may apply equally to both positions presented in a debate, and therefore they cannot provide a reason for favoring one position over the other (such disadvantages are referred to as "nonunique"). In general, using tu quoque statements is a good way to assure that judges make decisions based only on factors that distinguish between the two sides.
_____________________________________________________________________________________

FALLACIES OF RELEVANCE: These fallacies appeal to evidence or examples that are not relevant to the argument at hand.

Appeal to Force (Argumentum Ad Baculum or the "Might-Makes-Right" Fallacy): This argument uses force, the threat of force, or some other unpleasant backlash to make the audience accept a conclusion. It commonly appears as a last resort when evidence or rational arguments fail to convince a reader. If the debate is about whether or not 2+2=4, an opponent's argument that he will smash your nose in if you don't agree with his claim doesn't change the truth of an issue. Logically, this consideration has nothing to do with the points under consideration. The fallacy is not limited to threats of violence, however. The fallacy includes threats of any unpleasant backlash--financial, professional, and so on. Example: "Superintendent, you should cut the school budget by $16,000. I need not remind you that past school boards have fired superintendents who cannot keep down costs." While intimidation may force the superintendent to conform, it does not convince him that the choice to cut the budget was the most beneficial for the school or community. Lobbyists use this method when they remind legislators that they represent so many thousand votes in the legislators' constituencies and threaten to throw the politician out of office if he doesn't vote the way they want. Teachers use this method if they state that students should hold the same political or philosophical position as the teachers or risk failing the class. Note that it isn't a logical fallacy, however, to assert that students must fulfill certain requirements in the course or risk failing the class!

Genetic Fallacy: The genetic fallacy is the claim that an idea, product, or person must be untrustworthy because of its racial, geographic, or ethnic origin. "That car can't possibly be any good! It was made in Japan!" Or, "Why should I listen to her argument? She comes from California, and we all know those people are flakes." Or, "Ha! I'm not reading that book. It was published in Tennessee, and we know all Tennessee folk are hillbillies and rednecks!" This type of fallacy is closely related to the fallacy of argumentum ad hominem or personal attack, appearing immediately below.

Personal Attack (Argumentum Ad Hominem, literally, "argument toward the man." Also called "Poisoning the Well"): Attacking or praising the people who make an argument, rather than discussing the argument itself. This practice is fallacious because the personal character of an individual is logically irrelevant to the truth or falseness of the argument itself. The statement "2+2=4" is true regardless of whether it is stated by criminals, congressmen, or pastors. There are two subcategories:

(1) Abusive: To argue that proposals, assertions, or arguments must be false or dangerous because they originate with atheists, Christians, Communists, capitalists, the John Birch Society, Catholics, anti-Catholics, racists, anti-racists, feminists, misogynists (or any other group) is fallacious. This persuasion comes from irrational psychological transference rather than from an appeal to evidence or logic concerning the issue at hand. This is similar to the genetic fallacy, and only an anti-intellectual would argue otherwise.

(2) Circumstantial: To argue that an opponent should accept or reject an argument because of circumstances in his or her life. If one's adversary is a clergyman, suggesting that he should accept a particular argument because not to do so would be incompatible with the scriptures is such a fallacy. To argue that, because the reader is a Republican or Democrat, she must vote for a specific measure is likewise a circumstantial fallacy. The opponent's special circumstances have no control over the truth or untruth of a specific contention. The speaker or writer must find additional evidence beyond that to make a strong case. This is also similar to the genetic fallacy in some ways. If you are a college student who wants to learn rational thought, you simply must avoid circumstantial fallacies.

Argumentum ad Populum (Literally "Argument to the People"): Using an appeal to popular assent, often by arousing the feelings and enthusiasm of the multitude rather than building an argument. It is a favorite device with the propagandist, the demagogue, and the advertiser. An example of this type of argument is Shakespeare's version of Mark Antony's funeral oration for Julius Caesar. There are three basic approaches:

(1) Bandwagon Approach: "Everybody is doing it." This argumentum ad populum asserts that, since the majority of people believes an argument or chooses a particular course of action, the argument must be true, or the course of action must be followed, or the decision must be the best choice. For instance, "85% of consumers purchase IBM computers rather than Macintosh; all those people can't be wrong. IBM must make the best computers." Popular acceptance of any argument does not prove it to be valid, nor does popular use of any product necessarily prove it is the best one. After all, 85% of people may once have thought planet earth was flat, but that majority's belief didn't mean the earth really was flat when they believed it! Keep this in mind, and remember that everybody should avoid this type of logical fallacy.

(2) Patriotic Approach: "Draping oneself in the flag." This argument asserts that a certain stance is true or correct because it is somehow patriotic, and that those who disagree are unpatriotic. It overlaps with pathos and argumentum ad hominem to a certain extent. The best way to spot it is to look for emotionally charged terms like Americanism, rugged individualism, motherhood, patriotism, godless communism, etc. A true American would never use this approach. And a truly free man will exercise his American right to drink beer, since beer belongs in this great country of ours. This approach is unworthy of a good citizen.

(3) Snob Approach: This type of argumentum ad populum doesn't assert "everybody is doing it," but rather that "all the best people are doing it." For instance, "Any true intellectual would recognize the necessity for studying logical fallacies." The implication is that anyone who fails to recognize the truth of the author's assertion is not an intellectual, and thus the reader had best recognize that necessity.

In all three of these examples, the rhetorician does not supply evidence that an argument is true; he merely makes assertions about people who agree or disagree with the argument. For Christian students in religious schools like Carson-Newman, we might add a fourth category, "Covering Oneself in the Cross." This argument asserts that a certain political or denominational stance is true or correct because it is somehow "Christian," and that anyone who disagrees is behaving in an "un-Christian" or "godless" manner. (It is similar to the patriotic approach except it substitutes a gloss of piety instead of patriotism.) Examples include the various "Christian Voting Guides" that appear near election time, many of them published by non-Church related organizations with hidden financial/political agendas, or the stereotypical crooked used-car salesman who keeps a pair of bibles on his dashboard in order to win the trust of those he would fleece. Keep in mind Moliere's question in Tartuffe: "Is not a face quite different than a mask?" Is not the appearance of Christianity quite different than actual Christianity? Christians should beware of such manipulation since they are especially vulnerable to it.

Appeal to Tradition (Argumentum ad Traditionem): This line of thought asserts that a premise must be true because people have always believed it or done it. Alternatively, it may conclude that the premise has always worked in the past and will thus always work in the future: "Jefferson City has kept its urban growth boundary at six miles for the past thirty years. That has been good enough for thirty years, so why should we change it now? If it ain't broke, don't fix it." Such an argument is appealing in that it seems to be common sense, but it ignores important questions. Might an alternative policy work even better than the old one? Are there drawbacks to that longstanding policy? Are circumstances changing from the way they were thirty years ago?

Appeal to Improper Authority (Argumentum Ad Verecundiam, literally "argument from that which is improper"): An appeal to an improper authority, such as a famous person or a source that may not be reliable. This fallacy attempts to capitalize upon feelings of respect or familiarity with a famous individual. It is not fallacious to refer to an admitted authority if the individual's expertise is within a strict field of knowledge. On the other hand, to cite Einstein to settle an argument about education or economics is fallacious. To cite Darwin, an authority on biology, on religious matters is fallacious. To cite Cardinal Spellman on legal problems is fallacious. The worst offenders usually involve movie stars and psychic hotlines. A subcategory is the Appeal to Biased Authority. In this sort of appeal, the authority is one who actually is knowledgeable on the matter, but one who may have professional or personal motivations that render his professional judgment suspect: for instance, "To determine whether fraternities are beneficial to this campus, we interviewed all the frat presidents." Or again, "To find out whether or not sludge-mining really is endangering the Tuskogee salamander's breeding grounds, we interviewed the supervisors of the sludge-mines, who declared there is no problem." Indeed, it is important to get "both viewpoints" on an argument, but basing a substantial part of your argument on a source that has personal, professional, or financial interests at stake may lead to biased arguments.

Appeal to Emotion (Argumentum Ad Misericordiam, literally, "argument from pity"): An emotional appeal concerning what should be a logical issue during a debate. While pathos generally works to reinforce a reader's sense of duty or outrage at some abuse, if a writer tries to use emotion merely for the sake of getting the reader to accept what should be a logical conclusion, the argument is a fallacy. For example, in the 1880s, prosecutors in a Virginia court presented overwhelming proof that a boy was guilty of murdering his parents with an ax. The defense presented a "not-guilty" plea on the grounds that the boy was now an orphan, with no one to look after his interests if the court was not lenient. This appeal to emotion obviously seems misplaced, and the argument is irrelevant to the question of whether or not he did the crime.

Argument from Adverse Consequences: Asserting that an argument must be false because the implications of it being true would create negative results. For instance, "The medical tests show that Grandma has advanced cancer. However, that can't be true because then she would die! I refuse to believe it!" The argument is illogical because truth and falsity are not contingent based upon how much we like or dislike the consequences of that truth. Grandma, indeed, might have cancer, in spite of how negative that fact may be or how it may affect us.

Argument from Personal Incredulity: Asserting that an opponent's argument must be false because you personally don't understand it or can't follow its technicalities. For instance, one person might assert, "I don't understand that engineer's argument about how airplanes can fly. Therefore, I cannot believe that airplanes are able to fly." Au contraire, that speaker's own mental limitations do not limit the physical world, so airplanes may very well be able to fly in spite of his or her inability to understand how they work. One person's comprehension is not relevant to the truth of a matter.

COMPONENT FALLACIES: Component fallacies are errors in inductive and deductive reasoning or in syllogistic terms that fail to overlap.

Begging the Question (also called Petitio Principii; the term is sometimes used interchangeably with Circular Reasoning): If writers assume as evidence for their argument the very conclusion they are attempting to prove, they engage in the fallacy of begging the question. The most common form of this fallacy is when the first claim is initially loaded with the very conclusion one has yet to prove. For instance, suppose a particular student group states, "Useless courses like English 101 should be dropped from the college's curriculum." The members of the student group then immediately move on in the argument, illustrating that spending money on a useless course is something nobody wants. Yes, we all agree that spending money on useless courses is a bad thing. However, those students never did prove that English 101 was itself a useless course--they merely "begged the question" and moved on to the next "safe" part of the argument, skipping over the part that's the real controversy, the heart of the matter, the most important component. Begging the question is often hidden in the form of a complex question (see below).

Circular Reasoning is closely related to begging the question. Often writers using this fallacy take one idea and phrase it in two statements. The assertions differ sufficiently to obscure the fact that the same proposition occurs as both a premise and a conclusion. The speaker or author then tries to "prove" his or her assertion by merely repeating it in different words. Richard Whately wrote in Elements of Logic (London 1826): "To allow every man unbounded freedom of speech must always be, on the whole, advantageous to the state; for it is highly conducive to the interest of the community that each individual should enjoy a liberty perfectly unlimited of expressing his sentiments." Obviously the premise is not logically irrelevant to the conclusion, for if the premise is true the conclusion must also be true. It is, however, logically irrelevant in proving the conclusion. In the example, the author is repeating the same point in different words, and then attempting to "prove" the first assertion with the second one. A more complex but equally fallacious type of circular reasoning is to create a circular chain of reasoning like this one: "God exists." "How do you know that God exists?" "The Bible says so." "Why should I believe the Bible?" "Because it's the inspired word of God." If we draw this out as a chart, it looks like this:

    God exists --> because the Bible says so --> because the Bible is the inspired word of God --> because God exists --> (and around the circle again)

The so-called "final proof" relies on unproven evidence set forth initially as the subject of debate. Basically, the argument goes in an endless circle, with each step of the argument relying on a previous one, which in turn relies on the first argument yet to be proven. Surely God deserves a more intelligible argument than the circular reasoning proposed in this example!

Hasty Generalization (Dicto Simpliciter, also called Jumping to Conclusions, "Converse Accident"): Mistaken use of inductive reasoning when there are too few samples to prove a point. Example: "Susan failed Biology 101. Herman failed Biology 101. Egbert failed Biology 101. I therefore conclude that most students who take Biology 101 will fail it." In understanding and characterizing general situations, a logician cannot normally examine every single example. However, the examples used in inductive reasoning should be typical of the problem or situation at hand. Maybe Susan, Herman, and Egbert are exceptionally poor students. Maybe they were sick and missed too many lectures that term to pass. If a logician wants to make the case that most students will fail Biology 101, she should (a) get a very large sample--at least one larger than three--or (b) if that isn't possible, she will need to go out of her way to prove to the reader that her three samples are somehow representative of the norm. If a logician considers only exceptional or dramatic cases and generalizes a rule that fits these alone, the author commits the fallacy of hasty generalization.

One common type of hasty generalization is the Fallacy of Accident. This error occurs when one applies a general rule to a particular case when accidental circumstances render the general rule inapplicable. For example, in Plato's Republic, Plato finds an exception to the general rule that one should return what one has borrowed: "Suppose that a friend when in his right mind has deposited arms with me and asks for them when he is not in his right mind. Ought I to give the weapons back to him? No one would say that I ought or that I should be right in doing so. . . ." What is true in general may not be true universally and without qualification. So remember, generalizations are bad. All of them. Every single last one. Except, of course, for those that are not.

Another common example of this fallacy is the misleading statistic. Suppose an individual argues that women must be incompetent drivers, and he points out that last Tuesday at the Department of Motor Vehicles, 50% of the women who took the driving test failed. That would seem to be compelling evidence from the way the statistic is set forth. However, if only two women took the test that day, the results would be far less clear-cut. Incidentally, the cartoon Dilbert makes much of an incompetent manager who cannot perceive misleading statistics. He does a statistical study of when employees call in sick and cannot come to work during the five-day work week. He becomes furious to learn that 40% of office "sick-days" occur on Mondays (20%) and Fridays (20%)--just in time to create a three-day weekend. Suspecting fraud, he decides to punish his workers. The irony, of course, is that these two days compose 40% of a five-day work week, so the numbers are completely average. Similar nonsense emerges when parents or teachers complain that "50% of students perform at or below the national average on standardized tests in mathematics and verbal aptitude." Of course they do! The very nature of an average implies that!

False Cause: This fallacy establishes a cause/effect relationship that does not exist. There are various Latin names for various analyses of the fallacy. The two most common include these types: (1) Non Causa Pro Causa (Literally, "Not the cause for a cause"): A general, catch-all category for mistaking a false cause of an event for the real cause. (2) Post Hoc, Ergo Propter Hoc (Literally, "After this, therefore because of this"): This type of false cause occurs when the writer mistakenly assumes that, because the first event preceded the second event, it must mean the first event caused the later one. Sometimes it does, but sometimes it doesn't. It is the honest writer's job to establish clearly that connection rather than merely assert it exists. Example: "A black cat crossed my path at noon. An hour later, my mother had a heart attack. Because the first event occurred earlier, it must have caused the bad luck later." This is how superstitions begin.

The most common examples are arguments that viewing a particular movie or show, or listening to a particular type of music, caused the listener to perform an antisocial act--to snort coke, shoot classmates, or take up a life of crime. These may be potential suspects for the cause, but the mere fact that an individual did these acts and subsequently behaved in a certain way does not yet conclusively rule out other causes. Perhaps the listener had an abusive home-life or school-life, suffered from a chemical imbalance leading to depression and paranoia, or made a bad choice in his companions. Other potential causes must be examined before asserting that only one event or circumstance alone earlier in time caused an event or behavior later. For more information, see correlation and causation.

Irrelevant Conclusion (Ignoratio Elenchi): This fallacy occurs when a rhetorician adapts an argument purporting to establish a particular conclusion and directs it to prove a different conclusion. For example, when a particular proposal for housing legislation is under consideration, a legislator may argue that decent housing for all people is desirable. Everyone, presumably, will agree. However, the question at hand concerns a particular measure. The question really isn't, "Is it good to have decent housing?" The question really is, "Will this particular measure actually provide it or is there a better alternative?" This type of fallacy is a common one in student papers when students use a shared assumption--such as the fact that decent housing is a desirable thing to have--and then spend the bulk of their essays focused on that fact rather than the real question at issue. It's similar to begging the question, above.

One of the most common forms of Ignoratio Elenchi is the "Red Herring." A red herring is a deliberate attempt to change the subject or divert the argument from the real question at issue to some side-point; for instance, "Senator Jones should not be held accountable for cheating on his income tax. After all, there are other senators who have done far worse things." Another example: "I should not pay a fine for reckless driving. There are many other people on the street who are dangerous criminals and rapists, and the police should be chasing them, not harassing a decent tax-paying citizen like me." Certainly, worse criminals do exist, but that is another issue! The questions at hand are (1) did the speaker drive recklessly, and (2) should he pay a fine for it? Another similar example of the red herring is the fallacy known as Tu Quoque (Latin for "And you too!"), which asserts that the advice or argument must be false simply because the person presenting the advice doesn't follow it herself. For instance, "Reverend Jeremias claims that theft is wrong, but how can theft be wrong if Jeremias himself admits he stole objects when he was a child?"

Straw Man Argument: A subtype of the red herring, this fallacy includes any lame attempt to "prove" an argument by overstating, exaggerating, or over-simplifying the arguments of the opposing side. Such an approach is building a straw man argument. The name comes from the idea of a boxer or fighter who meticulously fashions a false opponent out of straw, like a scarecrow, and then easily knocks it over in the ring before his admiring audience. His "victory" is a hollow mockery, of course, because the straw-stuffed opponent is incapable of fighting back.
When a writer makes a cartoon-like caricature of the opposing argument, ignoring the real or subtle points of contention, and then proceeds to knock down each "fake" point one-by-one, he has created a straw man argument.

For instance, one speaker might be engaged in a debate concerning welfare. The opponent argues, "Tennessee should increase funding to unemployed single mothers during the first year after childbirth because they need sufficient money to provide medical care for their newborn children." The second speaker retorts, "My opponent believes that some parasites who don't work should get a free ride from the tax money of hard-working honest citizens. I'll show you why he's wrong . . ." In this example, the second speaker is engaging in a straw man strategy, distorting the opposition's statement about medical care for newborn children into an oversimplified form so he can more easily appear to "win." However, the second speaker is only defeating a dummy-argument rather than honestly engaging in the real nuances of the debate. Non Sequitur (literally, "It does not follow"): A non sequitur is any argument that does not follow from the previous statements. Usually what happened is that the writer leaped from A to B and then jumped to D, leaving out step C of an argument she thought through in her head, but did not put down on paper. The phrase is applicable in general to any type of logical fallacy, but logicians use the term particularly in reference to syllogistic errors such as the undistributed middle term, non causa pro causa, and ignorantio elenchi. A common example would be an argument along these lines: "Giving up our nuclear arsenal in the 1980's weakened the United States' military. Giving up nuclear weaponry also weakened China in the 1990s. For this reason, it is wrong to try to outlaw pistols and rifles in the United States today." There's obviously a step or two missing here. The "Slippery Slope" Fallacy (also called "The Camel's Nose Fallacy") is a non sequitur in which the speaker argues that, once the first step is undertaken, a second or third step will inevitably follow, much like the way one step on a slippery incline will cause a person to fall and slide all the way to the bottom. It is also called "the Camel's Nose Fallacy" because of the image of a sheik who let his camel stick its nose into his tent on a cold night. The idea is that the sheik is afraid to let the camel stick its nose into the tent because once the beast sticks in its nose, it will inevitably stick in its head, and then its neck, and eventually its whole body. However, this sort of thinking does not allow for any possibility of stopping the process. It simply assumes that, once the nose is in, the rest must follow--that the sheik can't stop the progression once it has begun--and thus the argument is a logical fallacy. For instance, if one were to argue, "If we allow the government to infringe upon our right to privacy on the Internet, it will then feel free to infringe upon our privacy on the telephone. After that, FBI agents will be reading our mail. Then they will be placing cameras in our houses. We must not let any governmental agency interfere with our Internet communications, or privacy will completely vanish in the United States." Such thinking is fallacious; no logical proof has been provided yet that infringement in one area will necessarily lead to infringement in another, no more than a person buying a single can of CocaCola in a grocery store would indicate the person will inevitably go on to buy every item available in the store, helpless to stop herself. So remember to avoid the slippery slope fallacy; once you use one, you may find yourself using more and more logical fallacies. 
Either/Or Fallacy (also called "the Black-and-White Fallacy," "Excluded Middle," "False Dilemma," or "False Dichotomy"): This fallacy occurs when a writer builds an argument upon the assumption that there are only two choices or possible outcomes when actually there are several. Outcomes are seldom so simple. This fallacy most frequently appears in connection with sweeping generalizations: "Either we must ban X, or the American way of life will collapse."

"We go to war with Canada, or else Canada will eventually grow in population and overwhelm the United States." "Either you drink Burpsy Cola, or you will have no friends and no social life." Either you must avoid either/or fallacies, or everyone will think you are foolish. Faulty Analogy: Relying only on comparisons to prove a point rather than arguing deductively and inductively. For example, education is like cake; a small amount tastes sweet, but eat too much and your teeth will rot out. Likewise, more than two years of education is bad for a student. The analogy is only acceptable to the degree a reader thinks that education is similar to cake. As you can see, faulty analogies are like flimsy wood, and just as no carpenter would build a house out of flimsy wood, no writer should ever construct an argument out of flimsy material. Undistributed Middle Term: A specific type of error in deductive reasoning in which the minor premise and the major premise of a syllogism might or might not overlap. Consider these two examples: (1) All reptiles are cold-blooded. All snakes are reptiles. All snakes are coldblooded. In the first example, the middle term snakes fits in the categories of both reptile and things-that-are-cold-blooded. (2) All snails are cold-blooded. All snakes are coldblooded. All snails are snakes. In the second example, the middle term of snakes does not fit into the categories of both things-that-are-cold-blooded and snails. Sometimes, equivocation (see below) leads to an undistributed middle term. Contradictory Premises (also known as a logical paradox): Establishing a premise in such a way that it contradicts another, earlier premise. For instance, "If God can do anything, he can make a stone so heavy that he can't lift it." The first premise establishes a deity that has the irresistible capacity to move other objects. The second premise establishes an immovable object impervious to any movement. If the first object capable of moving anything exists, by definition, the immovable object cannot exist, and vice-versa. Closely related is the fallacy of Special Pleading, in which the writer creates a universal principle, then insists that principle does not for some reason apply to the issue at hand. For instance, Everything must have a source or creator. Therefore God must have created the world. What? Who created God? Well, God is eternal and unchanging--He has no source or creator. In such an assertion, either God must have His own source or creator, or else the universal principle of everything having a source or creator must be set asidethe person making the argument cant have it both ways.

FALLACIES OF AMBIGUITY: These errors occur with ambiguous words or phrases, the
meanings of which shift and change in the course of discussion. Such more or less subtle changes can render arguments fallacious.

Equivocation: Using a word in a different way than the author used it in the original premise, or changing definitions halfway through a discussion. When we use the same word or phrase in different senses within one line of argument, we commit the fallacy of equivocation. Consider this example: "Plato says the end of a thing is its perfection; I say that death is the end of life; hence, death is the perfection of life." Here the word end means "goal" in Plato's usage, but it means "last event" or "termination" in the author's second usage. Clearly, the speaker is twisting Plato's meaning of the word to draw a very different conclusion. Compare with amphiboly, below.

Amphiboly (from the Greek word for "indeterminate"): This fallacy is similar to equivocation. Here, the ambiguity results from grammatical construction. A statement may be true according to one interpretation of how each word functions in a sentence and false according to another. When a premise works with an interpretation that is true, but the conclusion uses the secondary "false" interpretation, we have the fallacy of amphiboly on our hands. In the command "Save soap and waste paper," the amphibolous use of "waste" results in the problem of determining whether "waste" functions as a verb or as an adjective.

Composition: This fallacy is a result of reasoning from the properties of the parts of the whole to the properties of the whole itself--it is an inductive error. Such an argument might hold that, because every individual part of a large tractor is lightweight, the entire machine also must be lightweight. This fallacy is similar to Hasty Generalization (see above), but it focuses on parts of a single whole rather than using too few examples to create a categorical generalization. Also compare it with Division (see below).

Division: This fallacy is the reverse of composition; it is a misapplication of deductive reasoning. One fallacy of division argues falsely that what is true of the whole must be true of its individual parts. Such an argument notes that "Microtech is a company with great influence in the California legislature. Egbert Smith works at Microtech. He must have great influence in the California legislature." This is not necessarily true. Egbert might work as a graveyard-shift security guard or as the copy-machine repairman at Microtech--positions requiring little interaction with the California legislature. Another fallacy of division attributes the properties of the whole to the individual member of the whole: "Sunsurf is a company that sells environmentally safe products. Susan Jones is a worker at Sunsurf. She must be an environmentally minded individual." (Perhaps she is motivated by money alone?) Both composition and division are sketched in notation at the end of this section.

Fallacy of Reification (also called the Fallacy of Misplaced Concreteness by Alfred North Whitehead): The fallacy of treating a word or an idea as equivalent to the actual thing represented by that word or idea, or the fallacy of treating an abstraction or process as equivalent to a concrete object or thing. In the first case, we might imagine a reformer trying to eliminate illicit lust by banning all mention of extra-marital affairs or certain sexual acts in publications. The problem is that eliminating the words for these deeds is not the same as eliminating the deeds themselves. In the second case, we might imagine a person or government declaring a "war on poverty." Here the fallacy comes from the fact that a war implies a concrete struggle with another nation. Poverty, however, is an abstraction that cannot surrender or sign peace treaties and cannot be shot or bombed. Reification of the concept merely confuses the issue of what policies to follow. It is closely related to, and overlaps with, faulty analogy and equivocation.

FALLACIES OF OMISSION: These errors occur because the logician leaves out necessary material in an argument or misdirects others away from the missing information.

Stacking the Deck: In this fallacy, the speaker "stacks the deck" in her favor by ignoring examples that disprove the point and listing only those examples that support her case. This fallacy is closely related to hasty generalization, but the term usually implies deliberate deception rather than an accidental logical error. Contrast it with the straw man argument.

Argument from the Negative: Arguing from the negative asserts that, since one position is untenable, the opposite stance must be true. This fallacy is often used interchangeably with Argumentum ad Ignorantiam (listed below) and the either/or fallacy (listed above). For instance, one might mistakenly argue that, since Newtonian physics is not one hundred percent accurate, Einstein's theory of relativity must be true. Perhaps not. Perhaps the theories of quantum mechanics are more accurate, and Einstein's theory is flawed. Perhaps they are all wrong. Disproving an opponent's argument does not automatically make your own argument true, any more than disproving your opponent's assertion that 2+2=5 would automatically mean your argument that 2+2=7 must be the correct one.

Appeal to a Lack of Evidence (Argumentum ad Ignorantiam, literally "Argument from Ignorance"): Appealing to a lack of information to prove a point, or arguing that, since the opposition cannot disprove a claim, the claim must be true. An example of such an argument is the assertion that ghosts must exist because no one has been able to prove that they do not exist. Logicians know this is a fallacy: the absence of a disproof may simply mean that no competing argument or evidence has yet come to light, not that the claim is true. (Both this pattern and the argument from the negative are sketched in notation at the end of this section.)

Hypothesis Contrary to Fact (Argumentum ad Speculum): Trying to prove something in the real world by using imaginary examples alone, or asserting that, if hypothetically X had occurred, Y would have been the result. For instance, suppose an individual asserts that if Einstein had been aborted in utero, the world would never have learned about relativity, or that if Monet had been trained as a butcher rather than going to college, the Impressionist movement would never have influenced modern art. Such hypotheses are misleading lines of argument because it is entirely possible that some other individual would have solved the relativistic equations or introduced an impressionistic art style. The speculation might make an interesting thought experiment, but it is useless when it comes to actually proving anything about the real world. A common example is the idea that one "owes" her success to another individual who taught her. For instance: "You owe me part of your increased salary. If I hadn't taught you how to recognize logical fallacies, you would be flipping hamburgers at McDonald's for minimum wage right now instead of taking in hundreds of thousands of dollars as a lawyer." Perhaps. But perhaps the audience would have learned about logical fallacies elsewhere, so the hypothetical situation described is meaningless.

Complex Question (also called the "Loaded Question"): Phrasing a question or statement in such a way as to imply that another, unproven statement is true without evidence or discussion. This fallacy often overlaps with begging the question (above), since it also presupposes a definite answer to a previous, unstated question. For instance, if I were to ask you, "Have you stopped taking drugs yet?" my hidden supposition is that you have been taking drugs. Such a question cannot be answered with a simple yes or no. It is not a simple question but consists of several questions rolled into one. In this case the unstated question is, "Have you taken drugs in the past?" followed by, "If you have taken drugs in the past, have you stopped taking them now?" In cross-examination, a lawyer might ask a flustered witness, "Where did you hide the evidence?" or "When did you stop beating your wife?" The intelligent procedure when faced with such a question is to analyze its component parts. If one answers or discusses the prior, implicit question first, the explicit question may dissolve. Complex questions appear frequently in written argument. A student might write, "Why is private development of resources so much more efficient than any public control?" The rhetorical question leads directly into his next argument. However, an observant reader may disagree, recognizing that the prior, implicit question remains unaddressed. That question is, of course, whether private development of resources really is more efficient in all cases, a point the author skips entirely and merely assumes to be true without discussion.
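As noted above, the two omission patterns that trade on missing evidence can also be written schematically. Again, the notation (P for the disputed claim, A and B for the rival positions) is our own illustrative shorthand rather than part of the handout:

\[
% P = disputed claim, A and B = rival positions (illustrative abbreviations only)
\begin{aligned}
&\textit{Argumentum ad ignorantiam}\text{ (invalid):} && \neg\,\mathrm{Disproved}(P) \ \nvdash\ P \\
&\text{Argument from the negative (invalid):} && \neg A \ \nvdash\ B \quad \text{(unless $A$ and $B$ exhaust the possibilities)}
\end{aligned}
\]

The parenthetical caveat is the whole point: only when two positions genuinely exhaust the options does refuting one establish the other, and even then the refutation itself still has to be supplied.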
