
Vigilant Listening

The Importance of Developing Critical Listening Skills

contributed by Kevin N. Daniel

In order to avoid being taken in by slick presentations, unsound reasoning and subtle falsehood, it is important to acquire an understanding of the persuasion techniques commonly employed. Listed below are certain methods that have been, and still are, used by speakers to convince, and often mislead, individuals and groups. Unfortunately, modern listeners seldom have any educational background in logical criticism or oratorical technique, and are all too often vulnerable to and unaware of the traps being laid for them. Though this article focuses on developing listening skills, the same critical processes used to analyze the validity of spoken material may also be employed in reading or self-examination.

As a listener, it is important to think and apply the same good sense you use in everyday survival to every area of your life. It is tragic that so many choose to set aside their reasoning skills in large, important areas of life and, instead, settle for being manipulated—for uncritically adopting someone else's pre-packaged viewpoints. It may be easy, comfortable and pleasant to allow others to do your thinking, but it is neither wise nor commendable. Since this is neither a book on logic nor a public speaking primer, the main focus will be on flaws, both on the speaker's part and in the way audiences behave.

Sociologists have shown in study after study that people tend to be influenced more by how something is said than by what is actually said. We tend to respond emotionally instead of logically, particularly in a group listening situation where there is neither debate nor opportunity to question.

The main goal of any kind of public speaking is to lead an audience to adopt the speaker's preconceived view, in part or in total. Think about the terms you've commonly heard used to describe polished speakers: "Captivating," "motivating," "mesmerizing," "enthralling," "moving," "charming," "seductive," "enchanting" -- all of which refer to being under the control of the speaker; "Dynamic," "profound," "great," "deep" -- which connote the speaker's authority and superiority in relation to the audience; "Awesome," "divine," "charismatic," "wonderful," "phenomenal," "spectacular," "amazing," "extraordinary," "marvelous" -- which describe the speaker as having superhuman ability; and/or "Fantastic," "unbelievable," "unreal," "fabulous," "incredible" -- the original definitions of which would expose the speaker as being too polished to be true or trusted. While many con-artists find rich pickings in the use of oratorical and logical tricks, this is not to suggest that every speaker who misleads his/her audience does so intentionally. But, unfortunately, as long as audiences respond to being manipulated, there will be those who will manipulate. Nevertheless, in order to retain any standard of sound thinking, the observant listener must never lose sight of the speaker's objective (whatever his/her motive) and resist being "taken captive." Ideally, we should be particularly wary of those speakers with whom we are most inclined to identify and agree.

Although many of the following principles can be used to identify flaws in the arguments presented in debates or discussions, they should come into play most forcefully, as necessary tools, in listening to monolog presentations. Debate and discussion, unless venally contrived, by their very nature tend to present more than one side of an issue and to air objections to, and flaws in, the arguments presented. These safeguards are completely lacking when a speech is delivered from the dais, pulpit or rostrum. Monolog oratory inherently raises the speaker to a position of authority, both physically, by putting him/her in front of and above the audience (standing or on a platform), and psychologically, through the lack of challenge on the part of the "awed" listeners.

Modern society and education have generally neglected inculcating critical listening skills, with results that are everywhere evident and predictable. Politicians no longer debate issues, but react to constituent predispositions and emotions; the religious do not seek or follow God, but rather artful oratory and platitudes; moral standards are discarded in favor of situational ethics; absolute truth has been replaced with circumstantial plausibilities; individuality and independence have mostly become empty clichés, masking conformity and subordination to peer values. And people are content with this garbage! Most members of any audience attend because they expect that what they've come to hear will affirm their basic prejudices, beliefs and values, and that they will get some sort of enjoyment from the process. The lecturer is quite aware of this, and knows that he/she must please the listeners in order to lead them into accepting his/her propositions. And the fact that people usually operate under the assumption that those who are misled into wrong beliefs are always members of some other group makes the lecturer's job all the easier. While there is always a great, natural reluctance to question ourselves and those whom we allow to take positions of authority over us, lack of examination does not lead us closer to truth, no matter how content we may become in our delusions.

Hopefully, by being aware of some of the methods used, you will be better prepared to defend yourself against those who, intentionally or otherwise, peddle error to the unsuspecting and irresponsible.

Preliminary Questions

Before anyone submits himself/herself to the influence of a speaker, two questions should be answered regarding the reasons for giving up control and the importance of any effects the experience might produce.

  1. Do I want to place myself under this speaker's influence ...
    • To be entertained?
    • Because my interest has been piqued? (advertising)
    • To be informed? (authority figure on subject)
    • To be emotionally moved?
    • Because I'm frugal? (it's a free lecture or I've paid for tickets)
    • To avoid reality? (comforted by reaffirmation of your views)
    • To be liked? (approval, win friends)
    • In expectation of a reward?
    • To show solidarity with speaker/leader? (political rally)
  2. Do I regard the potential import of the experience on my life as ...
    • Possibly enhancing my life?
    • Interesting but trivial?
    • Merely amusing?
    • A life or death matter?
    • Useful, but not vital?
    • Perhaps affecting my outlook or relationship with others?

The answers you choose should determine how much control you are willing to surrender and how much credulity to extend. The more important your reason for attending, the more you must be prepared to critically examine what you are being told. You are placing yourself in a situation where you are allowing others to manipulate you (for good or ill) and must make a conscious point of exercising discernment.

You don't necessarily need to be as certain that a speaker is making accurate, logical and valid statements if your sole purpose is to listen for entertainment (to a comedian for instance) where the import of the experience consists entirely of transitory amusement. On the other hand, if the subject deals with survival skills, religion or your finances, it would be reckless to neglect a careful examination of the truth, logic and quality of the speaker's statements.

If you have decided that it will be profitable to attend a discourse on some important subject, you must be prepared to examine how the main point or points are supported and the validity of each justification given. This implies some knowledge and preparation on your part. Half-truths and unsound logic are acceptable substitutes for facts and sound reasoning only in fairy tales or fiction.

Be Wary of Oratorical Techniques

Many of the devices used by speakers can be found in books and courses dealing with propaganda, rhetoric, debate, and public speaking. You may also be able to catch many techniques by dispassionately dissecting advertising materials, political arguments, etc. The following does not purport to be an all-inclusive survey of methods used.

Many of the methods listed below are used (either singly or in combination) by speakers to increase their effectiveness, that is, to more easily convince the audience that the position being promoted is reasonable and true. The problem here is that none of these techniques actually supports the validity of any speaker's argument; in fact, they are only used to disguise weaknesses and falsehoods by getting the audience to accept what is said uncritically. Instead of informing and getting truth across, such a speaker settles for inflaming or deceiving the listener. The audience leaves, perhaps thinking that something has been learned and proven, yet in reality, they are probably as ignorant of the truth as when they arrived, or more so. Some of the techniques of delivery covered in this section have counterparts under the subsequent heading dealing with logical fallacies. An oratorical trick is used to disarm an audience's exercise of its powers of discrimination. The logical trap is a system to introduce, disguise or justify falsehood.

Sophistry— This is the use of specious, yet ultimately erroneous and/or misleading, argument in a display of the speaker's capacity for ingenious reasoning and ability to manipulate the audience with clever arguments. One who uses sophistic tricks is more concerned with the overall effect produced in the audience than in the accuracy and validity of his or her arguments. An expert "sophist" may be a dazzling, moving speaker; but the message delivered is actually empty or fraudulent.

Pre-empting Objections— This procedure is often used by cults to provide answers to valid criticisms of the position/doctrine. This insulates the cult member from outside influences, and usually helps to alienate the member from those outside the group. Thus dependence on the group represented by the speaker is reinforced.

Citing Established Prejudices and Error as Fact— This identifies the speaker with the audience and often plays to audience insecurities about the validity of their tenets. Assumption that a prejudice or widely believed error is valid may please an audience, but annuls the argument and calls the speaker's integrity into question.

Audience Agreement Solicited— An effective speaker will often research group prejudices and use this information to formulate a series of statements with which the audience will readily agree. Directly or indirectly, some sort of confirmation will be expected (a show of hands, dead silence, an "amen," nodding heads, a laugh, etc.) to reinforce group conformity. This makes it seem to the individual audience member that the speaker has the entire support and agreement of the audience, making it extraordinarily difficult to dissent (see Group Dynamics heading below). Once the speaker has the audience in a pattern of agreement, he/she can easily introduce and peddle controversial or heretical viewpoints that the audience would otherwise be averse to tolerating. Since it is assumed that the whole group agrees with the speaker, rarely does anyone dare to object, thus strengthening the effect.

Use of "We"— Sometimes referred to as the "royal we," this venerable device allows the orator to speak for the audience and imply agreement. The audience usually follows along (see Audience agreement above). When used to excess, this appears slightly dated and ridiculous, but it is still commonly and very effectively employed when used subtly and in moderation.

Fast Talk— Humans need time to examine what is being said. If one loses the ability to apprehend the soundness of the speaker's supporting premises, the tendency is to accept them as valid by default, especially if the speaker has established himself in the eyes of the audience as an authority figure or is perceived as having the agreement of the group.

Ritual— A repeated sequence of events is known as ritual. This need not have religious overtones; the format of a meeting or the way a meal is taken can be considered rituals, and the idea extends to repetitive phrases. Ritual inevitably produces monotony which, over time, exercises an almost hypnotic effect. Most people who drive the same route to work or market have experienced times when they cannot remember whether they stopped at a particular light or noticed other important features during a portion of the trip. This occurs because the routine has become so familiar (a ritual) that the brain slips into an "automatic" mode which trivializes important, repeated occurrences and can make the extraordinary commonplace. The mind in such a situation discards stimuli it has recorded from past experience. The danger here is the assumption that warning beacons are being noted when, in fact, they may not be coming into play at all. Hence, error can slip unnoticed into an individual's conclusions.

Praise/Flattery— Congratulating or praising the group or individual is an effective way of stifling dissent. Most people feel disloyal in criticizing someone who has held them in high regard, and vaguely sense that condemning a speaker who has praised them publicly will inevitably seem to reflect on their own credibility. An example of a listener's thought process might be: The speaker commends me, I accept the commendation, therefore the speaker and I agree. If I criticize the speaker, I am criticizing those who agree with the speaker; therefore, I am seen to criticize myself. Our vanity predisposes us to take the easy road and dispense with our objections, thus not tainting the praise or the praise-giver.

Stereotyping— This is a useful speaker's tool which is often misused. Reducing complex situations to easily grasped concepts, speakers will often generalize categories and groups as holding characteristics in common. This, at best, falls short of complete accuracy and can easily be used to mislead. Stereotypes should be identified and avoided by the listener, as they almost always set up a false premise. The more important the subject at hand, the more necessary it is to root out these categorizations. Acceptance of a stereotype as being true is known as a prejudice. These generalizations are often used to reinforce group solidarity by setting up agreement in opposing others (to whom "bad" characteristics are usually attributed) who supposedly disagree. This produces an us-versus-them reaction which can be used to slander other tenets supposedly held by the opposition the speaker has invented.

Emotional Delivery— Anger/hate/pity/shame/love/fear. Emotions are not facts, yet can be used in place of facts by a skilled speaker. A highly emotional appeal can move individuals and groups to abandon their discrimination. The emotion evoked becomes the "message" and the audience takes in the speaker's points indiscriminately. This can be a particularly effective trap in group situations. Usually, the speaker (who will have already established a pattern of audience agreement) will initiate the response by a verbal display of the desired emotion, counting on the listeners' empathy to reflect his/her prompting. The response is enhanced by fellow audience members. Thus an emotional, not logical, sequence produces the effect and affirmation of the speaker's position. Such contrived, yet shared, emotions often leave the audience with a sense of common purpose or "brotherhood" quite independent of the message delivered. Many wildly popular speakers ranging from television preachers to Adolf Hitler have displayed a well-developed talent for producing emotional responses in an audience at will (from compulsion to hysteria).

There is a correct use for emotional content in public speaking, but it has no place in the process of arguing a case or informing. Once the facts have been logically and unimpeachably established, then an emotional appeal can be a powerful motivator to action. It is unfortunate that few speakers limit emotion to its proper place, and instead substitute it for facts, use it to disguise false or vapid reasoning, or use it to lull the audience into complacent acceptance. Substituting propaganda for rational reasoning and emotional appeals for proof is a technique used by speakers who regard the audience as little more than children incapable of understanding, and/or who wish to keep them in an ignorant or deluded state.

Ridicule— Often used (humorously or not) to reinforce stereotypes (see above). Can also be used to suppress dissenters by identifying them with groups holding views antagonistic to those held by the audience. Ridicule, which relies on supposition, prejudice and group influences, is not the same as a logical refutation. Sometimes, ridicule will be slipped into a positive assertion: e.g., "No one in his right mind would support the mayor's expensive highway proposal." In this example, opponents are personally tainted as to their mental competence, when the issue at hand is actually the highway proposal.

Glib Generalities— The assumption that, because one or more premises in an opposing view are wrong, all the other supporting premises in that view are automatically invalidated is an easily employed stereotyping device (see above), particularly in monolog situations where rebuttal or defense by a member of the opposition is not allowed. Actually, all that disproving one or more supports shows is that an argument has been shakily constructed. Conversely, just because it has been shown that one or more pillars of a position are unquestionably valid, this of itself does not warrant acceptance of either the position or its other supporting premises. We humans can be lazy thinkers, particularly when we are rushed to form conclusions without adequate examination. It is all too easy to latch onto one proven fallacious support and presume that both the conclusion and its other supports are also false.

Rhetoric— A method of speaking and writing characterized by the use of figures of speech, flowery language, and/or exaggeration. As with the sophist, the speaker using rhetorical techniques is more concerned with producing an effect in the audience than with making an accurate and logical presentation.

The Rhetorical Question— A rhetorical question is a device used to produce an effect in the audience, and is not intended to prompt an answer. This ploy is widely used to elicit audience agreement (see above) or an emotional response. Rhetorical questions often presuppose apparent, prejudiced or previously supplied responses on the part of the audience. This allows the audience to feel good about being "right" and the speaker to look good for backing up the audience's viewpoint. Other than show, this ploy adds nothing of substance to the speaker's argument and should be discounted. Unanswered questions say nothing.

Outrageous Statements— A strategy which is used to introduce error or get the audience to accept a "lesser evil." Typically, the outrageous statement will be followed up by a premise the audience would not otherwise be prepared to accept, but couched in vague and/or reassuring terms. Skillfully employed, this causes the audience to lose discrimination and sense of proportion. The outrageous statement itself loses its impact when repeated, and may itself eventually be rendered palatable.

Humor— In approaching strange situations or persons, people are naturally cautious. Humor is a recognized method for getting audiences to drop their natural defensiveness. Amusing anecdotes prompt the audience to identify with the speaker.

Rhyming, Rhythmic and/or Monotonous Style— An ancient mnemonic device which can have much the same effect as ritual (see above) in a shorter time frame. Hypnotists use this device for gaining the confidence of patients and implanting subconscious messages. If the speaker is subtle enough, one may not realize the use of this technique until one detects it in a parody, children's parroting, or a foreign speaker.

Bait-and-Switch— A notorious ploy, which can nevertheless be successful in the hands of a skilled technician, this is to present an attractive proposal the speaker knows is desired by the listeners and then to subtly denigrate it to the point where a less palatable (or less valid) point of view can be accepted by the audience. E.g., feeding starving children may be the theme of a fund-raiser, but the speaker may wish instead to convince the donors to fund a new organizational headquarters building on the premise that the foundation cannot reach more starving children without the office space.

Rationalization— Explaining away logical errors or weak points detrimental to the speaker's case by using invalid comparisons or constructing twisted arguments. Although rationalization can be sold to an audience, its appearance will be avoided by most speakers—fraudulent proofs usually being more effective, simpler to present and easier to disguise. Having to offer excuses for a position, instead of presenting unassailable supporting facts, is a strong signal that the speaker's case is very seriously flawed.

Jargon— Overloading the audience with undefined concepts and terminology (jargon) is often used in weak portions of an argument. A speaker should avoid use of shorthand terms with which the audience is unfamiliar. When esoteric abbreviations, words or phrases must be used, the speaker has a responsibility to define these usages. Jargon is usually employed sparingly in order to disguise weaknesses (after all, it's hard to refute what you don't understand) or make the speaker look knowledgeable. Over-use is normally avoided as this may cause the audience to become distracted or nod off.

Wandering Definitions— Words should be used in the same sense of meaning throughout a discourse. When the same term is used to indicate different things or concepts, the audience becomes bogged down as it soon becomes impossible to examine what is being said. E.g., a cleric might use the term "divine" to describe godhood at one point and later refer to Biblical passages in which angels are called "divine beings," confounding the first point.

Use of Invalid Analogy— An analogy should compare two entirely equal cases. Thus the properties of one object can be extrapolated from the actions of an identical object performing under identical conditions. This is terribly easy to abuse. Many objects or conditions which seem the same are often subtly different, thus rendering the analogy false. Be very careful of being persuaded to draw any conclusion from such fault-prone presentations!

Analogy is often used to introduce a pattern of reasoning. The example is not part of the argument, nor is it a proof. It also does not necessarily mean that the subject at hand can be proven in the same manner. Analogy is sometimes used to suggest that reasonable alternatives to certain statements or conditions might exist, without proving that the alternatives represent actual situations or possibilities.

Use of Anecdotal Instead of Hard Evidences— Faulty logic and lack of supporting evidences can be effectively disguised by telling a story in order to produce the same effect in the audience as would the speaker's argument, had it been valid. By introducing a contrived comparison between the faulty argument and a plausible story or example, a speaker avoids exposing the erroneous assertion to examination. Factual arguments should stand on their own logical merit and evidences, not on example stories.

Similarly, a real-life occurrence is often used as evidence that an argument is true. This is a form of faulty generalization—arguing that because something occurred when a certain factor was present, that factor must have been instrumental in producing the occurrence. E.g., false medical claims are often based on instances of people feeling better or getting well when they follow a certain regimen. However, to prove that the regimen cured the disease, science demands rigorous double-blind testing showing that more people improve under the actual treatment than under a placebo.

The "Straw Man" Ploy— Setting up a weak opposing view that can easily be destroyed by the speaker is an old debater's trick that sidetracks the audience from the question at hand, and makes the speaker look good. This enhances the speaker's perceived authority and suppresses objections.

Audience Plants— Where institutionalized, this ploy is also known popularly as the "cheerleading section," "amen corner," etc. It is a method of getting audience agreement (see above) where the disposition of the audience may be unknown, skeptical or hostile. It is also used to reinforce solidarity in an audience which is favorably disposed. By instantly providing agreement responses, these agents give the impression of agreement and stimulate others to follow suit. People in group situations are unlikely to object and risk calling down the wrath of the group, especially when outnumbered. It is far easier to conform with the perceived ideals of the speaker and group. Take note of who starts the applause, the agreeing, etc.

Demagoguery— This ploy is represented by an often emotional, often loud appeal to audience preconceptions, prejudice and emotions (see Audience agreement above).

Red Herring— When a weak or false position is vulnerable to exposure, a closely related story or subject will be introduced to throw the audience off track and lead them onto some other train of thought. The term derives from the practice of dragging a fish across a trail to divert bloodhounds or hunting dogs from discovering the quarry.

False Humility— Some speakers will cite their poverty, adopt a homey or ignorant demeanor, etc. in an attempt to enable the audience to more readily identify with ("relate to") him/her. The speaker will usually adopt an authoritative tone or attitude later in the presentation in order to drive home the real message.

The Big Lie— This term comes from Nazi propaganda—it was described in Hitler's Mein Kampf and exploited by his propaganda minister, Joseph Goebbels—and refers to repeating a grossly false conclusion and its contrived supporting premises so often that they are accepted as factual. At a future point, these newly minted "truths" can be called into play to support actions or even greater lies. This can be seen in cult groups which, despite clear evidence that the tenets are false, accept them "on faith" once they have become ingrained in the members' thinking through "brainwashing" techniques. Note, however, that any lie must contain at least a bit of truth, or it will not be seen as plausible—in fact, a lie can consist entirely of truisms stated in a logically twisted way.

Double Talk and Double Speak— This is speech which seems to be meaningful and in earnest, but is actually a conglomeration of sound reasoning, nonsense and contradiction. It is often inflated, ambiguous and intricately constructed.

Newspeak— Similar to double talk, this Orwellian term describes the practice of substituting soothing, positive and/or ambiguous terminology for unpleasant or disturbing fact. The specific object is to prevent opposition and to limit the scope of (or prevent) audience thought. E.g., in the 1980s, the Pentagon requested funding not for neutron bombs, but for "radiation enhancement devices."

Mercenary Statements— Statements that are perfectly true can be used in an illogical fashion to leave the listener with an entirely false impression. In itself, this type of statement is useless unless the listener inserts the desired connotation. E.g., "Last month Susan didn't steal anything at all." —implying that she is used to stealing. Or how about "Our church doesn't condone human sacrifice" —implying others do.

Interpretation— It is nearly impossible, in restating someone else's point of view, not to insert a different emphasis or twist the meaning. It should be obvious that reference should be avoided to any statement that cannot stand on its own logic and factualness. If a statement attributed to another is soundly framed and factual, it should be allowed to stand as is; to interpret it is to put words in another's mouth. A speaker who gives his/her version of what another authority has said commits a couple of logical errors up front: avoiding proof of the position to be proved, and/or citing that person's opinion in an excerpted, bastardized form which does not permit audience examination of the validity or truth of the original argument. In any case, the interpretation prejudices the audience to adopt a certain view or reading of what the original actually said (see Pre-empting objections above).

Be Wary of Logical Traps

To use logic is simply to examine the adequacy of the proof backing up an assertion. We are all logical thinkers in the main. Were we not, we would be incapable of making informed decisions and would probably be institutionalized. We have, however, all developed bad thinking habits.

First, a few definitions of terms used here. The term argument describes the steps or process used to reach a conclusion. Arguments are either valid or invalid. A conclusion is a statement supported by reasons. Reasons (also premises, evidences or assumptions) are facts presented as proof. In a longer argument, a premise may consist of a previously proven conclusion. Statements of reasons or conclusion are true or false. In an argument, the reasoning must be valid and the supporting facts must be true in order for the case to be acceptable.

Let's look at two simple arguments, the first logical and the second flawed:

  1. All mammals are animals; a zebra is a mammal; therefore, a zebra is an animal. (valid)
  2. All zebras are striped; X is striped; therefore, X is a zebra. (invalid)

Do you see how the structure of the logic makes the difference between a case that is valid and one that only seems to be valid? In the second argument, because we suspect the conclusion is probably a true statement, we might be inclined to accept the faulty argument and see it as reasonable. The invalid conclusion might seem to work if we substituted "zebra" for X, but it doesn't stand if we substitute "woolly worm," which is also striped. An illogical argument does not prove anything, whether it seems to or not.
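The zebra/woolly-worm substitution can be sketched in a few lines of code. This is a hypothetical illustration only; the sets and function names are invented for the example, not taken from the text:

```python
# Invented example sets for illustration.
mammals = {"zebra", "dog", "whale"}
animals = mammals | {"sparrow", "trout"}          # all mammals are animals
striped = {"zebra", "woolly worm"}                # both happen to be striped

def valid_form(x):
    """Valid syllogism: all mammals are animals; x is a mammal; so x is an animal."""
    if x in mammals:
        return x in animals                       # conclusion follows necessarily
    return None

def invalid_form(x):
    """Affirming the consequent: all zebras are striped; x is striped; so x is a zebra."""
    if x in striped:
        return "zebra"                            # conclusion does NOT follow
    return None

print(valid_form("zebra"))          # True: sound structure, true premises
print(invalid_form("zebra"))        # "zebra" — happens to be right, by luck
print(invalid_form("woolly worm"))  # "zebra" — wrong: striped, but not a zebra
```

The invalid form can produce a true-looking answer for "zebra," which is exactly why it tempts us, yet the same reasoning fails for "woolly worm": the structure, not the lucky outcome, is what makes an argument valid.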

It is quite possible for a fallacious argument to use sound logic and be confounded by one or more false facts. Equally, a fallacy may consist of true premises, but use unsound logic. Or both logic and facts may be wrong.

If, on examination, a statement seems prone to being exposed as false or without factual support, a speaker may resort to twisting the logic of the argument so that the premise appears to be valid. These traps will be found in the method the speaker uses in drawing conclusions. Fallacious reasoning is integral to many of the oratorical techniques listed in the previous section. Aside from the common deceptions listed below, books and courses on logic can also help you identify contradictions and logical flaws.

To quote Lionel Ruby's The Art of Making Sense, "It is difficult to think well in fields which involve our emotions and self-interest. We often simply forget that we ought to exercise our critical powers. We become dogmatic, and make positive and arrogant assertions without proof. We may become blind fanatics, and stop thinking altogether. We become blind followers of authorities, without ever inquiring as to whether their pronouncements can be justified by the evidence."

Common Logical Errors in Arguments:

Guilt by Association— Citing mere common characteristics does not prove that two cases are identical. E.g., "John believes workers should have a safe work place, be paid a fair living wage, not be exploited by employers, and have job security; therefore, John must support the union's demands, since all union members hold these same ideals."

False Premise — This is an assumption which is introduced to the audience as fact and is either false, not proven, or backed up with unsound reasoning or false statements. The speaker goes on to use this as evidence in support of another premise. E.g., "Cynthia supports a woman's right to choose whether or not to terminate her pregnancy; the Bible states that abortion is murder and a sin; therefore, in God's eyes, Cynthia has sided with those who condone committing murder and sinning."

Circular Argument— When an initial premise is supported by a second, and in turn the second is backed up by a third, and the third is backed up by the initial statement, then the speaker is (overtly or secretly) attempting to prove his fact by itself. This is a presumption that the very conclusion being argued is proven while trying to prove it. E.g., "Our doctrines are true because our leader teaches them; our leader's teachings can be trusted because he is inspired; and we know he is inspired because our doctrines say so."

Accidental Fallacies— If a special case or condition is cited in an argument, and then used as a general rule to support a conclusion, the argument is invalid: e.g., "The boiling point of water at sea level is 100 degrees Centigrade; therefore, water boiling in Yellowstone's geysers must have reached 100 degrees Centigrade." (Actually, the boiling point varies according to elevation: it is lower at points above sea level.) Conversely, the argument is also invalid if a general rule is cited to prove a special case: e.g., "Drugs are beneficial to mankind; cocaine is a drug; therefore cocaine is beneficial to cocaine addicts."
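The boiling-point claim can be sanity-checked numerically. The sketch below uses a rough rule of thumb, not an exact physical law: water's boiling point drops roughly one degree Celsius for every 300 meters of elevation gain (the 300 m figure, the function name, and the elevation used for Yellowstone are illustrative assumptions valid only for modest altitudes).

```python
def boiling_point_c(elevation_m):
    """Approximate boiling point of water in degrees Celsius.

    Rule of thumb: about 1 degree C lost per ~300 m of elevation.
    Real behavior follows vapor-pressure curves; this is a rough sketch.
    """
    return 100.0 - elevation_m / 300.0

print(boiling_point_c(0))                # 100.0  (sea level)
print(round(boiling_point_c(2400), 1))   # 92.0   (roughly Yellowstone's elevation)
```

Even this crude estimate shows why the "sea level" special case cannot be generalized: at Yellowstone's altitude, water boils several degrees below 100 C.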

Presumption— This is a general classification of all fallacies in which a premise either avoids proving the issue at hand or secretly tries to use the conclusion (the thing to be proved) as evidence. This is also referred to as a "material fallacy" as opposed to a "formal fallacy."

Appeal to Awe— This fallacy of relevance cites the opinion of respected "experts" instead of proving the reasoning of the speaker. Just because "Mr. Big-wig says thus-and-so" is no reason to assume that an argument is valid, unless proofs are offered to back up Mr. Big-wig's statements. Three closely related fallacies:

  1. quoting conclusions from another source without (or with only vague) attribution—as this also denies opportunity to examine the reasoning behind the borrowed premise (e.g., "Four out of five doctors said that Zoe's pills help...");
  2. citing a tradition as proof (e.g., "All I can tell you is that it was good enough for great-granddaddy Smith ..."); and
  3. use of testimonials as evidence—in order to be suitable for inclusion in an argument, the testimony must first be proved just as would any other assertion. Competence and lack of prejudice on the part of the testifier must also be addressed here. (E.g., "Since I started coming to these meetings, I've been blessed with riches beyond compare. I've never been happier. I know this is right!")

Appeal to Ignorance— This second fallacy of relevance is sometimes referred to as "Negative Proof." It often appears in the forms "This is true because no one can (or has) shown otherwise" and "Since this thing has not been shown satisfactorily to be true, its reverse must therefore be true." An "unknowable" proviso lends no support to any argument. Showing that something is not conclusively established merely shows that it has not been confirmed, and this cannot, of itself, be used to justify anything else. E.g., "We can accept the story about George Washington and the cherry tree, because you haven't shown us that it didn't happen." or, "Since it hasn't been proven beyond doubt that there is a God, we are warranted in the assumption that God does not exist."

Appeal to the People— A third fallacy of relevance offers the holding up of popular ideals (such as liberty, or fairness) instead of offering logical reasons for a conclusion. E.g., "This hallowed ground, sanctified by the blood of freedom-loving patriots and the tears of bereaved motherhood, must never become a forum in which those critics who challenge our common ideals are allowed to exist and spread their filthy lies."

Appeal to Pity— A fourth fallacy of relevance consists of winning the audience's sympathy for the speaker's case instead (again) of offering reasonable arguments consisting of proven fact: e.g., "Mother Elsa hasn't a cent to her name, she spends hours scrubbing floors in tuberculosis wards, she gives half of her meager rations and income to help support her ailing sister, her back is bent from carrying buckets of cement to complete the new orphanage. Surely this saintly woman, who has suffered so much for others, would never fraudulently solicit legacy bequests to benefit herself. And what will become of her and those who depend on her if she has to go to prison?"

Personal Attack— Instead of providing proof for the position taken, the speaker turns the issue into an attack on the character or conduct of an opponent, e.g., "Senator Smythe would have us increase funding for school lunches: this from a man notorious for chasing skirts all over Washington, yet who supported equal rights for women; a man who was investigated for taking kickbacks from contractors in 1985, yet who voted for the latest ethics bill; a man who opposed waste regulations while owning one of the world's largest landfill operations. My friends, Senator Smythe is simply not credible." This can also be used to intimidate the opposing side, casting an aspersion instead of providing valid counter-arguments, e.g., "Anyone who would ask such a thing should be ashamed!" Sometimes, a speaker will attempt to disqualify an argument from an opponent by referring to circumstances which might incline the opponent to take that position, e.g., "Jane has been pointing to serious error in our church. Jane's points should be dismissed because she left our church when she could not submit to our leadership. She is motivated by bitterness and a bad spirit, so of course finds fault with us." Note that the latter argument does not address or refute the charges brought forth, but merely attempts to blacken Jane's motives, which, in any case, are not relevant to the issue at hand. Finally, a speaker may attempt to shift the perception of guilt back onto an opponent by charging him/her with a similar offense (sometimes implying hypocrisy). E.g., "I believe that the B-4 bomber base proposed for my district should be kept in the budget. I can't understand why Representative Vorn has a problem with this, since his district has benefited from billions in pork barrel defense projects over the past ten years."

Threat— The last error in relevance dealt with here consists of using an implied or explicit threat in place of reasoned proofs. E.g., a cleric might say: "We're doing things in God's only way. If you don't believe that, you might as well leave us and enter the separation and damnation deserved by those who malign our Truth."

Unfair Questions— Sometimes, more than one pertinent question will be combined to form a single query in such a way that the single answer required is inadequate. E.g., "Do you like vegetables?" (Carrots yes, but not Brussels sprouts.) A question can also be framed in the form of a "loaded question" which leaves a false implication no matter what the answer: e.g., "Did John ever stop cheating on his exams?" —any answer given sounds like John cheated.

False Alternatives— Sometimes the speaker gives choices which are inadequate and present a contrived dilemma. E.g., "A man can either support my government's foreign policy or he is a traitor." —the reality might lie somewhere in between, but has not been presented to the listener as a possibility.

Inconsistency— A position supported by premises which are directly or indirectly contradictory or false is invalid unless and until true explanations reconcile or replace the conflicting items. E.g., "We accept Biblical passages which state that God will directly teach all believers, and that there are no longer human mediators between God and men. It is obvious, however, that in the real world God must teach us through our (inspired) ministers, and that it is through them that we have knowledge of Him. Thus, in a way, we see God teaching His people in the way He said He would."

Note also that deductions cannot validly be formed unless the logical approach used in forming the conclusion is consistent.

"Hypocrisy" or a "Double Standard" results from drawing conclusions based on evaluation or judgment of two or more things or groups according to differing, inconsistent standards: e.g., "Television is evil, because it introduces worldly influences into the home." —yet the same speaker might read newspapers, listen to radio, have a VCR, read steamy novels, subscribe to magazines, occasionally even hire a set or rent a room with a T.V., etc. In the above example, the speaker does not judge television by the same standards used to judge other media. The idea that a person who is able to exercise discrimination in choosing and reading news and magazines could not exercise the same self-control with regard to television has not been shown, and in fact may be contradicted by the actions of the speaker. And it is all too easy to demand that others adhere to higher standards than we require of ourselves or of people with which we identify. If we rationalize reasons for violating any standards ourselves, we must excuse others also: e.g., "If any among you is without sin, let him cast the first stone."

Non Sequitur (literally, "it does not follow")— This term can be used to classify a broad range of fallacious statements, however, the reference is usually reserved for instances in which there is no connection between the reasons put forth and the conclusion. E.g., "Dave fell out of a large tree last week when a branch broke; the tree was a fast-growing Chinese Elm; thus, Chinese Elms are prone to disease."

Equivocation— Basically, equivocation is the use of the same term in two or more different ways in different parts of the argument: e.g., "Mary is smart [stylish]; Smart [intelligent] people do well in school; therefore, Mary does well in school." Figures of speech (or word pictures) lend themselves handsomely to this fallacy: e.g., "Since he stopped gambling, Joe's been on an even keel—what's he been doing on a boat all this time?" A related error can easily occur when a speaker uses words in an unorthodox, esoteric manner. If the speaker uses terms to which he/she attaches obscure or novel definitions which the listener cannot be expected to recognize, then the listener cannot accurately examine or argue with such statements. Thus both the speaker's argument and conclusion are rendered fallacious since the speaker has not communicated to, or has been misinterpreted by, the audience.

Composition— This is an inference that the whole of something has the same attributes as one of its parts: e.g., "The grapes are good this year; thus a crop consisting of good grapes must be a good crop." —not if only five clusters fruited it isn't! Similarly, a too hasty generalization can be drawn about a whole group from limited or insufficient evidence: e.g., "Students born in Burma delivered the valedictory and salutatory addresses at this year's graduation. All of the Burmese students in our class graduated in the top ten percent. Obviously, all Burmese are highly intelligent."

Division— This is the opposite of the composition fallacy and contains the assumption that because the group as a whole has a quality, each member of the group possesses the same quality: e.g., "The grape crop is very large this year; therefore the crop must consist of very large grapes." Neither should it be presupposed that a general rule can be applied to a specific or exceptional case: e.g., "All bureaucracies are inefficient; Mary is a bureaucrat; therefore, Mary must be inefficient."

False Causes— If the several reasons used in an argument can be used to draw contradictory conclusions, the argument is invalidated, even if it is apparently valid as initially presented. This is easily seen in arguments where the validity of the conclusion rests on the sequence in which the facts were presented. The fallacy of false causes also covers situations where an event is misattributed to an unrelated cause: e.g., "A black cat crossed my path last week, and ever since I've had terrible luck; therefore my bad luck was caused by the black cat crossing my path."

Ambiguity— Generally, this refers to a premise in which what is meant has not been clearly stated. A common use of ambiguity is in propounding a statement which manages to say contradictory things at the same time (sometimes prompted by use of a valid word in a metaphorical or figurative sense): e.g., "He was mad when he lost the game." —was he insane? This is often used in forecasting and oracular pronouncements, and wherever the speaker is uncertain, so that the authority figure seems to be correct whatever the outcome or objection: e.g., "Caesar, the senators will kill." Who is supposed to be killed in this simple example? This type of statement says nothing. A related fallacy occurs when this kind of double-talk is used with certain parts stressed to give the statement apparent meaning: e.g., "I like *lemonade and sugar*" (the sweetened mixture) vs. "I like *lemonade*, and sugar" (two separate things) —does the speaker like sugar in the lemonade, lemonade and sugar separately, or both? In print the two sentences are identical; only the vocal emphasis distinguishes them. This is a particularly insidious logical trap when the speaker quotes from another source and, by a change in vocal emphasis, distorts (intentionally or not) the original's true meaning. In any of the above vague cases, the premise is invalidated, since the meaning cannot be nailed down with certainty.

Evidence quoted out of context also falls into this category. The statement may be accurately quoted, but the meaning has been rendered false. E.g., A reviewer's satirical comment "The play had wonderful moments, both of which must have occurred backstage during intermission." might be partially quoted on a billboard to say "The play had wonderful moments" by an unscrupulous producer more interested in peddling his product than being accurate.

Domino Fallacy— Also known as the "slippery slope," this is a process of reasoning which objects to adopting an action or position by stipulating that it will invariably lead to accepting a less desirable action or position. The second action or position will, in its turn, lead inevitably to a third, etc.—until some ultimate, unavoidable abomination is reached. This is sometimes seen in arguments which oppose setting a precedent. This type of argument attempts to persuade the listener that one single "wrong" step will set off an unstoppable chain reaction. In reality, chain reactions can usually be stopped at any point, whenever the conditions allowing the reaction to continue are altered.

Misuse of Statistics— This is a vast, often technical subject beyond the scope of this article. However, two commonly encountered errors will illustrate the pitfalls here:

  1. The Gambler's Fallacy ("If at first you don't succeed ...") i.e., the chance of rolling any particular number (one, two, three, four, five or six) with a single die is one in six. The odds against hitting the chosen number do not become more favorable with each subsequent toss. No matter how many times you roll, the odds for each toss remain 1 in 6.
  2. Selective Emphasis e.g., in a survey, 3 out of 10 spouses said they had cheated on their mates; conclusion: the family is clearly under attack and in decline. The way the statistic is presented de-emphasizes that a large majority (7 out of 10) are remaining faithful, and may also ignore long-term historical trends which could show infidelity rates to be cyclical, declining, etc., or show that the family values being touted are historically atypical.
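The Gambler's Fallacy can be tested empirically with a short simulation. The sketch below (the function name and its parameters are illustrative assumptions) estimates the chance of rolling a chosen number immediately after a streak of misses; if the rolls are independent, the estimate stays near 1/6 no matter how long the preceding streak was.

```python
import random

def roll_after_streak(trials=100_000, streak=5, target=6, seed=42):
    """Estimate P(next roll == target) given that the previous
    `streak` rolls all missed the target. Independent rolls mean
    the estimate should stay near 1/6 regardless of the streak."""
    rng = random.Random(seed)
    hits = attempts = 0
    while attempts < trials:
        # Only count cases where a full streak of misses just occurred.
        if all(rng.randint(1, 6) != target for _ in range(streak)):
            attempts += 1
            hits += rng.randint(1, 6) == target
    return hits / attempts

print(round(roll_after_streak(), 3))  # close to 1/6 (about 0.167)
```

The die has no memory: conditioning on a run of bad luck does not move the next-roll probability away from 1/6.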

Group Dynamics

We'd all like to think of ourselves as being independent, however it is the rare individual who does not automatically submit to the consensus of the group and is not cowed by those who sport the trappings of authority. A classic experiment in social psychology, first conducted by Stanley Milgram in the 1960s, showed just how little it takes to make people abandon their values—some might say their humanity. In the experiment, the subjects were individually ushered into a room in which were a chair, a microphone, a loudspeaker and a machine having a series of dials and buttons. A man in a white coat entered the room and explained that the machine was connected by wires to a person in another room. The subject was given a contrived explanation that the experiment was to see if people's learning ability could be improved through negative stimuli, or some such mumbo-jumbo. The man in the white coat instructed the subject to read a series of questions to the person in the other room. Each time a wrong response was given, the subject was to press a button to administer an electrical shock to the person in the other room. For each subsequent wrong answer, the intensity of the shock was increased one increment, all the way to potentially lethal levels. An actor, pretending to be the person being tortured in the other room, supplied both wrong answers and screams of pain over the loudspeaker. The researchers were surprised to discover that a substantial majority (about 65 percent in the baseline condition) of subjects were willing to go all the way and administer shocks at potentially lethal levels, even though the only pressure they were under to perform was provided by the other man in the room (who was instructed only to answer any reservations that might be expressed with prompts such as "The experiment requires that you continue."). The subjects perceived the other man as an authority figure on just the basis of the white coat he wore, and rarely challenged him or expressed reservations.
Not only did this experiment provide disturbing insights as to how easy it is to rationalize atrocities, but it showed just how little it takes to get people to abandon their convictions and support the policy and rationale of a group or leader (qualified or not).

An interesting sidelight to the situation above was provided in a second experiment, in which subjects watched re-enactments of the previous experiment and were asked how they felt about people who reacted in different ways. Subjects tended to dislike most those who objected and reacted explosively, condemning both the authority figure and the torture "experiment" itself. Those who also refused to participate, but did so in a less aggressive, even apologetic, manner were seen as more likable. People tend to be impressed with dissenters who are perceived as being consistent, firm and independent.

From childhood, we are all subjected to peer pressure and authority figures. As we age, it is comforting to associate with others sharing some of the same experiences. Most will derive their self identity, at least in part, from the group or groups they join. This can be seen most clearly in the individuality teenagers sacrifice in order to adopt the precepts of gangs and high school cliques. In adults, the same process expresses itself in blind adherence to political party lines, cults and group prejudices. Usually, people will say they believe in truth, but actually settle for a group's consensus. To our shame, we all know that it doesn't matter if the whole world believes something—if it isn't true, it isn't true. Yet, most people find it impossibly difficult to stand alone in a position that challenges a group consensus. In adolescents, we call this "caving in to peer pressure," but it is a characteristic just as evident in adults. The desire to belong is strongly ingrained and inculcated, and like most desires, is exploitable.

As with emotion, group associations can be constructive. Streets can be paved, medicine shipped to flood victims, buildings raised. The trouble comes when we become dependent on the group: when we accept falsehood instead of truth because it is popular, when our perceptions of the world are shaped and narrowed by what others say; when the "facts" we learn are rationalized to fit group tenets; or when we act in a manner the group or its authority figure expects of us. If the point comes that who we are, how we think or how we live is based to any degree on the precepts or dictates of others, then it becomes very difficult to break free and independently establish what is true, logical or ethical.

It has been shown that people find it extraordinarily difficult to object or take a stand on their own. Many, however, will stand up for principles if there are at least one or two others whom they can count on to support the same position. In controlled group situations, that support is seldom, if ever, allowed to manifest itself. Instead, and even when there is some support, there is the constant implication that the dissident is rejecting the values held by the group, as expressed by the authority figure. When a perception of being outnumbered is present, the psychological pressure on the dissenter to submit is enormous.

The propensity to submit to the group is exploited by the authority figure. The authority figure (speaker, ruler, priest, etc.) is invested by the group with the right to speak collectively for its members. Generally speaking, objections on the part of individual members are stifled, or dispersed by the awesome influence and presumed opposition of the group the authority represents. Many people will never escape the trap of unquestioning obedience to authority figures and group values because these people's very identity and place in the scheme of things has been formed around the precepts held forth by the group.

Distortion of facts and logic in a group situation is a well-known mind-control technique. The group can consist of just the authority figure representing an unseen group and the compliant listener (although in this situation the subject of control usually must be under the physical control of the authority). In cult and political situations, the group can number hundreds or millions. What is important is that the group provides the pressure to conform, offering a comforting environment where acceptance of predigested formulas always takes precedence over rational thinking. Leaving this environment becomes something dreaded (whether from fear of losing friends, belief systems, self-worth, etc.). Submission follows. Persons long exposed to this kind of environment eventually censor their own thoughts and actions to conform to the little world they've entered. Authority figures may even exercise control and discipline over such persons by non-verbal means such as a raised eyebrow, a frown or the tone of voice. People thus "brainwashed" become like little children (at least in those areas of life where the group/authority chooses to establish its influence) ruled by authoritarian parents—only they don't grow or develop as real children do.

Be careful!

Questioning What Is Said

It is one thing to understand how people are misled, and quite another to put that knowledge to use in everyday situations in which you are exposed to manipulation. One way is by making it a habit to examine what you hear. Make a note of each point so you can test the validity of what you have heard.

Listener Defenses

If at all possible, the listener must remove him/herself from the influence of the group and the emotional pull of the speaker. These make it very hard to rationally examine what has been said.

It helps to be exposed to a variety of views. Repeated exposure to the same faulty logic and false claims only compounds the influence these will have on you. Even if your object is to attend a monologue delivered by an amusing comedian, a steady diet of this one viewpoint will have an influence on your perceptions. You lose some of your objectivity.

It is sometimes helpful if the speaker furnishes a transcript ahead of time, on which digressions and emphasis can be noted by the listener, and which can be dispassionately examined later. More problematic is the taking of notes, as these are often incomplete and tend to overemphasize the conclusions of the speaker. In a similar vein, taped speeches often preserve the emotional and oratorical devices employed; a written transcript of taped lectures will be far easier to sort out. Note that having a written copy does not guarantee protection from unsound reasoning, prejudice or emotional influences, but it does allow you the time to examine and re-examine. Some types of propaganda have been shown to be more effective when presented in a written rather than oral format. However, any of the above is better than relying on what the listener thinks was said.

Another listening defense has to do with the attitude you take going into lectures or even discussions. Most people go into a lecture situation (class, sermon, speech, etc.) predisposed to agree with, submit to, and/or learn from the point of view to be expressed—why else would they bother attending? Rather than the natural position (expecting to agree), try taking and maintaining an adversarial position while listening. This will help you separate yourself from audience influence, at least somewhat. If you take stereotypes and diatribes as being directed at you personally, you'll be better able to judge the speaker's fairness. If, when prompted to agree, you disagree and construct possible objections, you'll be in a position to partially reconstruct the missing other side of the argument.

And there is nothing wrong with standing up and asking a question or challenging falsehood. Many cultures do not share our passion for appearing to be polite and uncritically tolerant. Do you actually believe that it is polite to mutely allow lies to go unchallenged, to let others be led astray by unsound reasoning, to permit insults and prejudice to pass without objection, or to sit by quietly and thus lend your tacit agreement and support to the promulgation of error? This may require more courage than most of us possess, as we are conditioned to submit to authority figures. But we can flee such situations; at the very least, we can walk out. There is no law, no rule of conduct, that requires us to allow ourselves to be utilized and manipulated by others.

We also too readily attribute a "good" motive to the speaker. However, we must bear in mind that, even if we have judged the speaker rightly, he/she may still be wrong and quite sincerely believe the delusions he/she is trying to inculcate into the minds of the audience. Sincerity is never a measure of the truth of any presentation or its content.

If the subject is going to influence your life at all, it is also vital that you do your own research. The man or woman on the dais or behind the podium is not a god or some divinely inspired being with knowledge unavailable to anyone else. If such material were the province only of speakers, they wouldn't have any business trying to disclose it to the rest of us. Their theories are supposed to be backed up by real evidence. Check out the facts for yourself. Examine the logic they've used in constructing their position. One false support (or lack of valid support) taints the speaker's entire conclusion. A tainted position is not necessarily wrong, but must be set aside until it can be validly proved again from the ground up. To accept statements on the basis of someone's word—because you like and trust the speaker, because you don't want to exercise your brain, because the statement appeals to you, because it is more comfortable to believe than to question—is to prefer ignorance and delusion to knowledge and truth.

Ideally, all public speaking would be conducted as open discussions rather than monologues. An atmosphere in which questioning and debate are encouraged is far more informative and less prone to abuse.

Without hard facts, you can only test the logic of statements you hear. And, however sound you find the logic, you must also determine whether the facts used are true in order to accept the statement as valid. One does not stand without the other. But what if you, as many do, insist that it isn't so important that everything add up, and that the general drift and how you feel about what was said will do for proof? In that case, your motive seems seriously compromised, and you should determine whether truth is your real goal, or whether you are actually searching for something else.

We are targets in other ways. Public relations firms and professional opinion managers regularly fine-tune messages and images through psychological testing to produce the effect they wish. The ads we read, the politicians we hear, the causes we are enjoined to support base their appeals on the fact that most of the intended audience will react in predictable ways. They know that most would rather react than think. They know that even the knowledgeable are more predisposed towards a blind acceptance of pre-packaged emotional presentations, myths and wishful thinking, than to analyzing a sober presentation of fact. They know that, even if people realize they are being manipulated (especially in a group situation), chances are very small that they will take the time to defend themselves, much less protest. They know that people prefer to believe what is comfortable and attractive—things that do not require self-examination and which affirm their own and their peers' preconceptions—no matter what the message may lack in soundness of reasoning or truth.

Finally, we must continually re-examine our own positions and reasoning. A lot of people boast about being "open minded." What they usually mean is that they'll allow others to stick to an opposing opinion, while not giving up or changing their own conclusion. This kind of attitude may be tolerant, but it falls far short of seeking the truth. We should, instead, be always "open minded" to facts and new perspectives. Just because we've grown up with, or based our thinking on certain concepts is no reason to ignore challenges to any premise we hold dear. We should be anxious to know if we're wrong and, if so, why we are wrong—whether or not it means we have to rethink our whole outlook. This may be extremely difficult for some to do—it is always easier, more comfortable and pleasant to go about in the same old habits, ignore reality, conform to the peer group and live in a fool's paradise. Error is everywhere, it's easy to believe, it's seductive, and it will become part of our very thinking process if we don't resist it.


