Rhetoric is ‘the art of persuading through means other than reasoning’ (Chatfield, 2018, p. 154). There is a misconception that rhetoric is malicious and manipulative: it can be used in this way if one chooses, but ultimately, rhetoric is a little more complicated than this. Persuasion in and of itself is not a bad thing, nor is it something we can switch on and off, as it is central to human communication.
Ethos, Logos and Pathos
The Greek philosopher Aristotle described successful acts of persuasion as comprising three components: ethos, logos and pathos. Ethos establishes the credibility of the author or source; logos is the logical appeal of the ideas being conveyed; and pathos is the emotional appeal of the message and the way it is delivered (Chatfield, 2018). A fourth principle in the act of persuasion is kairos, which refers to the opportune moment for a persuasive message to be delivered so that it has the biggest impact. This technique is used in persuasive industries such as advertising.
Fix it: Impartiality
- Avoid highly emotive language
- Be clear about relevant facts
- Show your awareness of the differing beliefs about the significance of the facts
- Evaluate how reasonable a belief is
(Chatfield, 2018, p. 162)
A rhetorical device is a technique used to make a message more persuasive and enhance its appeal (Chatfield, 2018). You need to be aware of rhetorical devices and the effects they can have on a reader or listener. Here are some of the common techniques, with examples:
- Rhetorical questions – a question that is asked to emphasise a point rather than in anticipation of an answer.
- Example: Do you really need me to convince you why stealing is wrong?
- Jargon – words that are only familiar to experts but used in a context to impress or exclude
- Example: Each stakeholder should be given due diligence before an action plan can commence, and a review undertaken to ascertain whether sweat equity is preferred over financial recompense.
- Buzzwords – words that are popular and used to create an appeal of insight, experience and expertise, but ultimately, lack real substance
- Example: The team met yesterday for a think outside the box session.
- Smokescreens – concealing an idea or a key point beneath a large string of irrelevant words
- Example: Have I ever taken drugs? I will refer you to my long, honourable career in public service and to the considerable sacrifices my wife and I have made in service to the greater good of the people.
- Euphemisms – replacing negative words with more neutral terms in order to play down the severity or impact of something
- Example: Your father was a bit tipsy at the charity event and is feeling a little delicate this morning.
- Hyperbole – deliberate use of exaggeration to convince or appeal
- Example: You can’t listen to the Greens about global warming; everything they say is completely untrue and they can’t be trusted.
- Litotes – deliberate use of understatement to convince or appeal
- Example: He’s not the smartest person I have ever met.
- Paralepsis – deliberate introduction of an idea whilst claiming that you do not wish to discuss it, so the idea is raised but responsibility for raising it is disclaimed. Politicians are excellent at using this, as displayed here by President Trump:
- Example: I refuse to call Megyn Kelly a bimbo, because that would not be politically correct. (Example taken from Romm, 2016).
(Chatfield, 2018, pp. 164–167)
Bias & Heuristic devices
Being aware of bias
Our own intuition can be an excellent tool in life; but when thinking critically, we need to know when our intuition can’t be trusted. As we have seen, there are a number of times when our thinking can become biased, or when our expectations differ from reality. Alexander Pope said that ‘to err is human’; therefore, in order to become competent critical thinkers, we should be wary of erroneous reasoning so that we can build strong arguments and make sense of the world. These biases are examined in depth in Chatfield (2018) in terms of three categories:
- The attachment of undue significance to random events or coincidence
- The disregard of events that have not happened
- Believing that things are simpler and more predictable than they really are
Further reading: Chapter 10: Overcoming Bias in Yourself and Others in Chatfield, T. (2018). Critical thinking. London, UK: Sage.
A heuristic device is ‘a cognitive short cut or ‘rule of thumb’, allowing for quick decision-making and judgment’ (Chatfield, 2018, p. 199). These devices are essential for everyday thinking as they allow us to make practical decisions in time-constrained scenarios where success does not need to be guaranteed, but we can have a high level of confidence in a particular outcome. However, it is important to note that heuristic devices are useful, but when used in a situation that requires deeper critical thinking, they can produce faults and flaws in judgments, known as cognitive biases.
There are four major heuristic devices which are extremely effective in most circumstances.
The affect heuristic – using the strength of positive or negative emotional reactions as a decision-making rule. For instance, if you are in a good mood, you might be more amenable to trying new things, such as skydiving; whereas if you are in a bad mood, you might be more reluctant to try new things, meaning you might change your mind about the skydiving after a bad day!
The availability heuristic – using how easily something springs to mind in order to influence a decision or to assess options. For instance, people are likely to overestimate the likelihood of death through shark attacks because of the amount of media coverage these events receive, when in fact you are more likely to be killed by falling coconuts!
The anchoring heuristic – using a starting value or frame of reference to influence your subsequent judgments, even when they are unrelated. For instance, if you were haggling the price of something with a seller at a market, you would both make offers relative to the first offer. This is a technique used in the retail and food industry.
The representativeness heuristic – using the plausibility of a story to make a decision, rather than the probability or underlying facts. For instance, consider the following: I am a young woman from the UK in her thirties who enjoys shopping, glamour, make-up and drinking cocktails. Is it more likely that I work in: a. health and social care; b. beauty therapy and hairdressing or c. finance and commerce?
If you chose option b, then you are using the description given to make a decision based on how closely it conforms to a stereotype, rather than assessing the facts. This form of mental shortcut is the representativeness heuristic. If you were to consider this logically, you might ask how many people in the UK work in each sector, and then make a decision based on this information.
What use are heuristics?
Heuristics are a useful tool for decision-making, and there are many scenarios in which they serve human thinking well, for example:
- Interaction on a local, human scale
- Clear choices based on reliable information
- Decision-making in particular areas of expertise
(Chatfield, 2018, p. 207)
As human life has evolved, the number of situations where heuristics are not applicable has grown; online interaction is a prime example. When engaging online, people are largely unknown to us and at a considerable distance, information is often inadequate or overwhelming for decision-making, and we are rarely experts in the matters at hand, through the very nature of our evolution into an online world (Chatfield, 2018).
Cognitive bias does not refer to the typical notion of bias; for example, favouring a candidate at interview because you like their dress rather than whether they can do the job. Cognitive bias is “a systematic error in thinking – part of our brain’s hard-wiring – that causes us to act repeatedly in an irrational way. Most people are unaware of these subconscious biases but often we’re all making the same irrational mistakes because of them” (Conceptually, 2019). There are many types of cognitive bias.
Below are a few cognitive biases that you might come across in the work of others or in your own thinking about academic assignments, and how you can take steps to avoid these in your written arguments:
The framing effect – presenting the same phenomena from specific angles in order to influence understanding through context and delivery. This cognitive bias can affect judgment and alter preference. This is used a lot in advertising; for example, consider which of the following is more appealing:
Option A: New spreadable cheese – with 10% fat
Option B: New spreadable cheese – 90% fat-free
You can see that the same product is being described, just in two different ways. It is likely that Option B was more appealing due to the way it has been framed: ‘90% fat-free’ foregrounds a large, positive number, so it appeals to us on an emotional level, despite describing exactly the same product as Option A. We have seen in heuristics that our judgments can be affected by emotional appeals, and the framing effect exploits this by making something seem more appealing simply through altering the angle of presentation. In an academic context, it is necessary to look out for the framing effect in political, media and marketing rhetoric in order to make sound judgments on arguments and the truth.
Confirmation bias – This is a tendency to pay attention to the things that confirm our pre-existing ideas or beliefs, and in turn, ignore or dismiss the ideas or beliefs that don’t fit our own. This can be a common problem when putting together an academic argument, as we are likely to begin with our opinion, and only find evidence that supports this belief. You can avoid confirmation bias in your arguments by seeking alternative views to your own and evaluating them objectively in your assignments, and through the identification of flaws in your own argument too. This creates a balanced, objective argument as you will demonstrate to your reader that you have considered and analysed other perspectives to make a stronger case for your own.
The clustering illusion – This is similar to confirmation bias, as it is a tendency to see a pattern where none exists, especially after an event, whilst ignoring whatever doesn’t fit (Chatfield, 2018). This might be through the cherry-picking of data to fit a conclusion whilst ignoring evidence to the contrary; something that is easily done in research when our results do not support our conclusions. You can avoid this by presenting a true picture of results and conducting a meaningful investigation as to why the results are not consistent with any predictions; this level of critical thinking lends itself more to in-depth analysis and is likely to get you more marks. Remember, misrepresenting data is a form of academic misconduct!
The Dunning-Kruger effect – In essence, the more knowledgeable you are, the less confident you are likely to be, and vice versa. You might come across this in your reading, whereby an expert seems unsure or cautious about committing to an argument: but you should be mindful that this isn’t necessarily an admission of a lack of knowledge, but rather a realistic self-assessment of the expert’s abilities. In short, “it takes some knowledge to realize how much you do not know” (Chatfield, 2018, p. 213).
The curse of knowledge – This is the tendency to believe that something is understood by everyone, when we understand it ourselves. In your work it is important to remember that some concepts might not be understood by your reader, so you might need to provide some background information to contextualise your argument, so your reader understands your motivations.
Flawed arguments & logic
One of the best ways to critique the work of an author is to identify any flaws in their argument or logic (logical fallacies). Even the most seasoned academics make mistakes or display faulty reasoning. The world of academia is open to challenge and alternatives, so don’t be afraid to offer them. Being able to identify flaws in arguments can help build your evaluations of evidence and the works of others, and also ensure that you avoid those flaws when building your own arguments.
Assuming a causal link
It is very easy to reason that when one event occurs alongside another, there is an immediate cause and effect relationship; but this is not always the case! For instance, whenever I make lasagna, my eyes water. Therefore, the lasagna is causing my eyes to water, right? Probably not! Let’s consider this logically. It is more likely that one of the ingredients is affecting my eyes: whenever I make a lasagna, I chop onions, and it is the onions causing my eyes to water, not the lasagna! This may seem like an obvious error to avoid, but it’s easy to jump to a conclusion when events coincide.
Assuming a correlation
Similarly, when two trends occur together, it is easy to assume that the two trends are causally linked; but again, sometimes it is simply coincidence! For example, the per capita consumption of mozzarella cheese correlates closely with the number of civil engineering doctorates awarded in the US (Vigen, 2019). Does this mean that if I scoff mozzarella by the pound I will become a civil engineer? No: there is no logical relationship between the cause and effect, there is no third cause, and they are not directly linked (Cottrell, 2013).
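To see how easily two unrelated, trending quantities can correlate, here is a minimal Python sketch. The numbers below are invented purely for illustration (they are not Vigen’s actual data):

```python
# Two invented, unrelated series that both happen to rise over ten years
cheese_kg_per_capita = [4.0, 4.2, 4.5, 4.7, 5.0, 5.2, 5.5, 5.7, 6.0, 6.2]
doctorates_awarded = [480, 501, 540, 552, 601, 648, 662, 705, 712, 780]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# The coefficient comes out close to 1, despite there being
# no causal or logical link between the two series
print(round(pearson(cheese_kg_per_capita, doctorates_awarded), 2))
```

Any two quantities that both grow steadily over the same period will score close to 1, which is exactly why a strong correlation on its own can never establish causation.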
An analogy is a comparison made between two things to note their similarities (Cottrell, 2013). The comparison of different items is a technique often used in literature to add suspense, shock or drama; but in scientific disciplines, an analogy needs to be true so that the similarities aid our understanding of both concepts. For instance, “coffee is like fuel, as it starts up the workforce in the morning” is a valid comparison, as caffeine is a stimulant, not unlike fuel for a car. However, “people who drink coffee are addicts like alcoholics” is a false analogy, as the comparison between caffeine addiction and alcohol addiction isn’t valid, and this statement assumes that all coffee drinkers have a caffeine dependency.
A tautological argument, also referred to as circular reasoning, is an argument that repeats the same points in a different order or wording. An argument should always move forward, but in a tautological argument the conclusion supports the premise and the premise supports the conclusion, making it a closed argument. For example, we often use tautological arguments with our children:
Father: It’s time for bed.
Child: Why?
Father: Because I said so.
This can be easily done in academic writing, so make sure you check your work to ensure that every sentence brings your argument forward.
Further reading: Chapter 7: Does it add up? In Cottrell, S. (2013). Critical thinking skills: Effective analysis, argument and reflection. (3rd ed.). London, UK: Palgrave.
Sometimes the reasoning behind arguments can be flawed or logically incorrect; these flaws are known as logical fallacies, and are used in the media, politics and everyday life to force conclusions that are not well-reasoned or logical, yet are used because they are convincing and effective! A good critical thinker can spot a fallacious argument, as the conclusion is not logically generated by, or linked to, the premises. Knowing the different logical fallacies can also help you identify and challenge them in the media and politics, and ensure that you are not enticed by the conviction of an argument, but rather by its logic. There are many logical fallacies, which can be divided into two major categories:
- Informal Fallacies –
- Fallacies of relevance – these arguments offer reasons to believe or do something, but in fact turn out not to be reasons at all
- Fallacies of unacceptable premises – these arguments attempt to introduce premises that may be relevant, but do not support the conclusion
- Formal fallacies – these arguments have the wrong form or structure, so will be invalid no matter what
(University of Auckland, 2019)
A lot of logical fallacies have Latin names; don’t be put off by this! It’s not important that you remember the name of the fallacy, but more importantly, that you can identify it in your own work and the work of others as unreasonable and illogical. Here are a few of the informal fallacies you are likely to encounter in your research:
Tu Quoque or ‘Who are you to talk?’ – This type of fallacy counters an argument by attacking the person presenting the argument for not practising what they preach. For example:
MP: I plan to tackle the obesity problem that has been caused by fast food.
Constituent: Hypocrite! You were caught last week coming out of McDonalds!
Here, the argument made by the constituent is irrelevant to the proposition. The MP is outlining an election campaign in the interest of their constituents, not in terms of their own physical health.
Red Herrings or ‘Appealing to…’ – This type of fallacy sidetracks an argument by relying on premises that aren’t relevant, or aren’t appropriately relevant, to the conclusion. These types of fallacious argument appeal to different things, such as authority, force, sympathy, popularity, nature or tradition; even the unbelievable and the ignorant (Chatfield, 2018), in order to divert attention from the true argument. Here are a few examples:
- This lawnmower is top of the range and used by Brad Pitt (Appeal to authority – what does Brad Pitt have to do with lawnmower quality?)
- The Anti-Vaccination movement has over 480 websites, so they must be right (Appeal to popularity by suggesting that the argument is stronger by how many people believe it).
- The earth is flat and if you don’t agree, then we can’t be friends anymore (Appeal to force by suggesting there will be forceful consequences for disbelief, so the conclusion is imposed by the person making the argument, despite it having nothing at all to do with the premise!)
- The Big Bang was theorised in the 1920s, whereas Christianity has been around for millennia, so the religion must be true (Appeal to tradition by suggesting that the longer something has existed makes it more believable).
Red herring fallacies are used a lot in political rhetoric, especially spin campaigns, where questions about a candidate’s actions, position or policy are deflected with some sort of appeal, usually an attack on the opposition. It is important that you can spot this, especially when navigating potential fake news, to be able to sift legitimate evidence from fabrication.
Strawman – The strawman fallacy misrepresents an argument in order to weaken it, and then attacks the weakened version of the argument. For example:
MP 1: We need to do something about the CO2 emissions from vehicles; perhaps we should amend fuel efficiency standards for the next 20 years in order to reduce emissions.
MP (Opposition): That’s ridiculous; she is proposing that we all get rid of our cars, so we can’t get to work. It would kill the economy! (Example adapted from University of Auckland, 2019)
The opposition MP is twisting the words of MP1, who hasn’t mentioned anything about getting rid of cars, and in fact, hasn’t even said the word ‘cars’. The opposition MP is weakening the argument of MP 1 and attacking their version of it, to make MP 1’s proposal unreasonable. This tactic is used time and time again in parliamentary debates!
Ad hominem or ‘At the person’ – This type of fallacy rejects an argument by attacking the person making the argument, rather than focusing on the argument itself. For example:
Protestor: Doctor Lisa Webber’s research tells us that vaccinations are safe for our children; but she would say that! Her research is being funded by a big pharmaceutical company. And she doesn’t have children!
Here, the protestor is suggesting that the doctor is untrustworthy because she isn’t a mother, and because her research circumstances (her funding from a pharmaceutical company) are unseemly. Although this may be the case, these aren’t premises for a logical argument, as the protestor is basing their rebuttal on the person making the argument, not on the content of the initial argument (which is in the doctor’s research). This can be a tricky fallacy to counter in assignments, as we might be tempted simply to accept what is being said by an academic because of their status; but ultimately, we need to be more objective than this and make decisions based on the arguments and evidence presented. Chatfield (2018) recommends stripping these arguments right down to their content alone (so remove names, dates, organisations etc.) and then deciding whether you agree or disagree. This will allow you to make a coherent counter-argument based on logic and reason.
False dilemma – This fallacy reduces a situation to two choices, creating a very ‘black and white’ picture of events, without demonstrating other possibilities. For example:
Pastor: Mr Archer had been in a coma for three years when he finally woke up. That’s either medicine or a miracle! I don’t know about you, but I haven’t seen a cure for comas!
As you can see, this argument gives only two reasons to explain why the coma patient was able to recover after three years, without considering the possibility of other explanations. This is used to coerce the receiver into ‘picking a side’. However, in an argument (especially when constrained by a word count) it isn’t always possible to pick a side that represents your view entirely; nor is it practical to cover every single possibility! You can, however, indicate the main options out of several possibilities: it all depends on the language you use to frame them. Reducing everything to two options contradicts our nature as curious human beings, so false dilemmas are used to invoke a sense of urgency through oversimplification of an issue. Academia strives to break down this ‘either/or’ view of the world, as it is very unlikely that only two simple things contribute to or affect a phenomenon. In fact, most disciplines view the impact of events on a spectrum (‘to what extent’ essay questions) or consider a series of factors that could potentially have an impact. It is worth remembering this when devising a research question for your dissertation.
Formal fallacies are named as such to reflect the fact that they are failures in deduction: the form of the argument is invalid, so the premises cannot guarantee the truth of the conclusion (Chatfield, 2018). To demonstrate these fallacies, it’s necessary to present the arguments in standard form.
Affirming the consequent – This fallacy assumes that because the consequent (the latter part of a conditional) is true, the antecedent (the former part) must also be true; the presence of the latter is taken as sufficient to conclude the former (Chatfield, 2018). For example:
If you want to go out with me, then you’ll reply to my email.
Premise 1: If A, then B.
Premise 2: B
Conclusion: Therefore, A
You replied to my email, so you must want to go out with me.
This is an invalid argument, as the premises provided do not give us sufficient information to deduce that the conclusion is correct. If we were to pose Premise 1 as ‘if and only if you want to go out with me, then you’ll reply to my email’, this would allow us to draw a more definitive conclusion: the presence of B would then affirm the truth of A.
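One way to see why this form fails is to check it mechanically: an argument form is valid only if no assignment of truth values makes all the premises true and the conclusion false. A small Python sketch of this check (our own illustration, not from Chatfield):

```python
from itertools import product

def implies(a, b):
    # Material conditional: 'if a then b' is false only when a is true and b is false
    return (not a) or b

def is_valid(premises, conclusion):
    """Valid iff no truth assignment makes every premise true and the conclusion false."""
    for a, b in product([False, True], repeat=2):
        if all(p(a, b) for p in premises) and not conclusion(a, b):
            return False  # counterexample found: the form is invalid
    return True

# Affirming the consequent: If A then B; B; therefore A
print(is_valid([lambda a, b: implies(a, b), lambda a, b: b],
               lambda a, b: a))   # False: invalid (A=False, B=True is a counterexample)

# Modus ponens for comparison: If A then B; A; therefore B
print(is_valid([lambda a, b: implies(a, b), lambda a, b: a],
               lambda a, b: b))   # True: valid
```

The counterexample the checker finds is exactly the email scenario: you replied (B is true) without wanting to go out with me (A is false), and both premises still hold.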
Denying the antecedent – This fallacy assumes that when one thing follows on from another, the absence of the former confirms the absence of the latter. For example:
If someone sneezes on the bus, then I will catch a cold.
Premise 1: If A, then B
Premise 2: Not A
Conclusion: Therefore, not B
No one has sneezed on the bus, so I won’t catch a cold.
Again, this argument is invalid, as we are not provided with enough information within the premises to draw an accurate conclusion. We know that there are more ways of catching a cold than someone sneezing on a bus, and that a sneeze is no guarantee of a cold. We can use the ‘if and only if’ principle here to make this argument logical.
Base rate neglect – This fallacy ignores the underlying frequency of an event (the statistical base rate) and deduces a conclusion about the likelihood of something occurring from irrelevant or unrepresentative elements. For example:
Most authors are millionaires.
Premise 1: Most As are Cs
Premise 2: Few Bs are Cs
Premise 3: X is a C
Conclusion: Probably, X is also an A
You’re an author, so you are probably a millionaire.
Most authors are not millionaires, only a handful of exceptionally successful authors are. This is the problem with base-rate neglect: the exceptions are being used as representative of an entire population, rather than considering statistical evidence or investigating the relative numbers. This type of fallacy is in play when stereotyping of minority groups happens, as true probabilities are ignored.
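The point about base rates can be made concrete with a quick calculation. The figures below are entirely hypothetical, chosen only to illustrate the arithmetic of conditioning on the base rate rather than on the visible exceptions:

```python
# Hypothetical figures, for illustration only
authors = 100_000            # base rate: working authors in a population
millionaire_authors = 2_000  # the rare, highly visible exceptions

# P(millionaire | author): divide by the base rate of authors,
# rather than generalising from a handful of famous names
p = millionaire_authors / authors
print(f"P(millionaire | author) = {p:.0%}")  # prints "P(millionaire | author) = 2%"
```

However famous the exceptions are, the probability that a randomly chosen author is a millionaire is governed by the relative numbers, which is exactly the information this fallacy throws away.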
Overcome the fallacies
Be charitable – as critical thinkers, we have the potential to go around dismissing every argument we hear. However, the principle of charity is followed by all good critical thinkers. Being charitable means assuming that the argument of another is truthful, so that we can avoid prejudice, understand their argument in its strongest possible form, and consider the strengths of their arguments as well as the weaknesses, in order to make our rebuttal even stronger (Chatfield, 2018).
Use the standard form – some fallacies are trickier to spot than others. Using the standard form allows you to view arguments in a logical form, so you can spot fallacious arguments clearly and counter any false premises.
Substitute examples – it can be useful to substitute the examples in arguments with more extreme examples to decide whether the argument is fallacious.
Hedge – academia is not built on absolutes or proof, but on evidence that suggests a certain way of thinking or a phenomenon. Use hedged language to indicate the likelihood of something being the case, without committing to it as cold, hard proof.
Useful Sources for Fallacies, Bias & Rhetoric
Thou shalt not commit logical fallacies – website dedicated to the different logical fallacies with simple definitions and examples
Cognitive biases stuffing up your thinking – website dedicated to cognitive biases & heuristics
Super thinking by G. Weinberg & L. McCann – in the library, problem-solving, critical and logical thinking
Six Thinking Hats by E. De Bono – in the library, problem-solving and creative thinking.
Rhetoric is not just rhetorical – great video about the power of rhetoric
Catalog of Biases – great website for all types of bias, with definitions and examples