Fallacies and biases

Decision making and behavioral bias

Bias blind spot
the tendency not to compensate for one’s own cognitive biases, or the tendency to believe that one is less affected by cognitive biases than other people.
   
Choice-supportive bias
the tendency to remember one’s choices as better than they actually were.
   
Confirmation bias
You favor things that confirm your existing beliefs.
We are primed to see and agree with ideas that fit our preconceptions, and to ignore and dismiss information that conflicts with them.
People generally prefer to spend more time looking at information which supports their political stance, while neglecting information that contradicts it.
Think of your ideas and beliefs as software you’re actively trying to find problems with rather than things to be defended. “The first principle is that you must not fool yourself – and you are the easiest person to fool.” – Richard Feynman
Congruence bias
the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
   
Contrast effect
the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
   
Déformation professionnelle
the tendency to look at things according to the conventions of one’s own profession, forgetting any broader point of view.
   
Endowment effect
the fact that people often demand much more to give up an object than they would be willing to pay to acquire it.
   
Exposure-suspicion bias
knowledge of a subject’s disease in a medical study may influence the search for causes.
   
Extreme aversion
most people will go to great lengths to avoid extremes. People are more likely to choose an option if it is the intermediate choice.
   
Focusing effect
a prediction bias that occurs when people place too much importance on one aspect of an event, causing errors in accurately predicting the utility of a future outcome.
   
Framing effect
You allow yourself to be unduly influenced by context and delivery
We all like to think that we think independently, but the truth is that all of us are, in fact, influenced by delivery, framing and subtle cues. This is why the ad industry is a thing, despite almost everyone believing they’re not affected by advertising messages.
The phrasing of how a question is posed, such as for a proposed law being voted on, has been shown to have a significant effect on the outcome.
Only when we have the intellectual humility to accept the fact that we can be manipulated, can we hope to limit how much we are. Try to be mindful of how things are being put to you.
Groupthink
You let the social dynamics of a group situation override the best outcomes.
Dissent can be uncomfortable and dangerous to one’s social standing, and so often the most confident or first voice will determine group decisions.
Groupthink is sometimes said to occur (more broadly) within natural groups in a community, for example to explain the lifelong differing mindsets of those with opposing political views (such as “conservatism” and “liberalism” in the U.S. political context[5]), or the purported benefits of teamwork vs. work conducted in solitude.
Rather than openly contradicting others, seek to facilitate objective means of evaluation and critical thinking practices as a group activity.
Hyperbolic discounting
the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are.
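To see why this produces preference reversals, here is a minimal Python sketch of the standard hyperbolic form V = A / (1 + kD), with an invented discount constant k. The same $50-now-vs-$100-later choice flips once both payoffs are pushed into the future, something exponential discounting never does:

```python
# Minimal sketch of hyperbolic discounting (k = 1.0 is an invented constant).
# V = A / (1 + k * delay): present value of amount A received `delay` periods out.

def hyperbolic(amount, delay, k=1.0):
    return amount / (1 + k * delay)

# $50 sooner vs $100 two periods later, judged now and judged 10 periods out.
for shift in (0, 10):
    small = hyperbolic(50, shift)
    large = hyperbolic(100, shift + 2)
    pick = "smaller/sooner" if small > large else "larger/later"
    print(f"delays {shift} vs {shift + 2}: ${small:.2f} vs ${large:.2f} -> {pick}")

# delays 0 vs 2:   $50.00 vs $33.33 -> smaller/sooner
# delays 10 vs 12:  $4.55 vs  $7.69 -> larger/later (preference reversal)
# Exponential discounting (A * d**delay) keeps the ratio fixed at 2 * d**2
# for any shift, so the preference would never flip.
```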
   
Illusion of control
the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
   
Impact bias
the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
   
Information bias
In epidemiology, Information bias refers to bias arising from measurement error.[1] Information bias is also referred to as observational bias and misclassification.
   
Irrational escalation
the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
   
Loss aversion
the disutility of giving up an object is greater than the utility associated with acquiring it.[note 2] (see also sunk cost effects and Endowment effect).
   
Mere exposure effect
the tendency for people to express undue liking for things merely because they are familiar with them.
   
Neglect of probability
the tendency to completely disregard probability when making a decision under uncertainty.
   
Obsequiousness bias
the tendency to systematically alter responses in the direction one perceives to be desired by the investigator.
   
Omission bias
the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
   
Outcome bias
the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
   
Post-purchase rationalization
the tendency to persuade oneself through rational argument that a purchase was a good value.
   
Pseudocertainty effect
the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
   
Reactance
You’d rather do the opposite of what someone is trying to make you do.
When we feel our liberty is being constrained, our inclination is to resist, however in doing so we can over-compensate.
An example of such behavior can be observed when an individual engages in a prohibited activity in order to deliberately taunt the authority who prohibits it, regardless of the utility or disutility that the activity confers.
Be careful not to lose objectivity when someone is being coercive/manipulative, or trying to force you to do something. Wisdom springs from reflection, folly from reaction.
Selective perception
the tendency for expectations to affect perception.
   
Status quo bias
the tendency for people to like things to stay relatively the same (see also Endowment effect and Loss aversion).
   
Sunk cost fallacy
You irrationally cling to things that have already cost you something.
When we’ve invested our time, money, or emotion into something, it hurts us to let it go. This aversion to pain can distort our better judgment and cause us to make unwise investments.
For instance, if you’ve spent money on a meal but you only feel like eating half of it, it’s irrational to continue to stuff your face just because ‘you’ve already paid for it’; especially considering the fact that you’re wasting actual time doing so.
To regain objectivity, ask yourself: had I not already invested something, would I still do so now? What would I counsel a friend to do if they were in the same situation?
Survivorship (graveyard) bias
a form of selection bias focusing on what has survived to the present and ignoring what must have been lost.
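A short simulation sketch (all parameters invented) makes the mechanism concrete: if losing funds get closed and we then average only the survivors, the survivors look far better than the full cohort actually performed.

```python
import random

random.seed(0)

# Invented setup: 1,000 funds, 10 years of noisy returns (mean 0%, sd 20%);
# any fund that falls below half its starting value is shut down.
funds = []
for _ in range(1000):
    value, alive = 1.0, True
    for _ in range(10):
        value *= 1 + random.gauss(0.0, 0.2)
        if value < 0.5:          # fund closed; it drops out of later averages
            alive = False
            break
    funds.append((value, alive))

all_mean = sum(v for v, _ in funds) / len(funds)
survivors = [v for v, alive in funds if alive]
surv_mean = sum(survivors) / len(survivors)
print(f"mean final value, all funds:  {all_mean:.2f}")
print(f"mean final value, survivors:  {surv_mean:.2f} ({len(survivors)} of 1000)")
# The survivors' average is inflated: the losers are no longer around to count.
```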
   
Unacceptability bias
questions that may embarrass or invade privacy are refused or evaded.
   
Unit bias
the tendency to want to finish a given unit of a task or an item, with strong effects on the consumption of food in particular.
   
Von Restorff effect
the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
   
Zero-risk bias
the preference for reducing a small risk to zero over a greater reduction in a larger risk. It is relevant, for example, to the allocation of public health resources and the debate about nuclear power.
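A quick expected-value sketch (the population size and risk figures are invented) shows why the zero-risk option can be the worse buy:

```python
# Invented figures: option A eliminates a 1% risk entirely; option B halves
# a 20% risk. Both are applied to the same population of 10,000 people.
population = 10_000

saved_A = (0.01 - 0.00) * population   # 1% -> 0%: risk reduced to zero
saved_B = (0.20 - 0.10) * population   # 20% -> 10%: bigger absolute reduction

print(f"Option A (zero-risk framing): {saved_A:.0f} expected people spared")
print(f"Option B:                     {saved_B:.0f} expected people spared")
# B spares ten times as many people in expectation, yet the "down to zero"
# framing of A is what many people find more attractive.
```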

Memory error

Beneffectance
perceiving oneself as responsible for desirable outcomes but not responsible for undesirable ones. (Term coined by Greenwald (1980))
   
Confabulation or false memory
Remembering something that never actually happened.
   
Consistency bias
incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.
   
Cryptomnesia
a form of misattribution where a memory is mistaken for imagination.
   
Declinism
You remember the past as better than it was, and expect the future to be worse than it will likely be.
Despite living in the most peaceful and prosperous time in history, many people believe things are getting worse. The 24 hour news cycle, with its reporting of overtly negative and violent events, may account for some of this effect.
We can also look to the generally optimistic view of the future in the early 20th century as being shifted to a dystopian and apocalyptic expectation after the world wars, and during the cold war. The greatest tragedy of this bias may be that our collective expectation of decline may contribute to a real-world self-fulfilling prophecy.
Instead of relying on nostalgic impressions of how great things used to be, use measurable metrics such as life expectancy, levels of crime and violence, and prosperity statistics.
Egocentric bias
recalling the past in a self-serving manner, e.g. remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
   
Hindsight bias
sometimes called the “I-knew-it-all-along” effect: the inclination to see past events as being predictable, based on knowledge of later events.
   
Selective Memory and selective reporting
Selective reporting refers to rare events which are widely reported, thereby altering the perception of how common they actually are.
This is common amongst practitioners of bullshit, who (along with their customers) selectively report their successes but make no mention of their numerous failures. It contrasts with cherry picking in that selective reporting is often unintentional and concerns the reporting and memory of events, while cherry picking is more specific to selecting favourable evidence and actively ignoring evidence that isn’t favourable. Both are mechanisms of confirmation bias, and are reasons why anecdotal evidence, no matter how “convincing”, is not accepted as evidence in science and rationalism.
The lottery, school shootings, psychics, nuclear incidents, vaccine complications, recreational drug deaths (see wiki)
 
Serial position effect
Serial position effect is the tendency of a person to recall the first and last items in a series best, and the middle items worst.
   
Suggestibility
a form of misattribution where ideas suggested by a questioner are mistaken for memory. Often a key aspect of hypnotherapy.
   
Observational bias
Information bias
In epidemiology, Information bias refers to bias arising from measurement error.[1] Information bias is also referred to as observational bias and misclassification.
   
Confounding bias
A systematic distortion in the measure of association between exposure and the health outcome caused by mixing the effect of the exposure of primary interest with extraneous risk factors.
   
Selection bias
Selection bias is the bias introduced by the selection of individuals, groups, or data for analysis in such a way that proper randomization is not achieved, thereby failing to ensure that the sample obtained is representative of the population intended to be analyzed.[1] It is sometimes referred to as the selection effect. The phrase “selection bias” most often refers to the distortion of a statistical analysis, resulting from the method of collecting samples. If the selection bias is not taken into account, then some conclusions of the study may be false.
   
The streetlight effect
The streetlight effect, or the drunkard’s search principle, is a type of observational bias that occurs when people only search for something where it is easiest to look.
The anecdote goes back at least to the 1920s,[5][6][7][8] and has been used metaphorically in the social sciences since at least 1964, when Abraham Kaplan referred to it as “the principle of the drunkard’s search”.[9] The anecdote has also been attributed to Nasreddin. According to Idries Shah, this tale is used by many Sufis, commenting upon people who seek exotic sources for enlightenment.[10]
A policeman sees a drunk man searching for something under a streetlight and asks what the drunk has lost. He says he lost his keys and they both look under the streetlight together. After a few minutes the policeman asks if he is sure he lost them here, and the drunk replies no, he lost them in the park. The policeman asks why he is searching here, and the drunk replies, “this is where the light is”.
 
 

Probability and belief bias

Hindsight bias
sometimes called the “I-knew-it-all-along” effect: the inclination to see past events as being predictable, based on knowledge of later events.
   
Anchoring
The first thing you judge influences your judgment of all that follows.
Human minds are associative in nature, so the order in which we receive information helps determine the course of our judgments and perceptions.
For instance, the first price offered for a used car sets an ‘anchor’ price which will influence how reasonable or unreasonable a counter-offer might seem. Even if we feel like an initial price is far too high, it can make a slightly less-than-reasonable offer seem entirely reasonable in contrast to the anchor price.
Be especially mindful of this bias during financial negotiations, such as for houses, cars, and salaries. The initial price offered has been shown to have a significant effect.
Anthropic bias
the tendency for one’s evidence to be biased by observation selection effects.
   
Attentional bias
neglect of relevant data when making judgments of a correlation or association.
   
Availability heuristic
Your judgments are influenced by what springs most easily to mind.
How recent, emotionally powerful, or unusual your memories are can make them seem more relevant. This, in turn, can cause you to apply them too readily.
For instance, when we see news reports about homicides, child abductions, and other terrible crimes it can make us believe that these events are much more common and threatening to us than is actually the case.
Try to gain different perspectives and relevant statistical information rather than relying purely on first judgments and emotive influences.
Backfire effect
When your core beliefs are challenged, it can cause you to believe even more strongly.
We can experience being wrong about some ideas as an attack upon our very selves, or our tribal identity. This can lead to motivated reasoning which causes us to double-down, despite disconfirming evidence.
A study which examined parents’ intent to vaccinate their children found that when parents who were against vaccination were given information showing why vaccinating their child is the best course of action, they sometimes became more likely to believe in a link between vaccination and autism.
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” – Mark Twain
Belief bias
If a conclusion supports your existing beliefs, you’ll rationalize anything that supports it.
It’s difficult for us to set aside our existing beliefs to consider the true merits of an argument. In practice this means that our ideas become impervious to criticism, and are perpetually reinforced.
For example, we might assign a 95%+ chance that thinking in terms of probability will help us think better, and a less than 1% chance that our existing beliefs leave no room for any doubt. Thinking probabilistically forces us to evaluate claims more rationally.
A useful thing to ask is ‘when and how did I get this belief?’ We tend to automatically defend our ideas without ever really questioning them.
Clustering illusion
the tendency to see patterns where actually none exist, including apophenia and pareidolia.
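Runs and clusters arise in genuinely random data far more often than intuition suggests. A small simulation sketch (sequence length and run threshold chosen arbitrarily):

```python
import random

random.seed(1)

# How often does a fair coin produce a run of 6+ identical flips
# somewhere in 100 tosses? Intuition says rarely; simulation says usually.
def longest_run(flips):
    best = run = 1
    for a, b in zip(flips, flips[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

trials = 10_000
hits = sum(
    longest_run([random.randint(0, 1) for _ in range(100)]) >= 6
    for _ in range(trials)
)
print(f"P(run of 6+ in 100 fair flips) ~ {hits / trials:.2f}")  # roughly 0.8
```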
   
Curse of knowledge
Once you understand something you presume it to be obvious to everyone.
Things make sense once they make sense, so it can be hard to remember why they didn’t. We build complex networks of understanding and forget how intricate the path to our available knowledge really is.
The curse of knowledge can make it difficult for experts to teach novices. Since experts are much more knowledgeable about the topic that they are teaching than their students, they often struggle to teach the material in a way that their students can understand. For example, a math professor might find it difficult to teach first-year students, because it’s hard for the professor to account for the fact that they have a different level of background knowledge than those students.
When teaching someone something new, go slow and explain like they’re ten years old (without being patronizing). Repeat key points and facilitate active practice to help embed knowledge.
Frequency illusion
the phenomenon in which people who just learn or notice something start seeing it everywhere. Also known as the Baader-Meinhof Phenomenon (from the Baader-Meinhof Gang).[4]
   
Hostile media effect
the tendency to perceive news coverage as biased against your position on an issue.
   
Illusory correlation
beliefs that inaccurately suppose a relationship between a certain type of action and an effect.
   
Neglect of prior base rates effect
the tendency to fail to incorporate prior known probabilities which are pertinent to the decision at hand.
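The standard illustration is diagnostic testing. A worked Bayes calculation (the test characteristics below are invented for illustration) shows how a rare condition remains improbable even after a positive result:

```python
# Invented example: a disease affecting 1 in 1,000 people; the test catches
# 99% of true cases but also flags 5% of healthy people.
prior = 0.001          # P(disease)
sensitivity = 0.99     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Bayes' theorem: P(disease | positive) = P(pos | disease) * P(disease) / P(pos)
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive test) = {posterior:.3f}")  # ~0.019, i.e. about 2%
# Neglecting the 1-in-1,000 base rate leads people to guess ~99% instead.
```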
   
Observer-expectancy effect
when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
   
Optimism bias
You overestimate the likelihood of positive outcomes.
There can be benefits to a positive attitude, but it’s unwise to allow such an attitude to adversely affect our ability to make rational judgments (they’re not mutually exclusive).
Wishful thinking can be a tragic irony insofar as it can create more negative outcomes, such as in the case of problem gambling.
If you make rational, realistic judgments you’ll have a lot more to feel positive about.
Overconfidence effect
the tendency to overestimate one’s own abilities.
   
Pessimism bias
You overestimate the likelihood of negative outcomes.
Pessimism is often a defense mechanism against disappointment, or it can be the result of depression and anxiety disorders.
Pessimists often justify their attitude by saying that they’ll either be vindicated or pleasantly surprised, however a pessimistic attitude may also limit potential positive outcomes.
Perhaps the worst aspect of pessimism is that even if something good happens, you’ll probably feel pessimistic about it anyway.
Placebo effect
If you believe you’re taking medicine it can sometimes ‘work’ even if it’s fake.
The placebo effect can work for stuff that our mind influences (such as pain) but not so much for things like viruses or broken bones.
Homeopathy, acupuncture, and many other forms of natural ‘medicine’ have been proven to be no more effective than placebo.
Keep a healthy body and bank balance by using evidence-based medicine from a qualified doctor.
Positive outcome bias
the tendency to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias and valence effect).
   
Primacy effect
the tendency to weigh initial events more than subsequent events.
   
Recency effect
the tendency to weigh recent events more than earlier events (see also ‘peak-end rule’).
   
Reminiscence bump
the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
   
Rosy retrospection
the tendency to rate past events more positively than they had actually rated them when the event occurred.
   
Spotlight effect
You overestimate how much people notice how you look and act.
Most people are much more concerned about themselves than they are about you. Absent overt prejudices, people generally want to like and get along with you as it gives them validation too.
For example, if a student had an assignment due in class and did not prepare as well as they should have, the student may start to panic and think that simply because they did not prepare well, the teacher will know and call on them for answers.
Instead of worrying about how you’re being judged, consider how you make others feel. They’ll remember this much more, and you’ll make the world a better place.
Subadditivity effect
the tendency to judge probability of the whole to be less than the probabilities of the parts.
   
Telescoping effect
the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.

Social Bias

Egocentric bias
recalling the past in a self-serving manner, e.g. remembering one’s exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
   
Actor-observer bias
the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation. This is coupled with the opposite tendency for the self, in that explanations for one’s own behaviors overemphasize the influence of the situation and underemphasize the influence of personality. (see also fundamental attribution error).
   
Barnum effect
You see personal specifics in vague statements by filling in the gaps.
Because our minds are given to making connections, it’s easy for us to take nebulous statements and find ways to interpret them so that they seem specific and personal.
Psychics, astrologers and others use this bias to make it seem like they’re telling you something relevant.
Consider how things might be interpreted to apply to anyone, not just you.
Bystander effect
You presume someone else is going to do something in an emergency situation.
When something terrible is happening in a public setting we can experience a kind of shock and mental paralysis that distracts us from a sense of personal responsibility. The problem is that everyone can experience this sense of deindividuation in a crowd.
If there’s an emergency situation, presume to be the one who will help or call for help. Be the change you want to see in the world.
Dunning-Kruger effect
The more you know, the less confident you’re likely to be.
Because experts know just how much they don’t know, they tend to underestimate their ability; but it’s easy to be over-confident when you have only a simple idea of how things are.
One study of high-tech firms discovered that 32-42% of software engineers rated their skills as being in the top 5% of their companies. A nationwide survey found that 21% of Americans believe that it’s ‘very likely’ or ‘fairly likely’ that they’ll become millionaires within the next 10 years. Drivers consistently rate themselves above average. Medical technicians overestimate their knowledge in real-world lab procedures. In a classic study of faculty at the University of Nebraska, 68% rated themselves in the top 25% for teaching ability (which, you’ll notice, is mathematically impossible), and more than 90% rated themselves above average.
“The whole problem with the world is that fools and fanatics are so certain of themselves, yet wiser people so full of doubts.” – Bertrand Russell
False consensus effect
the tendency for people to overestimate the degree to which others agree with them.
   
Forer effect (a.k.a. Barnum Effect)
the tendency for people to give high accuracy ratings to descriptions of their personality that are supposedly tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
   
Fundamental attribution error
You judge others on their character, but yourself on the situation.
If you haven’t had a good night’s sleep, you know why you’re being a bit slow; but if you observe someone else being slow you don’t have such knowledge, and so you might presume them to just be a slow person.
It’s not only kind to view others’ situations with charity, it’s more objective too. Be mindful to also err on the side of taking personal responsibility rather than justifying and blaming.
Halo effect
How much you like someone, or how attractive they are, influences your other judgments of them.
Our judgments are associative and automatic, and so if we want to be objective we need to consciously control for irrelevant influences. This is especially important in a professional setting.
Things like attractiveness can unduly influence issues as important as a jury deciding someone’s guilt or innocence. If someone is successful or fails in one area, this can also unfairly color our expectations of them in another area.
If you notice that you’re giving consistently high or low marks across the board, it’s worth considering that your judgment may be suffering from the halo effect.
Illusion of asymmetric insight
people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
   
Illusion of transparency
people overestimate others’ ability to know them, and they also overestimate their ability to know others.[6]
   
In-group bias
You unfairly favor those who belong to your group.
We presume that we’re fair and impartial, but the truth is that we automatically favor those who are most like us, or belong to our groups.
Race can be used as an example of in-group and out-group tendencies because society often categorizes individuals into groups based on race (Caucasian, African American, Latino, etc.). One study that examined race and empathy found that participants receiving nasally administered oxytocin had stronger reactions to pictures of in-group members making pained faces than to pictures of out-group members with the same expression.[14] This suggests that oxytocin may be implicated in our ability to empathize across races, with individuals potentially more inclined to help members of their own race than members of another race when they are experiencing pain.
Try to imagine yourself in the position of those in out-groups; whilst also attempting to be dispassionate when judging those who belong to your in-groups.
Just world hypothesis
Your preference for a just world makes you presume that it exists.
A world in which people don’t always get what they deserve, hard work doesn’t always pay off, and injustice happens is an uncomfortable one that threatens our preferred narrative. However, it is also the reality.
This bias is often manifest in ideas such as ‘what goes around comes around’ or an expectation of ‘karmic balance’, and can also lead to blaming victims of crime and circumstance.
A more just world requires understanding rather than blame. Remember that everyone has their own life story, we’re all fallible, and bad things happen to good people.
Lake Wobegon effect
the human tendency to report flattering beliefs about oneself and believe that one is above average (see also worse-than-average effect, overconfidence effect, and illusory superiority).
   
Modesty bias
The tendency to blame failures on oneself while attributing successes to situational factors. Opposite of self-serving bias.
   
Negativity bias
You allow negative things to disproportionately influence your thinking.
The pain of loss and hurt are felt more keenly and persistently than the fleeting gratification of pleasant things. We are primed for survival, and our aversion to pain can distort our judgment for a modern world.
In an evolutionary context it makes sense for us to be heavily biased to avoid threats, but because this bias affects our judgments in other ways it means we aren’t giving enough weight to the positives.
Pro-and-con lists, as well as thinking in terms of probabilities, can help you evaluate things more objectively than relying on a cognitive impression.
Notational bias
Notational bias is a form of cultural bias in which a notation induces the appearance of a nonexistent natural law.
 
For example, consider a scientific experiment that seeks to measure whether most people keep their cars inside or outside garages. How does such a notation cope with cloth car covers, or carports which consist of a roof with open sides? This is a source of error caused by the available categories. It is a form of notational error.
 
Outgroup homogeneity bias
individuals see members of their own group as being relatively more varied than members of other groups.
   
Projection bias
the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
   
Self-fulfilling prophecy
the tendency to engage in behaviors that elicit results which will (consciously or subconsciously) confirm our beliefs.
   
Self-serving bias
You believe your failures are due to external factors, yet you’re personally responsible for your successes.
Many of us enjoy unearned privileges, luck and advantages that others do not. It’s easy to tell ourselves that we deserve these things, whilst blaming circumstance when things don’t go our way.
For example, a student who attributes earning a good grade on an exam to their own intelligence and preparation but attributes earning a poor grade to the teacher’s poor teaching ability or unfair test questions might be exhibiting the self-serving bias.
When judging others, be mindful of how this bias interacts with the just-world hypothesis, fundamental attribution error, and the in-group bias.
System justification
the tendency to defend and bolster the status quo, i.e. existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest.
   
Trait ascription bias
the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
   
Ultimate attribution error
A sub-type of the fundamental attribution error above, the ultimate attribution error occurs when negative behavior in one’s own group is explained away as circumstantial, but negative behavior among outsiders is believed to be evidence of flaws in character.

Fallacies

Ad hominem
Attacking your opponent’s character or personal traits in an attempt to undermine their argument.
Ad hominem attacks can take the form of overtly attacking somebody, or casting doubt on their character. The result of an ad hominem attack can be to undermine someone without actually engaging with the substance of their argument.
After Sally presents an eloquent and compelling case for a more equitable taxation system, Sam asks the audience whether we should believe anything from a woman who isn’t married and probably eats her own boogers.
Ambiguity
Using double meanings or ambiguities of language to mislead or misrepresent the truth.
Politicians are often guilty of using ambiguity to mislead and will later point to how they were technically not outright lying if they come under scrutiny. It’s a particularly tricky and premeditated fallacy to commit.
When the judge asked the defendant why he hadn’t paid his parking fines, he said that he shouldn’t have to pay them because the sign said ‘Fine for parking here’ and so he naturally presumed that it would be fine to park there.
Anecdotal
Using personal experience or an isolated example instead of a valid argument, especially to dismiss statistics
It’s often much easier for people to believe someone’s testimony as opposed to understanding variation across a continuum. Scientific and statistical measures are almost always more accurate than individual perceptions and experiences.
Jason said that that was all cool and everything, but his grandfather smoked, like, 30 cigarettes a day and lived until 97 – so don’t believe everything you read about meta analyses of sound studies showing proven causal relationships.
Appeal to authority
Saying that because an authority thinks something, it must therefore be true.
It’s important to note that this fallacy should not be used to dismiss the claims of experts, or scientific consensus. Appeals to authority are not valid arguments, but nor is it reasonable to disregard the claims of experts who have a demonstrated depth of knowledge unless one has a similar level of understanding.
Unable to defend his argument that the Earth is flat, Bob said that his friend Terry was a qualified botanist who also believed the Earth to be flat, and had even seen it from up in a tree.
Appeal to emotion
Manipulating an emotional response in place of a valid or compelling argument.
Appeals to emotion include appeals to fear, envy, hatred, pity, guilt, and more. Though a valid, and reasoned, argument may sometimes have an emotional aspect, one must be careful that emotion doesn’t obscure or replace reason.
Luke didn’t want to eat his sheep brains with chopped liver and brussels sprouts, but his father told him to think about the poor, starving children in a third world country who weren’t fortunate enough to have any food at all.
Appeal to nature
Making the argument that because something is ‘natural’ it is therefore valid, justified, inevitable, good, or ideal.
Many ‘natural’ things are also considered ‘good’, and this can bias our thinking; but naturalness itself doesn’t make something good or bad. For instance murder could be seen as very natural, but that doesn’t mean it’s justifiable.
The medicine man rolled into town on his bandwagon offering various natural remedies, such as very special plain water. He said that it was only natural that people should be wary of ‘artificial’ medicines like antibiotics.
Bandwagon
Appealing to popularity or the fact that many people do something as an attempted form of validation.
The flaw in this argument is that the popularity of an idea has absolutely no bearing on its validity. If it did, then the Earth would have made itself flat for most of history to accommodate this popular belief.
Shamus pointed a finger at Sean and asked him to explain how so many people could believe in leprechauns if they’re only a silly old superstition. Sean wondered how so many people could believe in things based on popularity.
Begging the question
A circular argument in which the conclusion is included in the premise.
This logically incoherent argument often arises in situations where people have an assumption that is very ingrained, and therefore taken in their minds as a given. Circular reasoning is bad mostly because it’s not very good.
The word of Zorbo the Great is flawless and perfect. We know this because it says so in The Great and Infallible Book of Zorbo’s Best and Most Truest Things that are Definitely True and Should Not Ever Be Questioned.
Black-or-white
Where two alternative states are presented as the only possibilities, when in fact more possibilities exist.
Also known as the false dilemma, this insidious tactic has the appearance of forming a logical argument, but under closer scrutiny it becomes evident that there are more possibilities than the either/or choice that is presented.
Whilst rallying support for his plan to fundamentally undermine citizens’ rights, the Supreme Leader told the people they were either on his side, or on the side of the enemy.
Burden of proof
Saying that the burden of proof lies not with the person making the claim, but with someone else to disprove.
The burden of proof lies with someone who is making a claim, and is not upon anyone else to disprove. The inability, or disinclination, to disprove a claim does not make it valid (however we must always go by the best available evidence).
Bertrand declares that a teapot is, at this very moment, in orbit around the Sun between the Earth and Mars, and that because no one can prove him wrong his claim is therefore a valid one.
Composition / division
Assuming that what’s true about one part of something has to be applied to all, or other, parts of it.
Often when something is true for the part it does also apply to the whole, but because this isn’t always the case it can’t be presumed to be true. We must show evidence for why a consistency will exist.
Daniel was a precocious child and had a liking for logic. He reasoned that atoms are invisible, and that he was made of atoms and therefore invisible too. Unfortunately, despite his thinky skills, he lost the game of hide and go seek.
Conjunction fallacy
the tendency to assume that specific conditions are more probable than general ones.
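The classic demonstration is Tversky and Kahneman’s ‘Linda problem’, in which many people rate ‘Linda is a bank teller and is active in the feminist movement’ as more probable than ‘Linda is a bank teller’ alone. Whatever the events, P(A and B) can never exceed P(A); a small simulation sketch with invented probabilities:

```python
import random

random.seed(2)

# The rule the conjunction fallacy violates: P(A and B) <= P(A), always.
# The event probabilities below are invented for illustration.
trials = 100_000
a_count = ab_count = 0
for _ in range(trials):
    bank_teller = random.random() < 0.05
    feminist = random.random() < 0.30
    a_count += bank_teller
    ab_count += bank_teller and feminist

print(f"P(bank teller)              ~ {a_count / trials:.3f}")
print(f"P(bank teller AND feminist) ~ {ab_count / trials:.3f}")  # always <=
```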
  
Epidemiologist fallacy
The epidemiologist fallacy occurs when a scientist claims “X causes Y” where X is never actually measured, and where the cause is “confirmed” by a wee p-value. Laugh if you want, but epidemiology as a subject would almost disappear without this device.
  
False cause
Presuming that a real or perceived relationship between things means that one is the cause of the other.
Many people confuse correlation (things happening together or in sequence) for causation (that one thing actually causes the other to happen). Sometimes correlation is coincidental, or it may be attributable to a common cause.
Pointing to a fancy chart, Roger shows how temperatures have been rising over the past few centuries, whilst at the same time the numbers of pirates have been decreasing; thus pirates cool the world and global warming is a hoax.
Gambler’s fallacy
the tendency to assume that independent random events are influenced by previous random events. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
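Because each flip is independent, conditioning on a streak changes nothing, which a quick simulation sketch (one million simulated fair flips) confirms:

```python
import random

random.seed(3)

# After five heads in a row, is tails 'due'? Condition on every streak of
# five heads in a long fair sequence and look at the very next flip.
flips = [random.randint(0, 1) for _ in range(1_000_000)]  # 1 = heads

next_after_streak = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if flips[i:i + 5] == [1, 1, 1, 1, 1]
]
p = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads | five heads in a row) ~ {p:.3f}")  # ~0.500, not less
```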
  
Genetic
Judging something good or bad on the basis of where it comes from, or from whom it comes.
To appeal to prejudices surrounding something’s origin is another red herring fallacy. This fallacy has the same function as an ad hominem, but applies instead to perceptions surrounding something’s source or context.
Accused on the 6 o’clock news of corruption and taking bribes, the senator said that we should all be very wary of the things we hear in the media, because we all know how very unreliable the media can be.
Loaded question
Asking a question that has an assumption built into it so that it can’t be answered without appearing guilty.
Loaded question fallacies are particularly effective at derailing rational debates because of their inflammatory nature – recipients of a loaded question are compelled to defend themselves and may appear flustered or on the back foot.
Grace and Helen were both romantically interested in Brad. One day, with Brad sitting within earshot, Grace asked in an inquisitive tone whether Helen was having any problems with a fungal infection.
Ludic fallacy
the analysis of chance-related problems within the narrow frame of games, ignoring the complexity of reality and the non-Gaussian distribution of many things.
  
McNamara fallacy
The McNamara fallacy (also known as the quantitative fallacy[1]), named for Robert McNamara, the US Secretary of Defense from 1961 to 1968, involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.
The first step is to measure whatever can be easily measured. This is OK as far as it goes. The second step is to disregard that which can’t be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can’t be measured easily really isn’t important. This is blindness. The fourth step is to say that what can’t be easily measured really doesn’t exist. This is suicide. — Daniel Yankelovich, “Corporate Priorities: A continuing study of the new demands on business” (1972).
The fallacy refers to McNamara’s belief as to what led the United States to defeat in the Vietnam War—specifically, his quantification of success in the war (e.g., in terms of enemy body count), ignoring other variables.[2]
Middle ground
Saying that a compromise, or middle point, between two extremes must be the truth.
Much of the time the truth does indeed lie between two extreme points, but this can bias our thinking: sometimes a thing is simply untrue and a compromise of it is also untrue. Halfway between truth and a lie is still a lie.
Holly said that vaccinations caused autism in children, but her scientifically well-read friend Caleb said that this claim had been debunked and proven false. Their friend Alice offered a compromise that vaccinations cause some autism.
No true Scotsman
Making what could be called an appeal to purity as a way to dismiss relevant criticisms or flaws of an argument.
This fallacy is often employed as a measure of last resort when a point has been lost. Seeing that a criticism is valid, yet not wanting to admit it, new criteria are invoked to dissociate oneself or one’s argument.
Angus declares that Scotsmen do not put sugar on their porridge, to which Lachlan points out that he is a Scotsman and puts sugar on his porridge. Furious, like a true Scot, Angus yells that no true Scotsman sugars his porridge.
Personal incredulity
Saying that because one finds something difficult to understand, it’s therefore not true.
Subjects such as biological evolution via the process of natural selection require a good amount of understanding before one is able to properly grasp them; this fallacy is usually used in place of that understanding.
Kirk drew a picture of a fish and a human and with effusive disdain asked Richard if he really thought we were stupid enough to believe that a fish somehow turned into a human through just, like, random things happening over time.
Planning fallacy
the tendency to underestimate task-completion times. Also formulated as Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
  
Slippery slope
Asserting that if we allow A to happen, then Z will consequently happen too, therefore A should not happen.
The problem with this reasoning is that it avoids engaging with the issue at hand, and instead shifts attention to baseless extreme hypotheticals. The merits of the original argument are then tainted by unsubstantiated conjecture.
Colin asserts that if we allow children to play video games, then the next thing you know we’ll be living in a post-apocalyptic zombie wasteland with no money for guard rails to protect people from slippery slopes.
Special pleading
Moving the goalposts or making up exceptions when a claim is shown to be false.
Humans are funny creatures and have a foolish aversion to being wrong. Rather than appreciate the benefits of being able to change one’s mind through better understanding, many will invent ways to cling to old beliefs.
Edward Johns claimed to be psychic, but when his ‘abilities’ were tested under proper scientific conditions, they magically disappeared. Edward explained this saying that one had to have faith in his abilities for them to work.
Strawman
Misrepresenting someone’s argument to make it easier to attack.
By exaggerating, misrepresenting, or just completely fabricating someone’s argument, it’s much easier to present your own position as being reasonable, but this kind of dishonesty serves to undermine rational debate.
After Will said that we should be nice to kittens because they’re fluffy and cute, Bill says that Will is a mean jerk who wants to be mean to poor defenseless puppies.
The ecological fallacy
An ecological fallacy (also ecological inference fallacy[1] or population fallacy) is a formal fallacy in the interpretation of statistical data that occurs when inferences about the nature of individuals are deduced from inferences about the group to which those individuals belong.
‘Ecological fallacy’ is a term that is sometimes used to describe the fallacy of division, which is not a statistical fallacy. The four common statistical ecological fallacies are: confusion between ecological correlations and individual correlations, confusion between group average and total average, Simpson’s paradox, and confusion between higher average and higher likelihood.
Mean and median, individual and aggregate correlations, Robinson’s paradox (see wiki)
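The reversal at the heart of many ecological fallacies is easy to see with concrete numbers. A small Python sketch using the oft-cited kidney-stone treatment figures (a standard textbook illustration of Simpson’s paradox):

```python
# Classic kidney-stone figures: treatment A beats B within every subgroup,
# yet B looks better overall because A was given the harder (large-stone)
# cases far more often.
data = {
    # group: (A successes, A total, B successes, B total)
    "small stones": (81, 87, 234, 270),
    "large stones": (192, 263, 55, 80),
}

totals = [0, 0, 0, 0]
for group, cells in data.items():
    sa, na, sb, nb = cells
    print(f"{group:12s}: A {sa / na:.0%} vs B {sb / nb:.0%}")
    totals = [t + c for t, c in zip(totals, cells)]

sa, na, sb, nb = totals
print(f"{'overall':12s}: A {sa / na:.0%} vs B {sb / nb:.0%}")
# small stones: A 93% vs B 87%; large stones: A 73% vs B 69%;
# overall:      A 78% vs B 83% -- the group-level comparison reverses.
```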
The fallacy fallacy
Presuming a claim to be necessarily wrong because a fallacy has been committed.
It is entirely possible to make a claim that is false yet argue with logical coherency for that claim, just as it is possible to make a claim that is true and justify it with various fallacies and poor arguments.
Recognizing that Amanda had committed a fallacy in arguing that we should eat healthy food because a nutritionist said it was popular, Alyse said we should therefore eat bacon double cheeseburgers every day.
The gambler’s fallacy
Believing that ‘runs’ occur to statistically independent phenomena such as roulette wheel spins.
This commonly believed fallacy can be said to have helped create a city in the desert of Nevada USA. Though the overall odds of a ‘big run’ happening may be low, each spin of the wheel is itself entirely independent from the last.
Red had come up six times in a row on the roulette wheel, so Greg knew that it was close to certain that black would be next up. Suffering an economic form of natural selection with this thinking, he soon lost all of his savings.
The Texas sharpshooter
Cherry-picking data clusters to suit an argument, or finding a pattern to fit a presumption.
This ‘false cause’ fallacy is coined after a marksman shooting at barns and then painting a bullseye target around the spot where the most bullet holes appear. Clusters naturally appear by chance, and don’t necessarily indicate causation.
The makers of Sugarette Candy Drinks point to research showing that of the five countries where Sugarette drinks sell the most units, three of them are in the top ten healthiest countries on Earth, therefore Sugarette drinks are healthy.
Tu quoque
Avoiding having to engage with criticism by turning it back on the accuser – answering criticism with criticism.
Literally translating as ‘you too’, this fallacy is commonly employed as an effective red herring because it takes the heat off the accused, who no longer has to defend themselves, and shifts the focus back onto the accuser.
Nicole identified that Hannah had committed a logical fallacy, but instead of addressing the substance of her claim, Hannah accused Nicole of committing a fallacy earlier on in the conversation.