Tuesday, June 28, 2011

Informal Logical Fallacies and Cognitive Biases

Just like formal fallacies, informal logical fallacies and cognitive biases are a threat to clear reasoning and argument. For basic lists of informal fallacies and cognitive biases, one can consult the following:
Bradley Dowden, “Fallacies,” Internet Encyclopedia of Philosophy.

“List of Fallacies,” Wikipedia.

“List of Cognitive Biases,” Wikipedia.
I. Informal Logical Fallacies
In brief, informal fallacies can be divided into:
(1) Fallacies of relevance;
(2) Fallacies of defective induction;
(3) Fallacies of presumption;
(4) Fallacies of ambiguity (sophisms).
A detailed list of these fallacies follows:
(1) Fallacies of relevance
Irrelevant appeals
Appeal to emotion (argumentum ad populum)
Appeal to pity (argumentum ad misericordiam)
Appeal to force (argumentum ad baculum)
Appeal to authority (argumentum ad verecundiam)
Appeal to nature (argument from nature)
Appeal to ignorance (argumentum ad ignorantiam)
Red herring fallacy
Irrelevant conclusion (ignoratio elenchi)
Straw man argument
Ad hominem argument
Poisoning the well
Guilt by association
Naturalistic fallacy
Moralistic fallacy
Argument from silence (argumentum ex silentio)
Genetic fallacy
Gambler’s fallacy
Tu quoque
(2) Fallacies of defective induction
Argument from ignorance (argumentum ad ignorantiam)
Appeal to inappropriate authority (argumentum ad verecundiam)
False cause
Hasty generalization
Faulty generalization
Other inductive fallacies
Slothful induction
Overwhelming exception
Biased sample
Misleading vividness
Statistical special pleading
(3) Fallacies of presumption (fallacies of illegitimate presumption)
Complex question
False cause
Begging the question
Accident
Converse accident
No true Scotsman fallacy (a fallacy of ambiguity and presumption)
(4) Fallacies of ambiguity (sophisms)
Fallacy of equivocation
Fallacy of amphiboly
Fallacy of accent
Fallacy of composition
Fallacy of division
Other fallacies and paradoxes relevant to economics include the following:
(1) Paradox of thrift/saving (see the sketch after this list)
(2) Paradox of costs
(3) Paradox of debt
(4) Paradox of liquidity
(5) Paradox of tranquillity (Minsky).
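To illustrate the first of these, here is a minimal sketch of the paradox of thrift in Python, using the simplest textbook income-expenditure model and purely hypothetical numbers: when every household tries to save a larger fraction of its income, equilibrium income falls and aggregate saving ends up no higher than before.

# Paradox of thrift in the simplest multiplier model (hypothetical numbers):
# equilibrium income Y = A / s, where A is autonomous spending and s is the
# marginal propensity to save, so equilibrium saving s * Y always equals A.

def equilibrium_income(autonomous_spending, saving_rate):
    return autonomous_spending / saving_rate

A = 100.0  # autonomous spending (e.g. investment), held constant
for s in (0.2, 0.3):  # households try to raise their saving rate
    Y = equilibrium_income(A, s)
    print("saving rate {:.0%}: income = {:.1f}, saving = {:.1f}".format(s, Y, s * Y))
# saving rate 20%: income = 500.0, saving = 100.0
# saving rate 30%: income = 333.3, saving = 100.0

Income falls while total saving stays pinned to autonomous spending, so the individual attempt to save more fails to raise saving in the aggregate.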
II. Cognitive Biases

In modern psychology, an important theory about how we make decisions under uncertainty is the “heuristics and biases” approach of Tversky and Kahneman (1974; Kahneman et al. 1982). This posits that human beings use heuristics, mental shortcuts for solving problems that are not always reliable, and that we are subject to cognitive biases that can impair the success of our decisions. As I have said previously, the modern Post Keynesian theory of subjective expectations can draw on these theories and focus on how cognitive biases influence the investment decision and how investment fluctuates.
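To see how a single bias can distort probability judgments, consider base rate neglect (listed below): a short Python sketch with purely hypothetical numbers shows that a positive result from a 95% accurate test for a condition affecting only 1% of people is far weaker evidence than intuition suggests.

# Base rate neglect (hypothetical numbers): Bayes' theorem for an accurate
# test applied to a rare condition.

def posterior(prior, sensitivity, false_positive_rate):
    # P(condition | positive test) by Bayes' theorem
    true_positives = sensitivity * prior
    false_positives = false_positive_rate * (1 - prior)
    return true_positives / (true_positives + false_positives)

p = posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print("P(condition | positive test) = {:.1%}".format(p))
# prints about 16.1%, not the intuitively expected 95%

Ignoring the 1% base rate and focusing only on the test’s accuracy is exactly the kind of unreliable heuristic shortcut that Tversky and Kahneman describe.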

The classification of cognitive biases is below:
(1) Decision-making and behavioural biases
Anchoring
Attentional bias
Bandwagon effect
Bias blind spot
Choice-supportive bias
Confirmation bias
Congruence bias
Contrast effect
Denomination effect
Distinction bias
Endowment effect
Experimenter’s or expectation bias
Focusing effect
Framing effect
Hostile media effect
Hyperbolic discounting
Illusion of control
Impact bias
Information bias
Irrational escalation
Loss aversion
Mere exposure effect
Money illusion
Moral credential effect
Negativity bias
Neglect of probability
Normalcy bias
Omission bias
Outcome bias
Planning fallacy
Post-purchase rationalization
Pseudocertainty effect
Reactance
Restraint bias
Selective perception
Semmelweis reflex
Social comparison bias
Status quo bias
Unit bias
Wishful thinking
Zero-risk bias
(2) Biases in probability and belief
Ambiguity effect
Anchoring effect
Attentional bias
Availability heuristic
Availability cascade
Base rate neglect or base rate fallacy
Belief bias
Clustering illusion
Conjunction fallacy
Forward bias
Gambler’s fallacy
Hindsight bias
Illusory correlation
Observer-expectancy effect
Optimism bias
Ostrich effect
Overconfidence effect
Positive outcome bias
Pareidolia
Pessimism bias
Primacy effect
Recency effect
Disregard of regression toward the mean
Stereotyping
Subadditivity effect
Subjective validation
Well travelled road effect
(3) Social biases
Actor–observer bias
Dunning–Kruger effect
Egocentric bias
Forer effect (aka Barnum effect)
False consensus effect
Fundamental attribution error
Halo effect
Illusion of asymmetric insight
Illusion of transparency
Illusory superiority
Ingroup bias
Just-world phenomenon
Moral luck
Outgroup homogeneity bias
Projection bias
Self-serving bias
System justification
Trait ascription bias
Ultimate attribution error
(4) Memory errors
Cryptomnesia
Egocentric bias
False memory
Hindsight bias
Reminiscence bump
Rosy retrospection
Self-serving bias
Suggestibility
Telescoping effect
Von Restorff effect
III. Cognitive Closure?
There are some philosophers, like Colin McGinn and even Noam Chomsky, who claim that human beings are subject to a phenomenon called ‘cognitive closure’: our minds are subject to cognitive limitations such that we cannot even begin to understand certain ideas and concepts, or solve certain problems, so that those problems remain insoluble for us. The mind-body problem, identity, and free will might be such problems:
“We … [sc. suffer] from what I call ‘cognitive closure’ with respect to the mind-body problem. Just as a dog cannot be expected to solve the problems about space and time and the speed of light that it took a brain like Einstein's to solve, so maybe the human species cannot be expected to understand how the universe contains mind and matter in combination.” (McGinn 2002: 182).
I wonder whether the problem of induction might be added to the list (although many philosophers might argue that it has in fact been solved: there is no rational justification for induction).

However, other philosophers disagree with the “cognitive closure” thesis. Daniel Dennett presents a good case against it in Darwin’s Dangerous Idea: Evolution and the Meanings of Life (London, 1996), pp. 381–383. In essence, Dennett thinks that language makes the difference: as is well known, there are in principle an infinite number of meaningful sentences that could be constructed in any natural language. There is, he thinks, a set of possible sentences that best describes the solution to the mind–body problem, so why could we not understand that set of propositions? If the problem is solvable, then its solution could be expressed in natural language. Although Dennett does not rule out the possibility of ‘cognitive closure,’ he concludes that there is “no evidence of the reality or even likelihood of ‘cognitive closure’ in human beings” (Dennett 1996: 382). That appears to be good news for the power of the human mind and human reason!


BIBLIOGRAPHY

Copi, I. M., Cohen, C. and K. McMahon. 2011. Introduction to Logic (14th edn), Pearson Education, Upper Saddle River, N.J. and Harlow.

Dennett, D. C. 1996. Darwin’s Dangerous Idea: Evolution and the Meanings of Life, Penguin, London.

Kahneman, D., Slovic, P. and A. Tversky (eds). 1982. Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge.

McGinn, C. 2002. The Making of a Philosopher: My Journey Through Twentieth-Century Philosophy, HarperCollins, New York.

Tversky, A. and D. Kahneman. 1974. “Judgment under Uncertainty: Heuristics and Biases,” Science (American Association for the Advancement of Science) 185 (4157): 1124–1131.

1 comment:

  1. "Cognitive closure" is just Kant's silly "ding an sich" refried. Unfortunately, Kant's worst idea dominated nearly all 20th century philosophy. The latter German Idealists, Fichte, Schelling and Hegel rejected it as the self-declared nothing it is. Philosophy has not yet reattained their depth, coherence and clarity of thought. Their followers, the British Idealist tradition (Brand Blanshard being one of the last in that) and some of the Marxist tradition, including Lenin, were among the too few opponents, apart from scholars of the German Idealists. Dennett is usually right, but he is amazingly ungenerous and incomprehending of his intellectual predecessors.