Epistemology

Epistemology (/ɪˌpɪstɪˈmɒlədʒi/; from Greek ἐπιστήμη, epistēmē, meaning ‘knowledge’, and λόγος, logos, meaning ‘logical discourse’) is the branch of philosophy concerned with the theory of knowledge.[1]

Epistemology is the study of the nature of knowledge, justification, and the rationality of belief. Much debate in epistemology centers on four areas: (1) the philosophical analysis of the nature of knowledge and how it relates to such concepts as truth, belief, and justification,[2][3] (2) various problems of skepticism, (3) the sources and scope of knowledge and justified belief, and (4) the criteria for knowledge and justification. Epistemology addresses such questions as: “What makes justified beliefs justified?”,[4] “What does it mean to say that we know something?”,[5] and fundamentally “How do we know that we know?”.[6]

Metacognition – Thinking about thinking

Metacognition is “cognition about cognition”, “thinking about thinking”, “knowing about knowing”, becoming “aware of one’s awareness” and, in general, higher-order thinking. The prefix meta- derives from Ancient Greek μετά (metá, “after, beyond”) and indicates a concept that is an abstraction behind another concept, used to complete or add to the latter: compare metaphysics (“a science of that which transcends the physical”), metacommunication (“a secondary communication that takes place with, or underlies, a more obvious communication”), and metalogical (“beyond the sphere of logic, transcending logic”).

Veenman, M. V. J., Van Hout-Wolters, B. H. A. M., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning. doi:10.1007/s11409-006-6893-0

Lai, E. R. (2011). Metacognition: A literature review (research report). Research Reports.

Baker, L. (2010). Metacognition. In International Encyclopedia of Education. doi:10.1016/B978-0-08-044894-7.00484-X

Nelson, T. O. (1996). Consciousness and metacognition. American Psychologist. doi:10.1037/0003-066X.51.2.102

Fleming, S. M., & Lau, H. C. (2014). How to measure metacognition. Frontiers in Human Neuroscience. doi:10.3389/fnhum.2014.00443

Tanner, K. D. (2012). Promoting student metacognition. CBE Life Sciences Education. doi:10.1187/cbe.12-03-0033

Thompson, V. A., Prowse Turner, J. A., & Pennycook, G. (2011). Intuition, reason, and metacognition. Cognitive Psychology. doi:10.1016/j.cogpsych.2011.06.001

Shimamura, A. P. (2000). Toward a cognitive neuroscience of metacognition. Consciousness and Cognition. doi:10.1006/ccog.2000.0450

Fleming, S. M., & Frith, C. D. (2014). The Cognitive Neuroscience of Metacognition. doi:10.1007/978-3-642-45190-4

Garrison, D. R., & Akyol, Z. (2015). Toward the development of a metacognition construct for communities of inquiry. Internet and Higher Education. doi:10.1016/j.iheduc.2014.10.001

Georghiades, P. (2004). From the general to the situated: Three decades of metacognition. International Journal of Science Education. doi:10.1080/0950069032000119401

Foote, A. L., & Crystal, J. D. (2007). Metacognition in the rat. Current Biology. doi:10.1016/j.cub.2007.01.061

Livingston, J. (2003). Metacognition: An overview. Educational Resources Information Center.

Akyol, Z., & Garrison, D. R. (2011). Assessing metacognition in an online community of inquiry. Internet and Higher Education. doi:10.1016/j.iheduc.2011.01.005

Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science. doi:10.1023/A:1003044231033

Frith, C. D. (2012). The role of metacognition in human social interactions. Philosophical Transactions of the Royal Society B: Biological Sciences. doi:10.1098/rstb.2012.0123

Fleming, S. M., Dolan, R. J., & Frith, C. D. (2012). Metacognition: Computation, biology and function. Philosophical Transactions of the Royal Society B: Biological Sciences. doi:10.1098/rstb.2012.0021

Smith, J. D. (2009). The study of animal metacognition. Trends in Cognitive Sciences. doi:10.1016/j.tics.2009.06.009

Fox, E., & Riconscente, M. (2008). Metacognition and self-regulation in James, Piaget, and Vygotsky. Educational Psychology Review. doi:10.1007/s10648-008-9079-2

Brown, A. L. (1987). Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In Metacognition, Motivation, and Understanding.

Terrace, H. S., & Son, L. K. (2009). Comparative metacognition. Current Opinion in Neurobiology. doi:10.1016/j.conb.2009.06.004

List of cognitive biases


An interactive, zoomable version of the Bias Codex is available at cognitive-liberty.online/cognitive-biases-zoomify/list-of-cognitive-biases.html


Decision-making, belief, and behavioral biases

Many of these biases affect belief formation, business and economic decisions, and human behavior in general.

Ambiguity effect – The tendency to avoid options for which missing information makes the probability seem “unknown”.[10]
Anchoring or focalism – The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).[11][12]
Anthropocentric thinking – The tendency to use human analogies as a basis for reasoning about other, less familiar, biological phenomena.[13]
Anthropomorphism or personification – The tendency to characterize animals, objects, and abstract concepts as possessing human-like traits, emotions, and intentions.[14]
Attentional bias – The tendency of perception to be affected by recurring thoughts.[15]
Automation bias – The tendency to depend excessively on automated systems, which can lead to erroneous automated information overriding correct decisions.[16]
Availability heuristic – The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.[17]
Availability cascade – A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).[18]
Backfire effect – The reaction to disconfirming evidence by strengthening one’s previous beliefs.[19] cf. Continued influence effect.
Bandwagon effect – The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.[20]
Base rate fallacy or base rate neglect – The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).[21] (A worked example appears after this table.)
Belief bias – An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.[22]
Ben Franklin effect – A person who has performed a favor for someone is more likely to do another favor for that person than they would be if they had received a favor from that person.[23]
Berkson’s paradox – The tendency to misinterpret statistical experiments involving conditional probabilities.[24]
Bias blind spot – The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.[25]
Bystander effect – The tendency to think that others will act in an emergency situation.[26]
Choice-supportive bias – The tendency to remember one’s choices as better than they actually were.[27]
Clustering illusion – The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).[12]
Confirmation bias – The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.[28]
Congruence bias – The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.[12]
Conjunction fallacy – The tendency to assume that specific conditions are more probable than general ones.[29]
Conservatism (belief revision) – The tendency to revise one’s belief insufficiently when presented with new evidence.[5][30][31]
Continued influence effect – The tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred.[32] cf. Backfire effect.
Contrast effect – The enhancement or reduction of a certain stimulus’ perception when compared with a recently observed, contrasting object.[33]
Courtesy bias – The tendency to give an opinion that is more socially correct than one’s true opinion, so as to avoid offending anyone.[34]
Curse of knowledge – When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.[35]
Declinism – The predisposition to view the past favorably (rosy retrospection) and the future negatively.[36]
Decoy effect – Preferences for either option A or B change in favor of option B when option C is presented, which is completely dominated by option B (inferior in all respects) and partially dominated by option A.[37]
Default effect – When given a choice between several options, the tendency to favor the default one.[38]
Denomination effect – The tendency to spend more money when it is denominated in small amounts (e.g., coins) rather than large amounts (e.g., bills).[39]
Disposition effect – The tendency to sell an asset that has accumulated in value and resist selling an asset that has declined in value.[40]
Distinction bias – The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[41]
Dunning–Kruger effect – The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.[42]
Duration neglect – The neglect of the duration of an episode in determining its value.[43]
Empathy gap – The tendency to underestimate the influence or strength of feelings, in either oneself or others.[44]
Endowment effect – The tendency for people to demand much more to give up an object than they would be willing to pay to acquire it.[45]
Exaggerated expectation – The tendency for expectations to be more extreme than the real-world evidence that eventually arrives (conditionally the inverse of the conservatism bias).[5][46]
Experimenter’s or expectation bias – The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[47]
Focusing effect – The tendency to place too much importance on one aspect of an event.[48]
Forer effect or Barnum effect – The observation that individuals will give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some beliefs and practices, such as astrology, fortune telling, graphology, and some types of personality tests.[49]
Form function attribution bias – In human–robot interaction, the tendency of people to make systematic errors when interacting with a robot. People may base their expectations and perceptions of a robot on its appearance (form) and attribute functions which do not necessarily mirror the true functions of the robot.[50]
Framing effect – Drawing different conclusions from the same information, depending on how that information is presented.[51]
Frequency illusion – The illusion in which a word, a name, or other thing that has recently come to one’s attention suddenly seems to appear with improbable frequency shortly afterwards (not to be confused with the recency illusion or selection bias).[52] This illusion is sometimes referred to as the Baader-Meinhof phenomenon.[53]
Functional fixedness – Limits a person to using an object only in the way it is traditionally used.[54]
Gambler’s fallacy – The tendency to think that future probabilities are altered by past events, when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”[55]
Hard–easy effect – The tendency for confidence to track task difficulty poorly: people tend to be overconfident on hard tasks and underconfident on easy ones.[5][56][57][58]
Hindsight bias – Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as having been predictable[59] at the time those events happened.
Hostile attribution bias – The tendency to interpret others’ behaviors as having hostile intent, even when the behavior is ambiguous or benign.[60]
Hot-hand fallacy – Also known as the “hot hand phenomenon” or “hot hand”, the belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.
Hyperbolic discounting – The tendency to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time: people make choices today that their future selves would prefer not to have made, despite using the same reasoning.[61] Also known as current moment bias or present-bias, and related to dynamic inconsistency. For example, one study showed that when making food choices for the coming week, 74% of participants chose fruit, whereas when the food choice was for the current day, 70% chose chocolate.
Identifiable victim effect – The tendency to respond more strongly to a single identified person at risk than to a large group of people at risk.[62]
IKEA effect – The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.[63]
Illicit transference – Occurs when a term is treated as equivalent in its distributive (referring to every member of a class) and collective (referring to the class itself as a whole) senses. The two variants of this fallacy are the fallacy of composition and the fallacy of division.
Illusion of control – The tendency to overestimate one’s degree of influence over other external events.[64]
Illusion of validity – The belief that our judgments are accurate, especially when available information is consistent or inter-correlated.[65]
Illusory correlation – Inaccurately perceiving a relationship between two unrelated events.[66][67]
Illusory truth effect – A tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity. These are specific cases of truthiness.
Impact bias – The tendency to overestimate the length or the intensity of the impact of future feeling states.[68]
Information bias – The tendency to seek information even when it cannot affect action.[69]
Insensitivity to sample size – The tendency to expect too little variation in small samples.
Irrational escalation – The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. Also known as the sunk cost fallacy.
Law of the instrument – An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. “If all you have is a hammer, everything looks like a nail.”
Less-is-better effect – The tendency to prefer a smaller set to a larger set judged separately, but not jointly.
Look-elsewhere effect – An apparently statistically significant observation may have actually arisen by chance because of the size of the parameter space to be searched.
Loss aversion – The disutility of giving up an object is greater than the utility associated with acquiring it.[70] (See also sunk cost effects and the endowment effect.)
Mere exposure effect – The tendency to express undue liking for things merely because of familiarity with them.[71]
Money illusion – The tendency to concentrate on the nominal value (face value) of money rather than its value in terms of purchasing power.[72]
Moral credential effect – The tendency of a track record of non-prejudice to increase subsequent prejudice.
Negativity bias or negativity effect – The psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.[73][74] (See also actor-observer bias, group attribution error, positivity effect, and negativity effect.)[75]
Neglect of probability – The tendency to completely disregard probability when making a decision under uncertainty.[76]
Normalcy bias – The refusal to plan for, or react to, a disaster which has never happened before.
Not invented here – Aversion to contact with or use of products, research, standards, or knowledge developed outside a group. Related to the IKEA effect.
Observer-expectancy effect – When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
Omission bias – The tendency to judge harmful actions (commissions) as worse, or less moral, than equally harmful inactions (omissions).[77]
Optimism bias – The tendency to be over-optimistic, overestimating favorable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).[78][79]
Ostrich effect – Ignoring an obvious (negative) situation.
Outcome bias – The tendency to judge a decision by its eventual outcome instead of the quality of the decision at the time it was made.
Overconfidence effect – Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.[5][80][81][82]
Pareidolia – A vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing non-existent hidden messages on records played in reverse.
Pessimism bias – The tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.
Placebo effect – The belief that a medication works, even if it is merely a placebo.
Planning fallacy – The tendency to underestimate task-completion times.[68]
Post-purchase rationalization – The tendency to persuade oneself through rational argument that a purchase was good value.
Pro-innovation bias – The tendency to have excessive optimism towards an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses.
Projection bias – The tendency to overestimate how much our future selves will share one’s current preferences, thoughts and values, thus leading to sub-optimal choices.[83][84][74]
Pseudocertainty effect – The tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.[85]
Reactance – The urge to do the opposite of what someone wants you to do, out of a need to resist a perceived attempt to constrain your freedom of choice (see also reverse psychology).
Reactive devaluation – Devaluing proposals only because they purportedly originated with an adversary.
Recency illusion – The illusion that a phenomenon one has noticed only recently is itself recent. Often used to refer to linguistic phenomena: the illusion that a word or language usage that one has noticed only recently is an innovation when it is in fact long-established (see also frequency illusion).
Regressive bias – A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated.[5][86][87]
Restraint bias – The tendency to overestimate one’s ability to show restraint in the face of temptation.
Rhyme-as-reason effect – Rhyming statements are perceived as more truthful. A famous example was used in the O. J. Simpson trial, in the defense’s phrase “If the gloves don’t fit, then you must acquit.”
Risk compensation / Peltzman effect – The tendency to take greater risks when perceived safety increases.
Selection bias – The tendency to notice something more when something causes us to be more aware of it, such as when we buy a car: we then tend to notice similar cars more often than we did before. They are not suddenly more common; we are just noticing them more. Also called observational selection bias.
Selective perception – The tendency for expectations to affect perception.
Semmelweis reflex – The tendency to reject new evidence that contradicts a paradigm.[31]
Sexual overperception bias / sexual underperception bias – The tendency to over- or underestimate another person’s sexual interest in oneself.
Social comparison bias – The tendency, when making decisions, to favour potential candidates who don’t compete with one’s own particular strengths.[88]
Social desirability bias – The tendency to over-report socially desirable characteristics or behaviours in oneself and under-report socially undesirable characteristics or behaviours.[89]
Status quo bias – The tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[90][91]
Stereotyping – Expecting a member of a group to have certain characteristics without having actual information about that individual.
Subadditivity effect – The tendency to judge the probability of the whole to be less than the probabilities of the parts.[92]
Subjective validation – The perception that something is true if a subject’s belief demands it to be true; also assigns perceived connections between coincidences.
Surrogation – Losing sight of the strategic construct that a measure is intended to represent, and subsequently acting as though the measure is the construct of interest.
Survivorship bias – Concentrating on the people or things that “survived” some process and inadvertently overlooking those that didn’t because of their lack of visibility.
Time-saving bias – Underestimating the time that could be saved (or lost) when increasing (or decreasing) from a relatively low speed, and overestimating the time that could be saved (or lost) when increasing (or decreasing) from a relatively high speed.
Third-person effect – The belief that mass-communicated media messages have a greater effect on others than on oneself.
Parkinson’s law of triviality – The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.[93]
Unit bias – The tendency to perceive the standard suggested amount of consumption (e.g., a food serving size) as appropriate, so that a person will consume it all even if it is too much for that particular person.[94]
Weber–Fechner law – Difficulty in comparing small differences in large quantities.
Well travelled road effect – Underestimating the duration taken to traverse oft-traveled routes and overestimating the duration taken to traverse less familiar routes.
Women are wonderful effect – A tendency to associate more positive attributes with women than with men.
Zero-risk bias – A preference for reducing a small risk to zero over a greater reduction in a larger risk.
Zero-sum bias – A bias whereby a situation is incorrectly perceived to be like a zero-sum game (i.e., one person gains at the expense of another).
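
Several of the probabilistic entries above, base rate neglect in particular, become clearer with numbers. The following minimal Python sketch applies Bayes’ rule to a hypothetical screening test; all figures (prevalence, sensitivity, false-positive rate) are invented for illustration and not taken from any study.

```python
# Base rate neglect, made concrete with Bayes' rule.
# All numbers below are hypothetical, chosen only for illustration.

base_rate = 0.001        # P(disease): 1 in 1,000 people has the condition
sensitivity = 0.99       # P(positive test | disease)
false_positive = 0.05    # P(positive test | no disease)

# P(positive) via the law of total probability
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: P(disease | positive test)
p_disease_given_positive = sensitivity * base_rate / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.3f}")
# ~0.019: despite a "99% accurate" test, a positive result implies
# less than a 2% chance of disease, because the base rate is so low.
```

The intuitive but fallacious answer (“the test is 99% accurate, so a positive result means roughly a 99% chance of disease”) ignores that the few true positives are swamped by false positives when the condition is rare.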

Social biases

Most of these biases are labeled as attributional biases.

Actor-observer bias – The tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also fundamental attribution error), and for explanations of one’s own behaviors to do the opposite (that is, to overemphasize the influence of our situation and underemphasize the influence of our own personality).
Authority bias – The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and to be more influenced by that opinion.[95]
Cheerleader effect – The tendency for people to appear more attractive in a group than in isolation.[96]
Defensive attribution hypothesis – Attributing more blame to a harm-doer as the outcome becomes more severe or as personal or situational similarity to the victim increases.
Egocentric bias – Occurs when people claim more responsibility for the results of a joint action than an outside observer would credit them with.
Extrinsic incentives bias – An exception to the fundamental attribution error: the tendency to attribute (situational) extrinsic motivations to others while attributing (dispositional) intrinsic motivations to oneself.
False consensus effect – The tendency for people to overestimate the degree to which others agree with them.[97]
Forer effect (aka Barnum effect) – The tendency to give high accuracy ratings to descriptions of one’s personality that supposedly are tailored specifically for oneself, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
Fundamental attribution error – The tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior[74] (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).[75]
Group attribution error – The biased belief that the characteristics of an individual group member are reflective of the group as a whole, or the tendency to assume that group decision outcomes reflect the preferences of group members, even when information is available that clearly suggests otherwise.
Halo effect – The tendency for a person’s positive or negative traits to “spill over” from one personality area to another in others’ perceptions of them (see also physical attractiveness stereotype).[98]
Illusion of asymmetric insight – People perceive their knowledge of their peers to surpass their peers’ knowledge of them.[99]
Illusion of external agency – When people view self-generated preferences as instead being caused by insightful, effective and benevolent agents.
Illusion of transparency – People overestimate others’ ability to know them, and they also overestimate their ability to know others.
Illusory superiority – Overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people. (Also known as the “Lake Wobegon effect”, “better-than-average effect”, or “superiority bias”.)[100]
Ingroup bias – The tendency for people to give preferential treatment to others they perceive to be members of their own groups.
Just-world hypothesis – The tendency for people to want to believe that the world is fundamentally just, causing them to rationalize an otherwise inexplicable injustice as deserved by the victim(s).
Moral luck – The tendency for people to ascribe greater or lesser moral standing based on the outcome of an event.
Naïve cynicism – Expecting more egocentric bias in others than in oneself.
Naïve realism – The belief that we see reality as it really is, objectively and without bias; that the facts are plain for all to see; that rational people will agree with us; and that those who don’t are either uninformed, lazy, irrational, or biased.
Outgroup homogeneity bias – Individuals see members of their own group as being relatively more varied than members of other groups.[101]
Self-serving bias – The tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).[102]
Shared information bias – The tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of (i.e., unshared information).[103]
System justification – The tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
Trait ascription bias – The tendency for people to view themselves as relatively variable in terms of personality, behavior, and mood while viewing others as much more predictable.
Ultimate attribution error – Similar to the fundamental attribution error, except that the person makes an internal attribution to an entire group instead of to the individuals within the group.
Worse-than-average effect – A tendency to believe ourselves to be worse than others at tasks which are difficult.[104]

Memory errors and biases

In psychology and cognitive science, a memory bias is a cognitive bias that either enhances or impairs the recall of a memory (either the chances that the memory will be recalled at all, or the amount of time it takes for it to be recalled, or both), or that alters the content of a reported memory. There are many types of memory bias, including:

Bizarreness effect – Bizarre material is better remembered than common material.
Choice-supportive bias – In a self-justifying manner, retroactively ascribing one’s choices to be more informed than they were when they were made.
Change bias – After an investment of effort in producing change, remembering one’s past performance as more difficult than it actually was.[105]
Childhood amnesia – The retention of few memories from before the age of four.
Conservatism or regressive bias – The tendency to remember high values and high likelihoods/probabilities/frequencies as lower than they actually were, and low ones as higher than they actually were; remembered values are not extreme enough.[86][87]
Consistency bias – Incorrectly remembering one’s past attitudes and behaviour as resembling present attitudes and behaviour.[106]
Context effect – That cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa).
Cross-race effect – The tendency for people of one race to have difficulty identifying members of a race other than their own.
Cryptomnesia – A form of misattribution where a memory is mistaken for imagination, because there is no subjective experience of it being a memory.[105]
Egocentric bias – Recalling the past in a self-serving manner, e.g., remembering one’s exam grades as being better than they were, or remembering a caught fish as bigger than it really was.
Fading affect bias – A bias in which the emotion associated with unpleasant memories fades more quickly than the emotion associated with positive events.[107]
False memory – A form of misattribution where imagination is mistaken for a memory.
Generation effect (self-generation effect) – That self-generated information is remembered best. For instance, people are better able to recall memories of statements that they have generated than similar statements generated by others.
Google effect – The tendency to forget information that can be found readily online by using Internet search engines.
Hindsight bias – The inclination to see past events as being more predictable than they actually were; also called the “I-knew-it-all-along” effect.
Humor effect – That humorous items are more easily remembered than non-humorous ones, which might be explained by the distinctiveness of humor, the increased cognitive processing time to understand the humor, or the emotional arousal caused by the humor.[108]
Illusion of truth effect – That people are more likely to identify as true statements those they have previously heard (even if they cannot consciously remember having heard them), regardless of the actual validity of the statement. In other words, a person is more likely to believe a familiar statement than an unfamiliar one.
Illusory correlation – Inaccurately remembering a relationship between two events.[5][67]
Lag effect – The phenomenon whereby learning is greater when studying is spread out over time, as opposed to studying the same amount of time in a single session. See also spacing effect.
Leveling and sharpening – Memory distortions introduced by the loss of details in a recollection over time, often concurrent with sharpening or selective recollection of certain details that take on exaggerated significance in relation to the details or aspects of the experience lost through leveling. Both biases may be reinforced over time, and by repeated recollection or re-telling of a memory.[109]
Levels-of-processing effect – That different methods of encoding information into memory have different levels of effectiveness.[110]
List-length effect – A smaller percentage of items are remembered in a longer list, but as the length of the list increases, the absolute number of items remembered increases as well. For example, an individual may remember 15 items (50%) from a 30-item list, but 40 items (40%) from a 100-item list: the percentage remembered falls while the absolute number remembered rises.[111]
Misinformation effect – Memory becoming less accurate because of interference from post-event information.[112]
Modality effect – That memory recall is higher for the last items of a list when the list items were received via speech than when they were received through writing.
Mood-congruent memory bias – The improved recall of information congruent with one’s current mood.
Next-in-line effect – People taking turns speaking in a group tend to have diminished recall for the words of the person who spoke immediately before them.[113]
Part-list cueing effect – That being shown some items from a list and later retrieving one item causes it to become harder to retrieve the other items.[114]
Peak-end rule – That people seem to perceive not the sum of an experience but the average of how it was at its peak (e.g., pleasant or unpleasant) and how it ended.
Persistence – The unwanted recurrence of memories of a traumatic event.
Picture superiority effect – The notion that concepts that are learned by viewing pictures are more easily and frequently recalled than are concepts that are learned by viewing their written word form counterparts.[115][116][117][118][119][120]
Positivity effect (socioemotional selectivity theory) – That older adults favor positive over negative information in their memories.
Primacy effect, recency effect & serial position effect – That items near the end of a sequence are the easiest to recall, followed by the items at the beginning of a sequence; items in the middle are the least likely to be remembered.[121]
Processing difficulty effect – That information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered.[122]
Reminiscence bump – The recalling of more personal events from adolescence and early adulthood than personal events from other lifetime periods.[123]
Rosy retrospection – The remembering of the past as having been better than it really was.
Self-relevance effect – That memories relating to the self are better recalled than similar information relating to others.
Source confusion – Confusing episodic memories with other information, creating distorted memories.[124]
Spacing effect – That information is better recalled if exposure to it is repeated over a long span of time rather than a short one.
Spotlight effect – The tendency to overestimate the amount that other people notice one’s appearance or behavior.
Stereotypical bias – Memory distorted towards stereotypes (e.g., racial or gender stereotypes).
Suffix effect – Diminishment of the recency effect when a sound item that the subject is not required to recall is appended to the list.[125][126]
Suggestibility – A form of misattribution where ideas suggested by a questioner are mistaken for memory.
Tachypsychia – When time perceived by the individual either lengthens, making events appear to slow down, or contracts.[127]
Telescoping effect – The tendency to displace recent events backward in time and remote events forward in time, so that recent events appear more remote, and remote events more recent.
Testing effect – The fact that information is more easily remembered after actively reproducing it (e.g., by rewriting it) than after merely rereading it.[128]
Tip of the tongue phenomenon – When a subject is able to recall parts of an item, or related information, but is frustratingly unable to recall the whole item. This is thought to be an instance of “blocking” where multiple similar memories are being recalled and interfere with each other.[105]
Travis syndrome – Overestimating the significance of the present.[129] It is related to the Enlightenment idea of progress and to chronological snobbery, possibly with an appeal-to-novelty fallacy being part of the bias.
Verbatim effect – That the “gist” of what someone has said is better remembered than the verbatim wording.[130] This is because memories are representations, not exact copies.
von Restorff effect – That an item that sticks out is more likely to be remembered than other items.[131]
Zeigarnik effect – That uncompleted or interrupted tasks are remembered better than completed ones.

Common theoretical causes of some cognitive biases

A 2012 Psychological Bulletin article suggested that at least eight seemingly unrelated biases can be produced by the same information-theoretic generative mechanism that assumes noisy information processing during storage and retrieval of information in human memory.[5]
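
As a rough illustration of that kind of mechanism (a sketch of the general idea, not the article’s actual model), the following Python simulation stores magnitudes bounded between 0 and 1, perturbs them with unbiased noise, and clips the result to the valid range on retrieval; retrieved estimates then regress toward the middle of the scale, producing a conservatism-like, regressive bias with no motivated distortion at all. All parameters are arbitrary.

```python
import random

# Minimal sketch of a noisy storage/retrieval channel (illustrative only).
# A true value in [0, 1] is stored, perturbed by unbiased Gaussian noise,
# and clipped to the valid range on retrieval.

def retrieve(true_value, noise_sd=0.15, n=100_000):
    """Average retrieved estimate of `true_value` after noisy recall."""
    total = 0.0
    for _ in range(n):
        noisy = random.gauss(true_value, noise_sd)
        total += min(1.0, max(0.0, noisy))   # estimates cannot leave [0, 1]
    return total / n

for v in (0.05, 0.25, 0.50, 0.75, 0.95):
    print(f"true = {v:.2f}  mean recalled = {retrieve(v):.3f}")

# Typical output: extreme values (0.05, 0.95) are recalled as less extreme
# (~0.09, ~0.91), while 0.50 is recalled accurately. Unbiased noise plus a
# bounded scale yields regressive, "conservative" estimates by itself.
```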

Individual differences in decision making biases

People do appear to have stable individual differences in their susceptibility to decision biases such as overconfidence, temporal discounting, and bias blind spot.[134] That said, these stable individual levels of bias can be changed: participants in experiments who watched training videos and played debiasing games showed medium to large reductions, both immediately and up to three months later, in the extent to which they exhibited susceptibility to six cognitive biases: anchoring, bias blind spot, confirmation bias, fundamental attribution error, projection bias, and representativeness.[135]

Debiasing

Debiasing is the reduction of biases in judgment and decision making through incentives, nudges, and training. Cognitive bias mitigation and cognitive bias modification are forms of debiasing specifically applicable to cognitive biases and their effects.


Source: en.wikipedia.org/wiki/List_of_cognitive_biases

Further References

Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology. doi:10.1037/0021-9010.88.5.879

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology. doi:10.1016/0010-0285(73)90033-9

Kahneman, D., & Tversky, A. (1996). On the reality of cognitive illusions. Psychological Review. doi:10.1037/0033-295X.103.3.582

Oechssler, J., Roider, A., & Schmitz, P. W. (2009). Cognitive abilities and behavioral biases. Journal of Economic Behavior and Organization. doi:10.1016/j.jebo.2009.04.018

Griffiths, T. L., Chater, N., Kemp, C., Perfors, A., & Tenenbaum, J. B. (2010). Probabilistic models of cognition: Exploring representations and inductive biases. Trends in Cognitive Sciences. doi:10.1016/j.tics.2010.05.004

Stanovich, K. E., & West, R. F. (2008). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology. doi:10.1037/0022-3514.94.4.672

Hallion, L. S., & Ruscio, A. M. (2011). A meta-analysis of the effect of cognitive bias modification on anxiety and depression. Psychological Bulletin. doi:10.1037/a0024355

Gigerenzer, G. (1991). How to make cognitive illusions disappear: Beyond “heuristics and biases”. European Review of Social Psychology. doi:10.1080/14792779143000033

Roiser, J. P., Elliott, R., & Sahakian, B. J. (2012). Cognitive mechanisms of treatment in depression. Neuropsychopharmacology. doi:10.1038/npp.2011.183

Haselton, M. G., Nettle, D., & Andrews, P. W. (2015). The evolution of cognitive bias. In The Handbook of Evolutionary Psychology. doi:10.1002/9780470939376.ch25

Haselton, M. G., & Nettle, D. (2006). The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review. doi:10.1207/s15327957pspr1001_3

Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine. doi:10.1097/00001888-200308000-00003

Bertrand, M., & Morse, A. (2011). Information disclosure, cognitive biases, and payday borrowing. Journal of Finance. doi:10.1111/j.1540-6261.2011.01698.x

Ioannidis, J. P. A., Munafò, M. R., Fusar-Poli, P., Nosek, B. A., & David, S. P. (2014). Publication and other reporting biases in cognitive sciences: Detection, prevalence, and prevention. Trends in Cognitive Sciences. doi:10.1016/j.tics.2014.02.010

Montibeller, G., & von Winterfeldt, D. (2015). Cognitive and motivational biases in decision and risk analysis. Risk Analysis. doi:10.1111/risa.12360

Douglas, C., Bateson, M., Walsh, C., Bédué, A., & Edwards, S. A. (2012). Environmental enrichment induces optimistic cognitive biases in pigs. Applied Animal Behaviour Science. doi:10.1016/j.applanim.2012.02.018

Greenwald, A. G. (1980). The totalitarian ego: Fabrication and revision of personal history. American Psychologist. doi:10.1037/0003-066X.35.7.603

Bateson, M., Desire, S., Gartside, S. E., & Wright, G. A. (2011). Agitated honeybees exhibit pessimistic cognitive biases. Current Biology. doi:10.1016/j.cub.2011.05.017

Peters, E. R., Moritz, S., Schwannauer, M., Wiseman, Z., Greenwood, K. E., Scott, J., … Garety, P. A. (2014). Cognitive biases questionnaire for psychosis. Schizophrenia Bulletin. doi:10.1093/schbul/sbs199

Hoppe, E. I., & Kusterer, D. J. (2011). Behavioral biases and cognitive reflection. Economics Letters. doi:10.1016/j.econlet.2010.11.015

Marshall, J. A. R., Trimmer, P. C., Houston, A. I., & McNamara, J. M. (2013). On evolutionary explanations of cognitive biases. Trends in Ecology and Evolution. doi:10.1016/j.tree.2013.05.013

Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Quality and Safety. doi:10.1136/bmjqs-2012-001712

Das, T. K., & Teng, B. S. (1999). Cognitive biases and strategic decision processes: An integrative perspective. Journal of Management Studies. doi:10.1111/1467-6486.00157

Gudmundsson, S. V., & Lechner, C. (2013). Cognitive biases, organization, and entrepreneurial firm survival. European Management Journal. doi:10.1016/j.emj.2013.01.001

Red Herring strategy/fallacy

A red herring is something that misleads or distracts from a relevant or important issue. It may be either a logical fallacy or a literary device that leads readers or audiences towards a false conclusion. A red herring might be intentionally used, such as in mystery fiction or as part of rhetorical strategies (e.g., in politics), or it could be inadvertently used during argumentation.

The term was popularized in 1807 by English polemicist William Cobbett, who told a story of having used a kipper (a strong-smelling smoked fish) to divert hounds from chasing a hare.

“When I was a boy, we used, in order to draw off the harriers from the trail of a hare that we had set down as our own private property, get to her haunt early in the morning, and drag a red-herring, tied to a string, four or five miles over hedges and ditches, across fields and through coppices, till we got to a point, whence we were pretty sure the hunters would not return to the spot where they had thrown off; and, though I would, by no means, be understood, as comparing the editors and proprietors of the London daily press to animals half so sagacious and so faithful as hounds, I cannot help thinking, that, in the case to which we are referring, they must have been misled, at first, by some political deceiver.”

William Cobbett, February 14, 1807, Cobbett’s Political Register, Volume XI[10]

List of logical fallacies

Formal fallacies

A formal fallacy is an error in logic that can be seen in the argument’s form.[4] All formal fallacies are specific types of non sequiturs.

Propositional fallacies

A propositional fallacy is an error in logic that concerns compound propositions. For a compound proposition to be true, the truth values of its constituent parts must satisfy the relevant logical connectives that occur in it (most commonly: <and>, <or>, <not>, <only if>, <if and only if>). The following fallacies involve inferences whose correctness is not guaranteed by the behavior of those logical connectives, and which are therefore not logically guaranteed to yield true conclusions.
Types of propositional fallacies:

  • Affirming a disjunct – concluding that one disjunct of a logical disjunction must be false because the other disjunct is true; A or B; A, therefore not B.
  • Affirming the consequent – the antecedent in an indicative conditional is claimed to be true because the consequent is true; if A, then B; B, therefore A.
  • Denying the antecedent – the consequent in an indicative conditional is claimed to be false because the antecedent is false; if A, then B; not A, therefore not B.
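
Whether an argument form of this kind is valid can be checked mechanically: enumerate every assignment of truth values and look for a row in which all premises are true but the conclusion is false. The Python sketch below (function names are our own) does this for modus ponens and for the fallacious forms listed above.

```python
from itertools import product

# Truth-table check: an argument form is invalid if some assignment of
# truth values makes every premise true while the conclusion is false.

def valid(premises, conclusion):
    """premises: list of functions (a, b) -> bool; conclusion likewise."""
    for a, b in product([True, False], repeat=2):
        if all(p(a, b) for p in premises) and not conclusion(a, b):
            return False   # found a counterexample row
    return True

implies = lambda p, q: (not p) or q

# Modus ponens: if A then B; A; therefore B  (valid)
print(valid([lambda a, b: implies(a, b), lambda a, b: a],
            lambda a, b: b))        # True

# Affirming the consequent: if A then B; B; therefore A  (invalid)
print(valid([lambda a, b: implies(a, b), lambda a, b: b],
            lambda a, b: a))        # False: a=False, b=True is a counterexample

# Denying the antecedent: if A then B; not A; therefore not B  (invalid)
print(valid([lambda a, b: implies(a, b), lambda a, b: not a],
            lambda a, b: not b))    # False

# Affirming a disjunct: A or B; A; therefore not B  (invalid)
print(valid([lambda a, b: a or b, lambda a, b: a],
            lambda a, b: not b))    # False: a=True, b=True is a counterexample
```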

Quantification fallacies

A quantification fallacy is an error in logic where the quantifiers of the premises are in contradiction to the quantifier of the conclusion.
Types of quantification fallacies:

  • Existential fallacy – an argument that has a universal premise and a particular conclusion.

Formal syllogistic fallacies

Syllogistic fallacies – logical fallacies that occur in syllogisms:

  • Affirmative conclusion from a negative premise (illicit negative) – a categorical syllogism has a positive conclusion, but at least one negative premise.
  • Fallacy of exclusive premises – a categorical syllogism that is invalid because both of its premises are negative.
  • Fallacy of four terms (quaternio terminorum) – a categorical syllogism that has four terms.
  • Illicit major – a categorical syllogism that is invalid because its major term is not distributed in the major premise but distributed in the conclusion.
  • Illicit minor – a categorical syllogism that is invalid because its minor term is not distributed in the minor premise but distributed in the conclusion.
  • Negative conclusion from affirmative premises (illicit affirmative) – a categorical syllogism has a negative conclusion but affirmative premises.
  • Undistributed middle – the middle term in a categorical syllogism is not distributed.

Informal fallacies

Informal fallacies – arguments that are fallacious for reasons other than structural (formal) flaws and usually require examination of the argument’s content.[16]

  • Argument to moderation (false compromise, middle ground, fallacy of the mean, argumentum ad temperantiam) – assuming that the compromise between two positions is always correct.[17]
  • Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the heap, bald man fallacy) – improperly rejecting a claim for being imprecise.[18]
  • Correlative-based fallacies
  • Divine fallacy (argument from incredulity) – arguing that, because something is so incredible or amazing, it must be the result of superior, divine, alien or paranormal agency.[20]
  • Double counting – counting events or occurrences more than once in probabilistic reasoning, which leads to the sum of the probabilities of all cases exceeding unity.
  • Equivocation – the misleading use of a term with more than one meaning (by glossing over which meaning is intended at a particular time).[21]
    • Ambiguous middle term – a common ambiguity in syllogisms in which the middle term is equivocated.[22]
    • Definitional retreat – changing the meaning of a word to deal with an objection raised against the original wording.[1]
    • Motte-and-bailey fallacy – The arguer conflates two similar positions, one modest and easy to defend (the “motte”) and one much more controversial (the “bailey”). The arguer advances the controversial position, but when challenged, they insist that they are only advancing the more modest position.[23]
    • Fallacy of accent – a specific type of ambiguity that arises when the meaning of a sentence is changed by placing an unusual prosodic stress, or when, in a written passage, it’s left unclear which word the emphasis was supposed to fall on.
    • (See also the if-by-whiskey fallacy, below.)
  • Ecological fallacy – inferences about the nature of specific individuals are based solely upon aggregate statistics collected for the group to which those individuals belong.[24]
  • Etymological fallacy – reasoning that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day usage.[25]
  • Fallacy of composition – assuming that something true of part of a whole must also be true of the whole.[26]
  • Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts.[27]
  • False attribution – an advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.
    • Fallacy of quoting out of context (contextotomy, contextomy; quotation mining) – refers to the selective excerpting of words from their original context in a way that distorts the source’s intended meaning.[28]
  • False authority (single authority) – using an expert of dubious credentials or using only one opinion to sell a product or idea. Related to the appeal to authority (not always fallacious).
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two alternative statements are held to be the only possible options when in reality there are more.[29]
  • False equivalence – describing a situation of logical and apparent equivalence, when in fact there is none.
  • Historian’s fallacy – the assumption that decision makers of the past viewed events from the same perspective and had the same information as those subsequently analyzing the decision.[30] (Not to be confused with presentism, which is a mode of historical analysis in which present-day ideas, such as moral standards, are projected into the past.)
  • Historical fallacy – a set of considerations is thought to hold good only because a completed process is read into the content of the process which conditions this completed result.[31]
  • Homunculus fallacy – a “middle-man” is used for explanation; this sometimes leads to regressive middle-men. It explains a function or process without actually explaining its real nature, accounting for the concept in terms of the concept itself, without first defining or explaining the original concept. Explaining thought as something produced by a little thinker, a sort of homunculus inside the head, merely explains it as another kind of thinking (as different but the same).[32]
  • Inflation of conflict – arguing that if experts of a field of knowledge disagree on a certain point, the experts must know nothing, and therefore no conclusion can be reached, or that the legitimacy of their entire field is put to question.[33]
  • If-by-whiskey – an argument that supports both sides of an issue by using terms that are selectively emotionally sensitive.
  • Incomplete comparison – insufficient information is provided to make a complete comparison.
  • Inconsistent comparison – different methods of comparison are used, leaving a false impression of the whole comparison.
  • Intentionality fallacy – the insistence that the ultimate meaning of an expression must be consistent with the intention of the person from whom the communication originated (e.g. a work of fiction that is widely received as a blatant allegory must necessarily not be regarded as such if the author intended it not to be so.)[34]
  • Kettle logic – using multiple, jointly inconsistent arguments to defend a position.
  • Ludic fallacy – the belief that the outcomes of non-regulated random occurrences can be encapsulated by a statistic; a failure to take into account unknown unknowns in determining the probability of events taking place.[35]
  • McNamara fallacy (quantitative fallacy) – making a decision based only on quantitative observations, discounting all other considerations.
  • Mind projection fallacy – subjective judgments are “projected” to be inherent properties of an object, rather than being related to personal perceptions of that object.
  • Moralistic fallacy – inferring factual conclusions from purely evaluative premises in violation of fact–value distinction. For instance, inferring is from ought is an instance of moralistic fallacy. Moralistic fallacy is the inverse of naturalistic fallacy defined below.
  • Moving the goalposts (raising the bar) – argument in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded.
  • Nirvana fallacy (perfect-solution fallacy) – solutions to problems are rejected because they are not perfect.
  • Onus probandi – from the Latin onus probandi incumbit ei qui dicit, non ei qui negat: the burden of proof is on the person who makes the claim, not on the person who denies (or questions) the claim. It is a particular case of the argumentum ad ignorantiam fallacy, in which the burden is shifted onto the person defending against the assertion. Also known as “shifting the burden of proof”.
  • Proof by assertion – a proposition is repeatedly restated regardless of contradiction; sometimes confused with argument from repetition (argumentum ad infinitum, argumentum ad nauseam)
  • Prosecutor’s fallacy – a low probability of false matches does not mean a low probability of some false match being found (a worked example appears after this list).
  • Proving too much – using a form of argument that, if it were valid, could be used to reach an additional, invalid conclusion.
  • Psychologist’s fallacy – an observer presupposes the objectivity of their own perspective when analyzing a behavioral event.
  • Referential fallacy[36] – assuming all words refer to existing things and that the meaning of words resides within the things they refer to, as opposed to words possibly referring to no real object, or their meaning often coming from how they are used.
  • Reification (concretism, hypostatization, or the fallacy of misplaced concreteness) – a fallacy of ambiguity, when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a “real thing” something that is not a real thing, but merely an idea.
  • Retrospective determinism – the argument that because an event has occurred under some circumstance, the circumstance must have made its occurrence inevitable.
  • Special pleading – a proponent of a position attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption.
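
The prosecutor’s fallacy in the list above lends itself to a worked example. The sketch below uses invented numbers: even with a one-in-a-million random-match probability, trawling a large database is expected to turn up several false matches.

```python
# Prosecutor's fallacy, illustrated with invented numbers.
# A "1 in a million" random-match probability sounds damning, but if a
# large database is searched, false matches are still expected.

p_random_match = 1e-6        # P(match | innocent person), hypothetical
database_size = 10_000_000   # number of innocent people searched, hypothetical

expected_false_matches = p_random_match * database_size
print(f"Expected false matches: {expected_false_matches:.0f}")   # -> 10

# Probability that at least one innocent person matches:
p_at_least_one = 1 - (1 - p_random_match) ** database_size
print(f"P(at least one innocent match) = {p_at_least_one:.4f}")  # ~0.9999

# So "the chance of a random match is one in a million" does not mean
# "the chance this matched person is innocent is one in a million":
# a match found by trawling the database is quite likely a false match
# unless other evidence narrows the suspect pool.
```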

Improper premise

  • Begging the question (petitio principii) – providing what is essentially the conclusion of the argument as a premise.[37][38][39][40]
    • Loaded label – while not inherently fallacious, use of evocative terms to support a conclusion is a type of begging the question fallacy. When fallaciously used, the term’s connotations are relied on to sway the argument towards a particular conclusion. For example, an organic foods advertisement that says “Organic foods are safe and healthy foods grown without any pesticides, herbicides, or other unhealthy additives.” The term “unhealthy additives” does the work of supporting the idea that the product is safe.[41]
  • Circular reasoning (circulus in demonstrando) – the reasoner begins with what he or she is trying to end up with; sometimes called assuming the conclusion.
  • Fallacy of many questions (complex question, fallacy of presuppositions, loaded question, plurium interrogationum) – someone asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically so that the question limits direct replies to those that serve the questioner’s agenda.

Faulty generalizations

Faulty generalization – reaching a conclusion from weak premises. Unlike fallacies of relevance, in fallacies of defective induction the premises are related to the conclusions yet only weakly buttress the conclusions. A faulty generalization is thus produced.

  • Accident – an exception to a generalization is ignored.[42]
    • No true Scotsman – makes a generalization true by changing the generalization to exclude a counterexample.[43]
  • Cherry picking (suppressed evidence, incomplete evidence) – act of pointing at individual cases or data that seem to confirm a particular position, while ignoring a significant portion of related cases or data that may contradict that position.[44]
    • Survivorship bias – a small number of successes of a given process are actively promoted while completely ignoring a large number of failures.
  • False analogy – an argument by analogy in which the analogy is poorly suited.[45]
  • Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, hasty induction, secundum quid, converse accident, jumping to conclusions) – basing a broad conclusion on a small sample, or making a determination without all of the information required to do so (see the simulation after this list).[46]
  • Inductive fallacy – a more general name for several fallacies, such as hasty generalization, in which a conclusion is drawn from premises that only lightly support it.
  • Misleading vividness – involves describing an occurrence in vivid detail, even if it is an exceptional occurrence, to convince someone that it is a problem; this also relies on the appeal to emotion fallacy.
  • Overwhelming exception – an accurate generalization that comes with qualifications that eliminate so many cases that what remains is much less impressive than the initial statement might have led one to assume.[47]
  • Thought-terminating cliché – a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal lack of forethought, move on to other topics, etc. – but in any case, to end the debate with a cliché rather than a point.
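
Because hasty generalization is at bottom a sampling error, it can be made concrete with a short simulation. The hypothetical Python sketch below repeatedly samples a population with a known trait rate and compares how far small-sample estimates can stray from the truth (all parameters are illustrative):

    import random

    random.seed(42)
    population_rate = 0.5  # assumed true proportion of a trait in the population

    def estimate_range(sample_size, trials=1000):
        """Min and max estimated rate across repeated random samples."""
        estimates = []
        for _ in range(trials):
            hits = sum(random.random() < population_rate for _ in range(sample_size))
            estimates.append(hits / sample_size)
        return min(estimates), max(estimates)

    print("n = 5:  ", estimate_range(5))    # swings between 0.0 and 1.0
    print("n = 500:", estimate_range(500))  # stays near the true 0.5

A conclusion drawn from the n = 5 samples is a hasty generalization: the premises are related to the conclusion but buttress it only weakly.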

Questionable cause

Questionable cause – a general type of error with many variants. Its primary basis is the confusion of association with causation, either by inappropriately deducing (or rejecting) causation or by a broader failure to properly investigate the cause of an observed effect.

  • Cum hoc ergo propter hoc (Latin for “with this, therefore because of this”; correlation implies causation; faulty cause/effect, coincidental correlation, correlation without causation) – a faulty assumption that, because there is a correlation between two variables, one caused the other.[48]
    • Post hoc ergo propter hoc (Latin for “after this, therefore because of this”; temporal sequence implies causation) – X happened, then Y happened; therefore X caused Y.[49]
    • Wrong direction (reverse causation) – cause and effect are reversed. The cause is said to be the effect and vice versa.[50] The consequence of the phenomenon is claimed to be its root cause.
    • Ignoring a common cause – two correlated events are assumed to be cause and effect when both may be effects of a third, unmentioned cause.
  • Fallacy of the single cause (causal oversimplification[51]) – it is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.
  • Furtive fallacy – outcomes are asserted to have been caused by the malfeasance of decision makers.
  • Gambler’s fallacy – the incorrect belief that separate, independent events can affect the likelihood of another random event. If a fair coin lands on heads 10 times in a row, the belief that it is “due to land on tails” is incorrect (a quick simulation follows this list).[52]
  • Magical thinking – fallacious attribution of causal relationships between actions and events. In anthropology, it refers primarily to cultural beliefs that ritual, prayer, sacrifice, and taboos will produce specific supernatural consequences. In psychology, it refers to an irrational belief that thoughts by themselves can affect the world or that thinking something corresponds with doing it.
  • Regression fallacy – ascribes cause where none exists. The flaw is failing to account for natural fluctuations. It is frequently a special kind of post hoc fallacy.
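
The independence claim behind the gambler’s fallacy is easy to verify empirically. The minimal Python sketch below (illustrative only) finds every run of ten heads in a simulated fair-coin sequence and checks the very next flip:

    import random

    random.seed(0)
    flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

    next_after_streak = []
    for i in range(10, len(flips)):
        if all(flips[i - 10:i]):                 # previous ten flips all heads
            next_after_streak.append(flips[i])   # record the following flip

    print(len(next_after_streak), "streaks of 10 heads found")
    print("P(heads | 10 heads just occurred) ≈",
          sum(next_after_streak) / len(next_after_streak))  # ≈ 0.5

No matter how long the streak, the next flip remains 50/50; tails is never “due”.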

Relevance fallacies

  • Appeal to the stone (argumentum ad lapidem) – dismissing a claim as absurd without demonstrating proof for its absurdity.[53]
  • Argument from ignorance (appeal to ignorance, argumentum ad ignorantiam) – assuming that a claim is true because it has not been or cannot be proven false, or vice versa.[54]
  • Argument from incredulity (appeal to common sense) – “I cannot imagine how this could be true; therefore, it must be false.”[55]
  • Argument from repetition (argumentum ad nauseam, argumentum ad infinitum) – repeating an argument until nobody cares to discuss it any more;[56][57] sometimes confused with proof by assertion.
  • Argument from silence (argumentum ex silentio) – assuming that a claim is true based on the absence of textual or spoken evidence from an authoritative source, or vice versa.[58][59]
  • Ignoratio elenchi (irrelevant conclusion, missing the point) – an argument that may in itself be valid, but does not address the issue in question.[60]

Red herring fallacies

A red herring fallacy, one of the main subtypes of fallacies of relevance, is an error in logic where a proposition is, or is intended to be, misleading in order to make irrelevant or false inferences. In the general case, it is any logical inference based on fabricated arguments, intended to make up for the lack of real arguments or to implicitly replace the subject of the discussion.[61][62]

Red herring – a speaker attempts to distract an audience by deviating from the topic at hand and introducing a separate argument the speaker believes is easier to speak to.[63] It is an argument given in response to another argument that is irrelevant and draws attention away from the subject of the argument. See also irrelevant conclusion.

  • Ad hominem – attacking the arguer instead of the argument.
    • Circumstantial ad hominem – stating that the arguer’s personal situation or perceived benefit from advancing a conclusion means that their conclusion is wrong.[64]
    • Poisoning the well – a subtype of ad hominem presenting adverse information about a target person with the intention of discrediting everything that the target person says.[65]
    • Abusive fallacy – verbally abusing the opponent rather than arguing about the originally proposed argument.[66]
    • Appeal to motive – dismissing an idea by questioning the motives of its proposer.
    • Kafka-trapping – A sophistical and unfalsifiable form of argument that attempts to overcome an opponent by inducing a sense of guilt and using the opponent’s denial of guilt as further evidence of guilt.[67]
    • Tone policing – focusing on emotion behind (or resulting from) a message rather than the message itself as a discrediting tactic.
    • Traitorous critic fallacy (ergo decedo, ‘thus leave’) – a critic’s perceived affiliation is portrayed as the underlying reason for the criticism and the critic is asked to stay away from the issue altogether. Easily confused with the association fallacy (“guilt by association”), below.
  • Appeal to authority (argument from authority, argumentum ad verecundiam) – an assertion is deemed true because of the position or authority of the person asserting it.[68][69] The term is also used more broadly to describe arguments that are not always fallacious; see entry in the Conditional or questionable fallacies section.
    • Appeal to accomplishment – an assertion is deemed true or false based on the accomplishments of the proposer.[70] This may often also have elements of appeal to emotion (see below).
    • Courtier’s reply – a criticism is dismissed by claiming that the critic lacks sufficient knowledge, credentials, or training to credibly comment on the subject matter.
  • Appeal to consequences (argumentum ad consequentiam) – the conclusion is supported by a premise that asserts positive or negative consequences from some course of action in an attempt to distract from the initial discussion.[71]
  • Appeal to emotion – an argument is made due to the manipulation of emotions, rather than the use of valid reasoning.[72]
    • Appeal to fear – an argument is made by increasing fear and prejudice towards the opposing side[73][74]
    • Appeal to flattery – an argument is made due to the use of flattery to gather support.[75]
    • Appeal to pity (argumentum ad misericordiam) – an argument attempts to induce pity to sway opponents.[76]
    • Appeal to ridicule – an argument is made by incorrectly presenting the opponent’s argument in a way that makes it appear ridiculous.[77][78]
    • Appeal to spite – an argument is made through exploiting people’s bitterness or spite towards an opposing party.[79]
    • Judgmental language – insulting or pejorative language to influence the audience’s judgment.
    • Pooh-pooh – dismissing an argument perceived as unworthy of serious consideration.[80]
    • Wishful thinking – a decision is made according to what might be pleasing to imagine, rather than according to evidence or reason.[81]
  • Appeal to nature – judgment is based solely on whether the subject of judgment is ‘natural’ or ‘unnatural’.[82] (Sometimes also called the “naturalistic fallacy”, but is not to be confused with the other fallacies by that name.)
  • Appeal to novelty (argumentum ad novitatem) – a proposal is claimed to be superior or better solely because it is new or modern.[83]
  • Appeal to poverty (argumentum ad Lazarum) – supporting a conclusion because the arguer is poor (or refuting because the arguer is wealthy). (Opposite of appeal to wealth.)[84]
  • Appeal to tradition (argumentum ad antiquitatem) – a conclusion supported solely because it has long been held to be true.[85]
  • Appeal to wealth (argumentum ad crumenam) – supporting a conclusion because the arguer is wealthy (or refuting because the arguer is poor).[86] (Sometimes taken together with the appeal to poverty as a general appeal to the arguer’s financial situation.)
  • Argumentum ad baculum (appeal to the stick, appeal to force, appeal to threat) – an argument made through coercion or threats of force to support a position.[87]
  • Argumentum ad populum (appeal to widespread belief, bandwagon argument, appeal to the majority, appeal to the people) – a proposition is claimed to be true or good solely because a majority or many people believe it to be so.[88]
  • Association fallacy (guilt by association and honor by association) – arguing that because two things share (or are implied to share) some property, they are the same.[89]
  • Ipse dixit (bare assertion fallacy) – a claim that is presented as true without support, as self-evidently true, or as dogmatically true. This fallacy relies on the implied expertise of the speaker or on an unstated truism.[90][91]
  • Bulverism (psychogenetic fallacy) – inferring why an argument is being used, associating it to some psychological reason, then assuming it is invalid as a result. The assumption that if the origin of an idea comes from a biased mind, then the idea itself must also be a falsehood.[33]
  • Chronological snobbery – a thesis is deemed incorrect because it was commonly held when something else, known to be false, was also commonly held.[92][93]
  • Fallacy of relative privation (also known as “appeal to worse problems” or “not as bad as”) – dismissing an argument or complaint due to the existence of more important problems in the world, regardless of whether those problems bear relevance to the initial argument. First World problems are a subset of this fallacy.[94]
  • Genetic fallacy – a conclusion is suggested based solely on something or someone’s origin rather than its current meaning or context.[95]
  • Moralistic fallacy – inferring factual conclusions from evaluative premises, in violation of fact–value distinction; e.g. making statements about what is, on the basis of claims about what ought to be. This is the inverse of the naturalistic fallacy.
  • Naturalistic fallacy – inferring evaluative conclusions from purely factual premises[96][97] in violation of fact–value distinction. Naturalistic fallacy in the stricter sense defined in the section “Conditional or questionable fallacies” (below) is a variety of this broader sense. Naturalistic fallacy (sometimes confused with appeal to nature) is the inverse of moralistic fallacy.
  • Naturalistic fallacy fallacy[99] (anti-naturalistic fallacy)[100] – inferring, from the general invalidity of the is–ought inference mentioned above, that no instance of ought can ever be inferred from is. For instance, is (P ∨ ¬P) does imply ought (P ∨ ¬P) for any proposition P, although the naturalistic fallacy fallacy would falsely declare such an inference invalid. The naturalistic fallacy fallacy is a type of argument from fallacy.
  • Straw man fallacy – an argument based on misrepresentation of an opponent’s position.[101]
  • Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[102]
  • Tu quoque (‘you too’ – appeal to hypocrisy, whataboutism) – the argument states that a certain position is false or wrong or should be disregarded because its proponent fails to act consistently in accordance with that position.[103]
  • Two wrongs make a right – occurs when it is assumed that if one wrong is committed, another wrong will rectify it.[104]
  • Vacuous truth – a claim that is technically true but meaningless, in the form of claiming that no A in B has C, when there is no A in B. For example, claiming that no mobile phones in the room are on when there are no mobile phones in the room at all (see the snippet after this list).
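
Vacuous truth is not merely rhetorical; it is built into formal logic and most programming languages, where a universally quantified claim over an empty domain evaluates to true. A minimal Python illustration of the mobile-phone example:

    # "No mobile phone in the room is on" is equivalent to
    # "every mobile phone in the room is off" -- and over an empty
    # collection, a universal claim is vacuously true.
    phones_on = []  # no mobile phones in the room at all

    claim = all(not is_on for is_on in phones_on)
    print(claim)  # True -- technically true, but it conveys no information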

Conditional or questionable fallacies

  • Appeal to authority (argument from authority, argumentum ad verecundiam) – a form of defeasible argument in which a claimed authority’s support is used as evidence for an argument’s conclusion. The argument may actually be cogent when all sides of a discussion agree on the reliability of the authority in the given context. See the Red herring fallacies section, above, for the fallacious variant.
  • Broken window fallacy – an argument that disregards lost opportunity costs (typically non-obvious, difficult to determine or otherwise hidden) associated with destroying property of others, or other ways of externalizing costs onto others. For example, an argument that states breaking a window generates income for a window fitter, but disregards the fact that the money spent on the new window cannot now be spent on new shoes.
  • Definist fallacy – involves the confusion between two notions by defining one in terms of the other.[105]
  • The ends justify the means – an assertion that may or may not be defensible depending on the ends and means in question. The various approaches to this sort of question are the subject of the normative ethical theories of consequentialism.
  • Naturalistic fallacy – attempts to prove a claim about ethics by appealing to a definition of the term “good” in terms of either one or more claims about natural properties. The naturalistic fallacy also has a more general version, covered in the “Red herring fallacies” section, above.[82]
  • Slippery slope (thin edge of the wedge, camel’s nose) – asserting that a relatively small first step inevitably leads to a chain of related events culminating in some significant impact/event that should not happen, thus the first step should not happen. It is, in essence, an appeal to probability fallacy, as the worked example after this list illustrates.[106]
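
Reading the slippery slope as an appeal to probability suggests a worked counter-calculation: even when every individual step in the claimed chain is likely, the whole chain need not be. The figures below are illustrative assumptions, and the calculation additionally assumes the steps are independent:

    # Slippery slope as appeal to probability: multiply per-step probabilities.
    # p_step and n_steps are hypothetical; steps are assumed independent.
    p_step = 0.9   # assumed chance that each step actually leads to the next
    n_steps = 10   # length of the claimed chain of events

    p_whole_chain = p_step ** n_steps
    print(f"P(single step)  = {p_step}")
    print(f"P(entire chain) = {p_whole_chain:.2f}")  # ~0.35, hardly inevitable

A chain of ten individually 90%-likely steps thus ends in its predicted outcome only about a third of the time.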

Autonomy and autodidactism

In developmental psychology and moral, political, and bioethical philosophy, autonomy is the capacity to make an informed, un-coerced decision. Autonomous organizations or institutions are independent or self-governing. Autonomy can also be defined from a human-resources perspective, where it means the level of discretion granted to an employee in his or her work. More at Wikipedia

Autodidacticism or self-education is education without the guidance of masters or institutions. Generally, an autodidact is an individual who chooses the subject they will study, their study material, and their own studying rhythm and time. More at Wikipedia

Tavistock Institute of Human Relations


ia800203.us.archive.org/12/items/Tavistock_201601/Coleman_John_-_The_Tavistock_Institute_of_Human_Relations.pdf

The Tavistock Institute of Human Relations or TIHR is a British not-for-profit organisation which applies social science to contemporary issues and problems. It was initiated in 1946, when it developed from the Tavistock Clinic, and was formally established as a separate entity in September 1947. More at Wikipedia


References

Neumann, J. E.. (2005). Kurt Lewin at the Tavistock Institute. Educational Action Research

Plain numerical DOI: 10.1080/09650790500200271
DOI URL
directSciHub download

Dwight D. Eisenhower Farewell Address – ‘Military Industrial Complex’

Medhurst, M. J.. (1994). Reconceptualizing rhetorical history: Eisenhower’s farewell address. Quarterly Journal of Speech

Plain numerical DOI: 10.1080/00335639409384067
DOI URL
directSciHub download

Eisenhower, D. D.. (1961). Text of Eisenhower’s Farewell Address. New York Times
Reppy, J.. (2008). A biomedical military-industrial complex?. Technovation

Plain numerical DOI: 10.1016/j.technovation.2008.09.004
DOI URL
directSciHub download

Gholz, E.. (2011). Eisenhower versus the spin-off story: Did the rise of the military-industrial complex hurt or help America’s commercial aircraft industry?. In Enterprise and Society

Plain numerical DOI: 10.1093/es/khq134
DOI URL
directSciHub download

Hooks, G.. (2010). Military-Industrial Complex, Organization And History. In Encyclopedia of Violence, Peace, and Conflict

Plain numerical DOI: 10.1016/B978-012373985-8.00109-4
DOI URL
directSciHub download

Stevens, C. W., & Glatstein, E.. (1996). Beware the Medical-Industrial Complex.. Oncologist
Ritter, D. P., & McLauchlan, G.. (2010). Military-Industrial Complex, Contemporary Significance. In Encyclopedia of Violence, Peace, and Conflict

Plain numerical DOI: 10.1016/B978-012373985-8.00104-5
DOI URL
directSciHub download

Cass Sunstein – Cognitive infiltration

Cass Robert Sunstein FBA is an American legal scholar, particularly in the fields of constitutional law, administrative law, environmental law, and law and behavioral economics, who was the Administrator of the White House Office of Information and Regulatory Affairs in the Obama administration from 2009 to 2012. For 27 years, Sunstein taught at the University of Chicago Law School. Sunstein is the Robert Walmsley University Professor at Harvard Law School. More at Wikipedia

Sunstein suggests that the government should use conspiracies (i.e., cognitive infiltration, social interference via cognitive diversity) to stop debates about governmental conspiracies – an absurd idea which he articulated in several papers. Given his position as a presidential adviser, it is realistic to assume that his ideas have real-world impact. Sunstein is known for his “nudge theory” of behaviour modification (cf. linguistic thought control and subliminal indoctrination).

See also: The origins of the “conspiracy meme”:

The “conspiracy meme” as a linguistic tool for memetic hegemony

Origins of the “conspiracy meme”

Conspiracy theories: Causes and cures

References

Sunstein, C. R.. (2006). Irreversible and catastrophic. Cornell Law Review

Plain numerical DOI: 10.2139/ssrn.707128
DOI URL
directSciHub download

Sunstein, C. R.. (2000). Group dynamics. Law and Literature

Plain numerical DOI: 10.1080/1535685X.2000.11015605
DOI URL
directSciHub download

Jolls, C., Sunstein, C. R., & Thaler, R.. (1998). A Behavioral Approach to Law and Economics. Stanford Law Review

Plain numerical DOI: 10.2307/1229304
DOI URL
directSciHub download

Thaler, R., & Sunstein, C.. (2008). Nudge: Improving Decisions about Health, Wealth, and Happiness. Yale University Press

Plain numerical DOI: 10.1007/s10602-008-9056-2
DOI URL
directSciHub download

Sunstein, C. R.. (2005). Moral heuristics. Behavioral and Brain Sciences

Plain numerical DOI: 10.1017/S0140525X05000099
DOI URL
directSciHub download

Sunstein, C. R.. (1999). The Law of Group Polarization. SSRN

doi.org/10.1111/1467-9760.00148

Thaler, R. H., & Sunstein, C. R.. (2003). Libertarian paternalism. In American Economic Review

Plain numerical DOI: 10.1257/000282803321947001
DOI URL
directSciHub download

Sunstein, C. R.. (2014). Nudging: A Very Short Guide. Journal of Consumer Policy

Plain numerical DOI: 10.1007/s10603-014-9273-1
DOI URL
directSciHub download

Sunstein, C. R.. (2001). Cass R. Sunstein. Virginia Law Review

Plain numerical DOI: 10.2139/ssrn.2733142
DOI URL
directSciHub download

Sunstein, C. R.. (2006). Infotopia: How Many Minds Produce Knowledge. First Monday

Plain numerical DOI: 10.1017/S1537592708080821
DOI URL
directSciHub download

Sunstein, C. R.. (1996). Social Norms and Social Roles. Columbia Law Review

Plain numerical DOI: 10.2307/1123430
DOI URL
directSciHub download

Selinger, E., & Whyte, K.. (2011). Is There a Right Way to Nudge? The Practice and Ethics of Choice Architecture. Sociology Compass

Plain numerical DOI: 10.1111/j.1751-9020.2011.00413.x
DOI URL
directSciHub download

Sunstein, C. R., & Thaler, R. H.. (2003). Libertarian Paternalism is Not an Oxymoron. SSRN

doi.org/10.2139/ssrn.405940

Sugden, R.. (2009). On nudging: A review of Nudge: Improving decisions about health, wealth and happiness by Richard H. Thaler and Cass R. Sunstein. International Journal of the Economics of Business

Plain numerical DOI: 10.1080/13571510903227064
DOI URL
directSciHub download

Sunstein, C. R.. (2013). The Storrs Lectures: Behavioral economics and paternalism. Yale Law Journal

Plain numerical DOI: 10.2139/ssrn.2182619
DOI URL
directSciHub download

Dominici, F., Greenstone, M., & Sunstein, C. R.. (2014). Particulate matter matters. Science

Plain numerical DOI: 10.1126/science.1247348
DOI URL
directSciHub download

Sunstein, C. R.. (2005). Laws of fear: Beyond the precautionary principle. Laws of Fear: Beyond the Precautionary Principle

Plain numerical DOI: 10.1017/CBO9780511790850
DOI URL
directSciHub download

Sunstein, C. R.. (2014). Why nudge?: The politics of libertarian paternalism (the Storrs Lectures series). The Politics of Libertarian Paternalism

Plain numerical DOI: 10.1007/s12115-015-9975-2
DOI URL
directSciHub download

Sunstein, C. R., & Vermeule, A.. (2009). Symposium on conspiracy theories: Conspiracy theories: Causes and cures. In Journal of Political Philosophy

Plain numerical DOI: 10.1111/j.1467-9760.2008.00325.x
DOI URL
directSciHub download

Sunstein, C. R.. (2003). Terrorism and Probability Neglect. Journal of Risk and Uncertainty

Plain numerical DOI: 10.1023/A:1024111006336
DOI URL
directSciHub download

Sunstein, C. R.. (2000). Deliberative Trouble? Why Groups Go to Extremes. Yale Law Journal

Plain numerical DOI: 10.2307/797587
DOI URL
directSciHub download

Sunstein, C. R.. (2014). The Ethics of Nudging. SSRN

doi.org/10.2139/ssrn.2526341