
Cognitive biases (availability heuristic, framing, base-rate neglect, false positive paradox)

“Availability means that events that are easily remembered or imagined are more accessible or “available” to people, so that their frequencies are overestimated (Tversky and Kahneman, 1973). If, for example, a particular risk has recently or often been reported in the popular press, people may well overestimate its frequency. A science writer commented that people pay more attention to dramatic, new, or unknown risks, or to risks conveyed within the context of a personal story.

Most people will give proportionally more weight to a dramatic risk, such as dying in an airplane crash, than to the risk of dying from lung cancer due to smoking, even though the latter is more likely. Drama, symbolism, and identifiable victims, particularly children or celebrities, the science writer said, also make a risk more memorable.

Framing, the way in which information is presented or the context into which it is placed, affects how risk communication messages are received. Studies show that different framings of the same options can induce people to change their preferences among those options (Tversky and Kahneman, 1973; Lichtenstein and Slovic, 1971). This is known as a preference reversal. For example, the data on lung cancer treatment suggest that surgical treatment has a higher initial mortality rate but radiation has a higher five-year mortality rate. In one illustration, 10 percent of surgery patients die during treatment, 32 percent will have died one year after surgery, and 66 percent will have died by five years. For radiation, 23 percent die by one year and 78 percent by five years. When people are given these mortality statistics, they tend to be evenly split between preferring radiation and preferring surgery. When the same statistics are given as life expectancies (6.1 years for surgery and 4.7 years for radiation), there is an overwhelming preference for surgery (McNeil et al., 1982).

How information is framed can also affect whether people allow an omission bias to become the prime motivator of a decision not to vaccinate. One study of university students found that when the issue of responsibility was removed, subjects were more likely to opt for vaccination. Responsibility was removed by reframing the question as “if you were the child, what decision would you like to see made?” (Baron, 1992).

Other research shows that people tend to have a preference for eliminating risk and for maintaining the status quo (Thaler, 1980; Samuelson and Zeckhauser, 1988). Consequently, people often have an aversion to increasing the probability of one type of risk to reduce that of another, even by the same amount. They may even prefer a riskier situation over a less risky situation if the former maintains the status quo (Fischhoff et al., 1981).”

Source: www.ncbi.nlm.nih.gov/books/NBK233844/

The base rate fallacy, also called base rate neglect[1] or base rate bias, is a type of fallacy in which people tend to ignore the base rate (i.e., general prevalence) in favor of the individuating information (i.e., information pertaining only to a specific case). Base rate neglect is a specific form of the more general extension neglect.
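
To make the role of the base rate concrete, here is a small worked example (not from the quoted sources; the numbers are hypothetical) showing how Bayes' theorem combines the base rate with the individuating information from a test:

```python
# Hypothetical numbers: 1% of people have a condition; a test detects
# 90% of true cases but also flags 9% of people without the condition.
base_rate = 0.01            # P(condition) -- the base rate
sensitivity = 0.90          # P(positive | condition)
false_positive_rate = 0.09  # P(positive | no condition)

# Bayes' theorem:
# P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {posterior:.1%}")  # about 9.2%
# Despite the seemingly accurate test, a positive result is still
# probably a false alarm, because the 1% base rate dominates.
# Judging only from the test's accuracy is base rate neglect.
```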


False positive paradox. This paradox describes situations where there are more false positive test results than true positives. For example, if a facial recognition camera can identify wanted criminals 99% accurately but analyzes 10,000 people a day, the high accuracy is outweighed by the number of tests, and the program’s list of criminals will likely contain far more false positives than true positives. The probability of a positive test result is determined not only by the accuracy of the test but also by the characteristics of the sampled population.[3] When the prevalence, the proportion of those who have a given condition, is lower than the test’s false positive rate, even tests that have a very low risk of giving a false positive in an individual case will give more false than true positives overall.[4] The paradox surprises most people.[5]
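
Working through the camera example makes the imbalance visible. The sketch below assumes that 10 of the 10,000 people scanned each day are actually wanted (the article gives no such figure; it is chosen purely for illustration) and reads “99% accurate” as both a 99% detection rate and a 1% false alarm rate:

```python
scanned_per_day = 10_000
actual_criminals = 10                  # assumed for illustration; not in the source
detection_rate = 0.99                  # P(flagged | criminal)
false_alarm_rate = 1 - detection_rate  # P(flagged | not a criminal), here 1%

true_positives = actual_criminals * detection_rate
false_positives = (scanned_per_day - actual_criminals) * false_alarm_rate

print(f"Expected true positives per day:  {true_positives:.1f}")   # ~9.9
print(f"Expected false positives per day: {false_positives:.1f}")  # ~99.9
# Roughly ten innocent people are flagged for every real match, even
# though the camera is "99% accurate" on any single face.
```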

It is especially counter-intuitive when interpreting a positive result in a test on a low-prevalence population after having dealt with positive results drawn from a high-prevalence population.[4] If the false positive rate of the test is higher than the proportion of the new population with the condition, then a test administrator whose experience comes from testing a high-prevalence population may conclude that a positive test result usually indicates a positive subject, when in fact a false positive is far more likely to have occurred.
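
The same arithmetic shows why intuition trained on one population fails in another. This sketch compares the positive predictive value, the probability that a positive result is genuine, for one hypothetical test (95% sensitivity, 5% false positive rate) across two prevalences:

```python
def positive_predictive_value(prevalence, sensitivity=0.95,
                              false_positive_rate=0.05):
    """P(condition | positive result), via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = false_positive_rate * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

print(f"High-prevalence population (50%): {positive_predictive_value(0.50):.1%}")  # 95.0%
print(f"Low-prevalence population (1%):   {positive_predictive_value(0.01):.1%}")  # ~16.1%
# A tester whose experience comes from the first population learns that
# positives are nearly always genuine; once prevalence falls below the
# false positive rate, most positives are false alarms.
```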

There is considerable debate in psychology on the conditions under which people do or do not appreciate base rate information.[15][16] Researchers in the heuristics-and-biases program have stressed empirical findings showing that people tend to ignore base rates and make inferences that violate certain norms of probabilistic reasoning, such as Bayes’ theorem. The conclusion drawn from this line of research was that human probabilistic thinking is fundamentally flawed and error-prone.[17] Other researchers have emphasized the link between cognitive processes and information formats, arguing that such conclusions are not generally warranted.[18][19]

Source: en.wikipedia.org/wiki/Base_rate_fallacy
