Showing posts with label Cognitive bias. Show all posts

Saturday, March 18, 2023

Barnum effect in psychology

 


The Barnum effect is a psychological phenomenon that occurs when people believe that vague and general descriptions of their personality, character, or life experiences are uniquely tailored to them, even though they are actually applicable to a wide range of people. It is also known as the Forer effect, named after psychologist Bertram Forer who first demonstrated it in the 1940s.

In a classic demonstration, Forer (1949) had his students complete a personality test and then gave each of them a supposedly individualized personality description that was actually a collection of generic statements applicable to almost anyone. The students rated the description as highly accurate for their own personality, even though it was not unique to them.

The Barnum effect has been demonstrated in a variety of contexts, including astrology, horoscopes, and psychic readings. It is thought to be related to cognitive biases such as confirmation bias, where people seek out information that confirms their existing beliefs or expectations.

Overall, the Barnum effect highlights the tendency for people to find meaning in vague and general statements, and the importance of critical thinking and skepticism when evaluating information about oneself.

References

Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44(1), 118-123. doi: 10.1037/h0059240

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220. doi: 10.1037/1089-2680.2.2.175



Geoffrey W. Sutton, PhD is Emeritus Professor of Psychology. He retired from a clinical practice and was credentialed in clinical neuropsychology and psychopharmacology. His website is  www.suttong.com

 

See Geoffrey Sutton’s books on   AMAZON       or  GOOGLE STORE

Follow on    FACEBOOK   Geoff W. Sutton    

   TWITTER  @Geoff.W.Sutton    

You can read many published articles at no charge:

  Academia   Geoff W Sutton     ResearchGate   Geoffrey W Sutton 

 

Dr. Sutton’s posts are for educational purposes only. See a licensed mental health provider for diagnoses, treatment, and consultation.

 

Friday, September 2, 2022

forensic confirmation bias

 In studies of forensic evidence, biasing contextual information can lead analysts to ignore relevant evidence and draw erroneous conclusions.


Reference example

Kassin, S. M., Dror, I. E., & Kukucka, J. (2013). The forensic confirmation bias: Problems, perspectives, and proposed solutions. Journal of Applied Research in Memory and Cognition, 2(1), 42–52.

bias blind spot

 The common finding that people have difficulty recognizing their own biases.

present bias in psychology

 The present bias is a tendency to heavily discount possible future outcomes when making a decision. That is, present bias is evident when people focus on the present and the recent past when making decisions.

illusion of agreement in psychology

 The illusion of agreement is the finding that people tend to assume others agree with their conclusions when they are unable to think of plausible alternatives.

affect heuristic

 The affect heuristic is a reliance on feelings in decision-making.

An example of the affect heuristic is a group's support for hiring a leader based on a variety of good feelings or a "gut reaction" rather than a systematic analysis of factors predictive of successful leadership.

A summary of the affect heuristic can be found in the work of Paul Slovic and his colleagues (2002).


Reference

Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2002). The affect heuristic. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397–420). Cambridge University Press. https://doi.org/10.1017/CBO9780511808098.025


conclusion bias in psychology

 Conclusion bias occurs when a particular outcome is favored and the careful gathering and analysis of evidence are set aside in favor of readily available supportive evidence.


planning fallacy in psychology

 The planning fallacy is a psychological bias based on findings that estimates of project completion time are usually lower than the actual time it takes to complete a project.


Thursday, September 1, 2022

gambler's fallacy


The gambler's fallacy is a type of cognitive bias reflected in an underestimate of the likelihood that streaks happen by chance. A gambler may believe, for example, that tails is "due" after a run of heads.
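People underestimate how often streaks arise purely by chance. A minimal simulation (an illustrative sketch, not part of the original post) shows that a run of five or more identical outcomes is the norm, not the exception, in 100 fair coin flips:

```python
import random

random.seed(1)

def longest_streak(flips):
    """Return the length of the longest run of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# Simulate many sessions of 100 fair coin flips each and count how many
# sessions contain at least one streak of 5 or more heads or tails.
sessions = 10_000
with_long_streak = sum(
    longest_streak([random.randint(0, 1) for _ in range(100)]) >= 5
    for _ in range(sessions)
)
print(f"Sessions with a streak of 5+: {with_long_streak / sessions:.0%}")
```

Most sessions contain such a streak, so a long run of heads is weak evidence that tails is "due."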





Wednesday, June 15, 2022

Illusion of personal objectivity

 The illusion of personal objectivity is a perception that one's ideas and beliefs are objective and reasonable. The person is convinced that when they explain the facts, others will agree with them and those who disagree are being unreasonable or irrational.


Physicians and psychologists can strongly disagree on the nature of a patient's condition based on their perspective on the same symptoms or the research concerning a drug or vaccine.

Parents can argue about the perceived causes of a child's misbehavior.

Researchers have documented a link between the objectivity illusion and the false consensus effect. The false consensus effect is the belief, reinforced by supportive friends, that one's own views are correct and widely shared.

People generally fail to seriously consider opinions that are different from their own. This failure is found among highly educated professionals.

Problems of quick thinking are described in the work of Daniel Kahneman (see Thinking, Fast and Slow).




Reference

Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

Contingency symmetry bias

 The contingency symmetry bias is a human tendency to quickly assume bidirectional relationships exist between pairs of stimuli and expressions. 

Researchers have referred to the bias as a basis for early word learning. Children learn a symmetrical relationship between a word and its referent, such as the word "apple" and a picture of an apple or the spoken sound of the word "apple." See, for example, Imai et al. (2021).

The contingency symmetry bias leads to false conclusions when people believe that the appearance of an event has a particular cause because in previous experience, a cause can lead to the observed event. In logic, it is the fallacy of affirming the consequent.

Example

Rain causes a driveway to be wet, but a wet driveway does not always mean that it rained.
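In propositional logic, the fallacy can be checked mechanically. The short Python sketch below (illustrative only, not from the original post) enumerates truth values to show that "if P then Q" together with Q does not guarantee P:

```python
from itertools import product

# Affirming the consequent: from "P implies Q" and "Q", conclude "P".
# P = "it rained", Q = "the driveway is wet".
# A counterexample is a truth assignment where both premises hold
# but the conclusion P is false.
counterexamples = [
    (p, q)
    for p, q in product([False, True], repeat=2)
    if ((not p) or q)   # premise 1: P implies Q
    and q               # premise 2: Q is true
    and not p           # conclusion P fails
]
print(counterexamples)  # [(False, True)] -> driveway wet, but no rain
```

The single counterexample (P false, Q true) is exactly the sprinkler case: the driveway is wet even though it did not rain.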



Reference

Imai, M., Murai, C., Miyazaki, M., Okada, H., & Tomonaga, M. (2021). The contingency symmetry bias (affirming the consequent fallacy) as a prerequisite for word learning: A comparative study of pre-linguistic human infants and chimpanzees. Cognition, 214, 104755. https://doi.org/10.1016/j.cognition.2021.104755

Monday, April 18, 2022

First instinct fallacy


The first instinct fallacy is the belief that one's first answer or impression is best and should not be changed.

Several studies indicated that students would do better on tests if they did not rely on their first instinct (Kruger et al., 2005). However, Couchman and his team (2016) found that confidence is a factor.

Couchman et al. (2016) suggest that the first answer on a test serves as an anchor, which creates a cognitive bias toward keeping the original answer. By recording a confidence rating along with each answer, students have another piece of evidence to consider when reviewing their work.

_____

Justin Kruger and his colleagues found support for the first instinct fallacy in an oft-quoted article from 2005. Following is their summary of results.

[Most people believe that they should avoid changing their answer when taking multiple-choice tests. Virtually all research on this topic, however, has suggested that this strategy is ill-founded: Most answer changes are from incorrect to correct, and people who change their answers usually improve their test scores. Why do people believe in this strategy if the data so strongly refute it? The authors argue that the belief is in part a product of counterfactual thinking. Changing an answer when one should have stuck with one’s original answer leads to more “if only . . .” self-recriminations than does sticking with one’s first instinct when one should have switched. As a consequence, instances of the former are more memorable than instances of the latter. This differential availability provides individuals with compelling (albeit illusory) personal evidence for the wisdom of always following their 1st instinct, with suboptimal test scores the result.]

_____

Justin Couchman and his team (2016) found that changing an answer or sticking with a first impression can both lead to correct answers. The outcome varies with the level of confidence students have in their response.

In the study, students provided their level of confidence and whether or not they changed their answers.

Here's a quote from their article (p. 180).

[Should you ever revise your original choice, and if so, when? In both studies, there was a clear and consistent trend: On items that caused the most uncertainty, initial instincts were correct  less than half the time, general beliefs and post-exam assessments did not accurately reflect performance, but real-time metacognitive ratings did. Our data support the idea that – after a choice has been made and time has passed – people should consider revising answers when their recorded confidence in their initial answer was low, but not when confidence in their initial answer was high.]

The recommendation based on Couchman et al. (2016) would be to consider changing the low confidence answers.

_____

References


Couchman, J. J., Miller, N. E., Zmuda, S. J., Feather, K., & Schwartzmeyer, T. (2016). The instinct fallacy: The metacognition of answering and revising during college exams. Metacognition and Learning, 11(2), 171–185. https://doi.org/10.1007/s11409-015-9140-8

Kruger, J., Wirtz, D., & Miller, D. T. (2005). Counterfactual thinking and the first instinct fallacy. Journal of Personality and Social Psychology, 88(5), 725–735. https://doi.org/10.1037/0022-3514.88.5.725










Wednesday, February 9, 2022

Illusory Truth Effect

 



The illusory truth effect is the persistent finding that simply repeating a statement can increase a person's belief that the statement is true.

The effect occurs in young children and adults (Fazio et al., 2020).

As few as two exposures to a statement can create the illusion of truth, and additional repetitions further increase the perception of truth.

This illusory truth effect helps account for the sharing of fake news (see Vellani et al., 2023).

It seems fake stories must be extreme before people recognize them as false.

As a cognitive phenomenon, the illusory truth effect is primarily associated with C (Cognition) in the SCOPES model of functioning.

*****

Politicians and other leaders can take advantage of the illusory truth effect to shape public opinion.

Anyone may use the illusory truth effect when they repeat exaggerated stories or poorly supported opinions.

Businesses can use the illusory truth effect to sell their product by using repeated marketing phrases.

Psychology of Religion hypothesis: Clergy and religious leaders encourage congregants to repeat statements of faith. Religious songs (hymns, choruses) may also contain statements of faith that even conservative scholars consider inaccurate. Congregations (such as Christian denominations) differ in which beliefs they hold to be true, and those beliefs separate them from similar groups. This diversity of belief may be partly explained by the repetition of the statements that define each group's view of the truth.

*****

RESEARCH NOTES

Lisa Fazio and her colleagues (2020) studied the effect of repetitions in children (ages 5 and 10) and adults and found the illusory truth effect in all age groups.

Fazio, L. K., & Sherry, C. L. (2020). The effect of repetition on truth judgments across development. Psychological Science, 31(9), 1150–1160. https://doi.org/10.1177/0956797620939534

*****

Hassan and Barber (2021) studied the truth effect in two experiments that tested more repetitions of statements than previous researchers had. In one experiment, participants saw up to 9 repetitions of statements; in another, they saw up to 27 repetitions. The illusory truth effect was supported in both studies: more repetitions led to a greater perception of truth for repeated statements compared with new ones. The analysis also clarified the effect. The largest increase in perceived truthfulness occurred at the second exposure; after that, the size of the effect decreased with each additional repetition.

Hassan, A., & Barber, S. J. (2021). The effects of repetition frequency on the illusory truth effect. Cognitive Research: Principles and Implications, 6, 38. https://doi.org/10.1186/s41235-021-00301-5

*****

Valentina Vellani and others (2023) found that people were more likely to share repeated misinformation on social media than they were to share misinformation that was only presented once. The sharing was related to perceived accuracy of a statement linked to the bias produced by repetition.

Vellani, V., Zheng, S., Ercelik, D., & Sharot, T. (2023). The illusory truth effect leads to the spread of misinformation. Cognition, 236, 105421. https://doi.org/10.1016/j.cognition.2023.105421

*****

For a resource summarizing the illusory truth effect and other cognitive phenomena, see Daniel Schacter's book, The Seven Sins of Memory.

  




Thursday, February 3, 2022

Dunning-Kruger effect



The Dunning-Kruger effect is a cognitive bias evident when people markedly overestimate their abilities or competencies compared to those of their peers or scores on relevant assessments.

Low performers exhibit poor judgment of their own knowledge, skills, and competencies as well as those of other people.

In other studies, researchers found that highly competent people underestimated their knowledge, abilities, and competencies.

But there have been criticisms.

As with most findings in science, there have been challenges to the conclusions drawn by Dunning and Kruger. For example, in 2022 Magnus and Peresetsky noted that metacognitive psychological explanations may not be needed. They refer to the statistical phenomenon of regression to the mean (described by Francis Galton in 1886). In addition, the authors point out the role of boundary values: if you score 97% on a test, you are close to the 100% ceiling, so any estimate of future performance is likely to be a bit lower. Similarly, those who score near zero may be expected to show at least some improvement.
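The regression-to-the-mean argument can be illustrated with a small simulation (a sketch under assumed normal ability and independent measurement noise, not Magnus and Peresetsky's actual model). Self-estimates here are equally noisy for everyone, yet grouping people by their observed test score still produces a Dunning-Kruger-like pattern:

```python
import random
import statistics

random.seed(0)

def clamp(x):
    """Keep a score inside the 0-100 boundary."""
    return max(0.0, min(100.0, x))

# Each person has a latent skill; both the test score and the
# self-estimate are independent noisy readings of that same skill.
n = 20_000
people = []
for _ in range(n):
    skill = random.gauss(50, 15)
    score = clamp(skill + random.gauss(0, 10))
    estimate = clamp(skill + random.gauss(0, 10))
    people.append((score, estimate))

people.sort()                 # sort by observed test score
quartile = n // 4
bottom = people[:quartile]    # lowest observed scores
top = people[-quartile:]      # highest observed scores

gap_bottom = statistics.mean(e - s for s, e in bottom)
gap_top = statistics.mean(e - s for s, e in top)
print(f"Bottom quartile: estimates exceed scores by {gap_bottom:.1f} points")
print(f"Top quartile: estimates fall below scores by {-gap_top:.1f} points")
```

Because the bottom quartile is selected partly for unlucky score noise, their (equally noisy) self-estimates regress upward toward their true skill, and the reverse holds at the top. This reproduces the apparent overconfidence of low scorers without any metacognitive deficit.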

Magnus and Peresetsky (2022) conducted a study. Following is their conclusion.

In this article, we have attempted to provide an explanation of the DK effect which does not require any psychological explanation. By specifying a simple statistical model which explicitly takes the (random) boundary constraints into account, we achieve a near-perfect fit, thus demonstrating that the DK effect is a statistical artifact. In other words: there is an effect, but it does not reflect human nature.

In their final paragraph, Magnus and Peresetsky (2022) offer a suggestion about the persistence of belief in psychological explanations.

Perhaps the explanation for the persistence of this belief is: We have two facts, both true. First, we actually observe the DK effect. Second, if we compare people's ideas about their own ability with objective measurements of this ability, we find that people tend to overestimate themselves. Then, what is more natural than to think that these two statements are related to each other, in fact, that one causes the other? The problem is, they aren't.


Reference

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121

Magnus, J. R., & Peresetsky, A. A. (2022). A statistical explanation of the Dunning-Kruger effect. Frontiers in Psychology, 13, 840180. https://doi.org/10.3389/fpsyg.2022.840180




Wednesday, January 26, 2022

stereotypical bias


 

Stereotypical bias is the tendency to perceive people and objects in generalized ways based on beliefs about the groups or categories to which they appear to belong.

A reliance on stereotypes can lead to prejudicial behavior toward specific people who appear to be in the same class or group as the stereotype.

Stereotype bias can lead to false identification.

Racial stereotypes have been particularly damaging, leading to judgments of guilt by association with the stereotype rather than guilt based on observed behavior.

Stereotypical bias affects what people remember. Recalled information can fit a congruity bias. That is, people may recall features based on the stereotypes they hold. A witness may describe a person based on the stereotype of people from a particular race or ethnic group.



 


egocentric bias



 Egocentric bias is a tendency to recall information about oneself in more detail than other information. In general, the recall tends to treat oneself in a positive light but for some, the recall can reflect unwarranted negative views of the self.

In couples conflicts, each person tends to recall more details from their personal point of view, which can increase the conflict about events.

The tendency to recall positive self-characteristics yields "positive illusions" and can lead to inflated self-worth. Researchers find people select positive personality traits to describe themselves more than would be expected based on average findings. Along similar lines, negative characteristics and failures are attributed to others or forces outside oneself.

Divorce research indicates a tendency toward increased self-enhancing memory biases.

Self-enhancing recall of past difficulties can exaggerate those difficulties. A favorable view of one's present self may lead to exaggerating how bad one was in the past.




hindsight bias

 Hindsight bias is the tendency to claim knowledge of how a past event would turn out or to consider the outcome inevitable. People tend to reconstruct the past to make it consistent with what is now known.

Hindsight bias is a common and powerful bias that people find difficult to set aside. This can create problems when juries are instructed to disregard information and when experts are asked to offer a second opinion while knowing the original opinion.

Hindsight bias leads to recalling information that confirms memories and can lead to false memories.

Hindsight bias can interfere with learning from experience.


implicit theory of stability

 The implicit theory of stability is a tendency of people to assume their views have not changed over time.

Wednesday, October 6, 2021

Backfire Effect



The backfire effect is the strengthening of a mistaken belief when presented with contrary evidence.

It appears that when people are heavily invested in their original views, contrary evidence is perceived as an attack on themselves and the decision they made. New supportive evidence is accepted while contrary evidence is rejected.

The backfire effect has been seen in efforts to counter racism and sexism.

Nyhan (2021) notes that the backfire effect does not explain the durability of political misperceptions and suggests other ways to weaken misperceptions about political and scientific information.

Read more about the backfire effect in research by Nyhan and Reifler (2010) and Nyhan (2021). 

Nyhan, B. (2021). Why the backfire effect does not explain the durability of political misperceptions. PNAS, 118(15), e1912440117. https://doi.org/10.1073/pnas.1912440117

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303–330. https://doi.org/10.1007/s11109-010-9112-2

Key concepts

misperception, backfire effect, misinformation, fake news, fact checking






Friday, October 1, 2021

Divine Attribution Bias Positive, Negative




 Divine Attribution Biases may be positive or negative. 

A Positive Divine Attribution Bias (PDAB) exists when acts considered good are routinely attributed to God or divine intervention while more plausible causes are minimised or ignored.

A Negative Divine Attribution Bias (NDAB) exists when a person attributes unpleasant or harmful events to divine intervention, as if God or a divine being were punishing people for their sinful or unacceptable behaviour, while minimising or ignoring more plausible explanations.

Examples 

Positive bias: A parking space opens up as one is searching. A positive Divine Attribution Bias is evident when the driver asserts that God, in a role of personal assistant, provided this parking convenience.

Negative bias: A tornado destroys a large part of a city. A negative Divine Attribution Bias exists when someone insists that the event was God’s punishment on the people for sins in the community.
