
How to Mitigate the Effects of Cognitive Biases During Patient Safety Incident Investigations

      Sentinel and adverse events can result in harm or even death. They also impose significant costs, including lost income, decreased productivity, additional health care expenses, and disability, with an estimated total of nearly $1 trillion annually [1]. Yet investigations into these events are often flawed [2]: they may pursue a single root cause, adhere to unreasonably strict time lines, or exclude independent parties, compromising the integrity of both the process and the findings. The people who conduct these investigations are also fallible. Some researchers go so far as to state that “it is simply not possible to begin an investigation with a completely open mind” [3, p. 1298]. The underlying rationale is that investigations typically reflect the competencies and experiences of the investigators, who are inclined to focus on their own respective areas of expertise [4].

      Biases and Heuristics

      Human cognition and decision-making are influenced by biases and heuristics. This view, put forth by Tversky and Kahneman [5], proposes that people use heuristics and biases to arrive at decisions. Heuristics are general shortcut, or rule-of-thumb, strategies that usually, but not always, lead people to the optimal decision [6]. Biases similarly involve shortcut judgments, but they typically produce systematic mistakes that derive from limitations in information processing [7]. Working from the understanding that heuristics and biases contribute to sentinel events (for example, retained foreign objects, wrong-site surgeries) [8], researchers have examined how cognitive biases may influence health care practitioners’ decision-making [9,10]. Recognizing that bias is influential during the provision of patient care, some researchers have offered general guidance for mitigating it [8,11,12]. Other work has recognized and investigated the effects of cognitive biases on investigators examining adverse medical events [13,14]. However, researchers have yet to put forth strategies for mitigating bias by individuals during investigations. Understanding the breakdowns that occur during these investigatory events is necessary, but prescriptive solutions are what make that understanding actionable. Consequently, the goal of this article is to discuss the biases most relevant to accident investigations in health care, their potential impact on investigators’ decision-making, and mitigations to combat their potential negative effects (see Table 1).
      Table 1. Summary of Discussed Biases and Accompanying Mitigations

      Hindsight Bias
      Definition: Tendency to perceive an outcome as more predictable than it actually was (“I knew it all along”).
      Example: A patient experiences a postoperative complication after a complex surgery. Knowing a complication occurred, investigators question the surgical team’s decision to proceed with surgery and overestimate the risk of a complication.
      Mitigations:
      • Employ a bottom-up approach by avoiding focus on the outcome.
      • Have analysts consider decisional alternatives within the appropriate context.

      Availability Heuristic
      Definition: Tendency to base decisions on easily accessible information.
      Example: The investigation team is composed primarily of clinicians with backgrounds in nursing, so its focus may be biased toward workflow issues common to nursing staff, missing a potential cause less relevant to the team’s training (for example, drug-drug interactions).
      Mitigations:
      • Develop a thorough case description that includes information beyond what is most recent and salient.
      • Use a multidisciplinary investigation team to avoid a faulty focus.

      Fundamental Attribution Error
      Definition: Tendency to attribute an outcome to an individual’s traits rather than their circumstances.
      Example: A patient experienced a severe allergic reaction because the provider did not realize the patient was allergic to the medication. The investigation team perceived the provider as careless and incompetent rather than considering contributing factors such as poor electronic health record (EHR) usability and high caseload.
      Mitigation:
      • Employ systems thinking to shift the focus from the individual to the context of their tasks, technologies, and environment.

      Confirmation Bias
      Definition: Tendency to select information that supports one’s beliefs and ignore information that does not.
      Example: The investigation team’s initial interviews led it to attribute poor equipment usability as the primary cause of a surgical adverse event. The team’s early support for this causal factor led it to discount additional evidence establishing poor teamwork as another causal factor.
      Mitigations:
      • Use a large, multidisciplinary team of stakeholders with varying perspectives.
      • Encourage stakeholders to speak up and share perspectives.

      Groupthink
      Definition: Tendency for groups to prioritize concurrence and suppress individual perspectives.
      Example: In a group interview led by the investigation team, the certified registered nurse anesthetist (CRNA) deferred to the social influence of the anesthesiologist and surgeon despite holding an opposing view.
      Mitigations:
      • Divide the group to eliminate unfounded consensus.
      • Conduct 1:1 interviews.
      • Ask authority figures to periodically leave.

      Hindsight and Outcome Biases

      Hindsight bias and outcome bias are two cognitive phenomena related to the tendency to perceive the results of prior decisions as more predictable than they actually were [15,16]. Hindsight bias is also known as the “knew it all along” tendency, and outcome bias as the “ends justify the means” rationale. Although both biases distort a past result’s predictability, they differ in their cognitive action: hindsight bias typically distorts the memory of an event in relation to an actor’s decisions [17], whereas outcome bias weighs past decisions by their outcomes [16]. In the investigation of a preventable adverse event, hindsight bias may build on an individual’s or group’s preconceived notion of event responsibility (see “Confirmation Bias”) by attributing greater certainty to the results of an actor’s actions (for example, “they should have known”). The common phrase “the ends justify the means” often signals the presence of outcome bias, and rationalizing decisions in light of their outcomes is problematic in clinical care: this form of outcome bias can strengthen and legitimize poor decisions and false beliefs by attributing positive outcomes not to good luck but to sound practice [18]. The converse action of outcome bias, in which a negative outcome degrades a reasonable clinical decision, may also affect the investigation of safety incidents.
      Two methods for mitigating the effects of hindsight and outcome bias are taking a bottom-up approach and advising investigators to consider decisional alternatives within the context of the event [19,20]. To employ a bottom-up approach, the investigation team should avoid focusing on the known result of the sentinel or adverse event and instead evaluate the causal events and decisions on their own merits [19]. If the result is already known, the team should exercise due diligence by evaluating each decision in terms of what other options may have been apparent at the time and their associated clinical implications [20]. This approach, termed the consider-the-opposite strategy, was demonstrated by Arkes et al. in a study involving assessments of medical diagnoses [17]. Hindsight bias was reduced when participants were asked to provide a rationale for why each of the diagnostic options might have been true; considering the merits of each option afforded a broader perspective that was less focused on the result of the event.
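      For teams that document investigations electronically, the consider-the-opposite strategy can be operationalized as a structured prompt that elicits a rationale for every alternative before the known outcome is weighed. The sketch below is a minimal illustration of this idea, not part of Arkes et al.’s instrument; the candidate causes, prompt wording, and function name are our own hypothetical choices.

```python
# Minimal consider-the-opposite worksheet (illustrative only; the candidate
# causes and prompt wording are hypothetical, not from a published instrument).

CANDIDATE_CAUSES = [
    "medication interaction",
    "equipment malfunction",
    "handoff communication breakdown",
    "patient comorbidity",
]

def consider_the_opposite(causes):
    """Collect a written rationale for why EACH candidate cause might have
    been true, before the known outcome is discussed or documented."""
    rationales = {}
    for cause in causes:
        rationales[cause] = input(
            f"Explain how '{cause}' could plausibly have produced this event: "
        )
    return rationales

if __name__ == "__main__":
    notes = consider_the_opposite(CANDIDATE_CAUSES)
    print("\nDocumented alternatives:")
    for cause, rationale in notes.items():
        print(f"- {cause}: {rationale}")
```

      The essential design choice is ordering: rationales for the alternatives are recorded before the outcome enters the discussion, mirroring the manipulation that reduced hindsight bias in the original study.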

      Availability Heuristic

      The availability heuristic is the tendency to base decisions on information that is easily accessible, whether correct or not, rather than on information supported by valid investigation [21]. Some information may be more “available” to an investigation team because of its recency, its saliency, or its use as an anchor point [15,21,22]. Although the availability heuristic may lead to a quick, potentially correct determination, it creates a risk that important information will be missed or ignored.
      Related to the availability heuristic are anchoring bias, recency bias, and saliency bias. Anchoring is the tendency to fixate on specific information such that other information is ignored, discounted, or evaluated in light of the anchored information [21]. Individuals may anchor to the first piece of information they receive; relatedly, in recency bias, the causal factor of a recent event may be favored as the causal factor for the current event simply because of its temporality [22]. In examinations of sentinel or adverse events, this may present as a perceived “problem” actor or department within the hospital. Individuals may also anchor to information based on its salience. In saliency bias, the information that is most prominent or striking takes precedence over other information. Whereas anchoring bias and recency bias have strong ties to how an investigation is framed, saliency bias relates to the investigators themselves, as saliency is largely determined by the investigators’ backgrounds, beliefs, and education [23,24]. If an investigation team is composed of clinical representatives from backgrounds similar to those involved in the adverse event, saliency bias potentially becomes more prominent.
      To mitigate anchoring bias and recency bias, producing a case description may be helpful: in evaluating and explaining all contributing factors and actors in an event, teams must appraise more information than whatever is most immediate or most recent. Abdi and Ravaghi suggest gathering and mapping information so that investigators can better visualize the event time line [25]. Although saliency bias may at times enable an investigation team to quickly detect deviations in clinical practice on which the members have been directly trained, important contributing factors may be missed if they fall outside the scope of that common training and experience. In addition, using a multidisciplinary investigation team may help avoid a faulty focus. A team with expertise in quality as well as in clinical processes, tasks, and equipment is beneficial; ideally, the team should have the breadth of experience and expertise to ensure that the investigation and its questions cover the full range of applicable topics.
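      Abdi and Ravaghi’s suggestion to gather and map information can be supported by even minimal tooling that sorts all gathered records chronologically, so that early, low-saliency items are reviewed alongside the most recent ones. The sketch below is purely illustrative; the record sources, timestamps, and field layout are hypothetical.

```python
# Illustrative time line builder: sorting gathered records chronologically
# so that review is not dominated by the most recent or most salient items.
from datetime import datetime

# Hypothetical investigation notes: (source, timestamp, detail).
records = [
    ("pharmacy log", "2022-03-01 08:15", "order verified"),
    ("nursing note", "2022-03-01 09:40", "patient reported dizziness"),
    ("EHR audit", "2022-02-28 22:05", "allergy field left blank"),
]

def build_timeline(items):
    """Return records sorted by timestamp, oldest first."""
    return sorted(items, key=lambda r: datetime.strptime(r[1], "%Y-%m-%d %H:%M"))

for source, when, detail in build_timeline(records):
    print(f"{when}  {source}: {detail}")
```

      Sorting the full record set, rather than reviewing items in the order they were collected, surfaces the earlier EHR entry that anchoring on recent interviews might otherwise bury.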

      Fundamental Attribution Error

      The fundamental attribution error (FAE) is a bias in which the cause of a behavior is assigned to a trait of the acting individual rather than to circumstantial causes. The error is rooted in attribution theory, which concerns how individuals assign meaning and judgment to others’ behaviors [26,27]. The main tenet of this theory is that humans pay more attention to other humans and their behaviors than to environmental factors and context, which leads individuals to blame an outcome on a person’s behavior rather than on the person’s circumstances.
      The tendency to overestimate the influence of disposition on behavior is prevalent in medicine: when something goes wrong in a medical setting, it is easy to quickly blame an individual instead of working to understand the complexity of the circumstances that led to the adverse outcome. Attributing an error to a worker or provider without understanding the context of their tasks, technologies, and other factors leads directly down this path. A few techniques can combat FAE, including attending to base rates, perspective-taking, and considering hidden causes [28]. The first is to pay close attention to consensus information (that is, base rates): if an individual acts the same way in most situations and then differently in a novel situation, it is highly likely that they acted differently because of the circumstances, not their behavioral intention. FAE can also be diminished through perspective-taking, which can include thinking through other possible outcomes and why they did or did not occur [28], as well as empathizing with the actors involved in an incident [29]. Finally, considering hidden causes means examining the other factors that led to the outcome. This recommendation is closely tied to systems thinking; in the medical context, it includes looking across a variety of system factors, including tasks, equipment, teams, environment, and organizational policy [30]. Assessing the influence of these system features during adverse events can reveal potential hidden causes and provide grounding for perspective-taking and for base rates of incidents, as the worked example below illustrates.
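      The numbers in the following worked example are invented purely for illustration. Suppose investigators suspect a provider is dispositionally careless after a documentation slip, yet the same slip is common among providers working under the same EHR and caseload conditions.

```python
# Hypothetical worked example: using consensus information (base rates)
# to check a dispositional attribution. All probabilities are invented.

p_careless = 0.05          # prior: few providers are genuinely careless
p_slip_if_careless = 0.60  # chance of the slip given a careless provider
p_slip_if_ordinary = 0.30  # chance of the slip for an ordinary provider
                           # under the same EHR and workload conditions

# Bayes' rule: P(careless | slip observed)
p_slip = (p_slip_if_careless * p_careless
          + p_slip_if_ordinary * (1 - p_careless))
posterior = p_slip_if_careless * p_careless / p_slip

print(f"P(careless | slip) = {posterior:.2f}")  # prints 0.10
```

      Even after the slip is observed, the probability that the provider is dispositionally careless rises only from 5% to roughly 10%: the high base rate of the slip under these working conditions points toward the circumstances rather than the individual.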

      Confirmation Bias

      Perhaps the best-known type of cognitive bias is confirmation bias. To avoid cognitive dissonance (that is, holding inconsistent thoughts while making a decision), one may subconsciously select information that supports a preestablished belief and misinterpret or ignore information that does not [22,28]. Like hindsight bias, confirmation bias can present as an “I knew it was so” moment; however, the confirmation occurs upon discovering the information that finally clicks, not in reference to previous information, as it does in hindsight bias.
      Individuals investigating sentinel or adverse events may fall prey to confirmation bias if they gather or retain information selectively or interpret information in a biased manner [15,31]. Awareness of this bias is particularly pertinent for such evaluations, as the presence of hypotheses creates an additional in-route for confirmation bias to occur. Moreover, if a hypothesis is founded on a belief embedded within the investigation team or hospital culture, confirmation bias may have an even stronger and potentially more negative impact [21,32]. The sometimes ambiguous nature of sentinel and adverse event investigations creates another opportunity for confirmation bias. These events are seldom simple, and those requiring formal review are likely to be highly complex; for that reason, investigations often require a broad scope of focus that collects a great deal of seemingly inconsequential data. Such ambiguous data may be distorted through confirmation bias to emphasize whatever supports a hypothesis and dismiss whatever does not [21,32].
      One method for reducing the opportunity for confirmation bias is to use a large, multidisciplinary investigation team to examine the associated data. Many stakeholders increase the diversity of perspectives and reduce the probability of a preeminent hypothesis. It is also important to cultivate, within the investigation team, a culture in which members can speak up and voice differing opinions (that is, psychological safety); that way, if certain stakeholders seek biased support for their hypothesis, the group as a whole can offer a balancing effect and challenge that belief. However, relying on a group to mitigate confirmation bias may lead to additional pitfalls in group dynamics, such as groupthink [31].

      Groupthink

      Biases clearly affect cognitive processes, including decision-making; being in a group affects decision-making as well [33]. Because investigations involve multiple individuals, it is important to understand the relationship between groups and decision-making. Simply stated, the group context exerts a strong influence: observing the decisions made by others influences our own choices [34], and individuals think and behave differently in a group than they do in isolation [35]. Groupthink refers to a mode of thinking that emerges in cohesive groups that prioritize concurrence seeking over the consideration of alternative solutions [36]. Groupthink occurs because social influence can lead to internalization or compliance: internalization occurs when individuals assimilate to the perceived correctness of the group, and compliance occurs when individuals suppress their own private doubts [37]. In either case, the symptoms of groupthink can produce poor decision-making outcomes [38].
      Fortunately, researchers have specified several strategies that can mitigate the emergence of groupthink. One is to break up the group so that individuals can draw on their respective expertise and give their perspectives freely [39]. Because consensus is one of the hallmarks of groupthink, separating the individuals involved in the incident eliminates unfounded consensus; those in the various roles involved in the incident can be interviewed separately to limit such influence. Another strategy is to ask authority figures to leave the group periodically [39]. Authority figures often have difficulty not imposing their viewpoints on others, so intermittent removals enable potentially impressionable members to voice their own perspectives without undue influence; for example, it might be beneficial to have the chief of surgical services leave the room when the investigation team speaks with surgical staff. Others have recommended building thought diversity, a varied way of thinking that fosters differing perspectives and unique approaches [40]. In addition, team training that focuses on conflict management and resolution may enable teams to discuss dissenting viewpoints more openly. Whatever strategy is employed, diminishing the conditions for groupthink is necessary, as some posit that groupthink negatively affects care [37].

      Conclusion

      This article reviewed heuristics and biases relevant to practitioners who examine how and why sentinel and adverse events occur in health care environments. Although biases can be beneficial because they facilitate the quick decision-making that everyday environments often demand, this article focused on the dominant view that biases are associated with poor decision-making and suboptimal outcomes. Accordingly, we have offered potential mitigations to aid the investigation of safety incidents in health care.
      These strategies have merit; however, we recognize that practical constraints may limit an institution’s ability to employ some of them. For example, an institution may lack the time or financial resources to field a multidisciplinary team of interviewers or even to conduct one-on-one interviews. We encourage institutions to prioritize their respective needs and implement the strategies that are feasible. This work fills an important gap in the literature by providing actionable strategies to mitigate the negative effects of biases on investigators’ decision-making.

      Conflicts of Interest

      All authors report no conflicts of interest.

      References

      1. Andel C, et al. The economics of health care quality and medical errors. J Health Care Finance. 2012;39:39-50.
      2. Peerally MF, et al. The problem with root cause analysis. BMJ Qual Saf. 2017;26:417-422.
      3. Lundberg J, Rollenhagen C, Hollnagel E. What-you-look-for-is-what-you-find—the consequences of underlying accident models in eight accident investigation manuals. Saf Sci. 2009;47:1297-1311.
      4. Cedergren A, Petersen K. Prerequisites for learning from accident investigations—a cross-country comparison of national accident investigation boards. Saf Sci. 2011;49:1238-1245.
      5. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974 Sep 27;185:1124-1131.
      6. Reisberg D. Cognition: Exploring the Science of the Mind. New York: W. W. Norton, 2010.
      7. Weisberg RW, Reeves LM. Cognition: From Memory to Creativity. Hoboken, NJ: John Wiley & Sons, 2013.
      8. The Joint Commission. Cognitive biases in health care. Quick Safety, Issue 28. Oct 2016. Accessed Jun 27, 2022. https://www.jointcommission.org/-/media/tjc/documents/newsletters/quick_safety_issue_28_oct_2016pdf.pdf
      9. Arkes HR. The consequences of the hindsight bias in medical decision making. Curr Dir Psychol Sci. 2013;22:356-360.
      10. Saposnik G, et al. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016 Nov 3;16:138.
      11. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013;22 Suppl 2:ii65-ii72.
      12. O'Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. J R Coll Physicians Edinb. 2018;48:225-232.
      13. Lundberg J, Rollenhagen C, Hollnagel E. What you find is not always what you fix—how other aspects than causes of accidents decide recommendations for remedial actions. Accid Anal Prev. 2010;42:2132-2139.
      14. Sanchez JA, et al. Investigating the causes of adverse events. Ann Thorac Surg. 2017;103:1693-1699.
      15. Keebler JR, et al. Human factors applied to perioperative process improvement. Anesthesiol Clin. 2018;36:17-29.
      16. Baron J, Hershey JC. Outcome bias in decision evaluation. J Pers Soc Psychol. 1988;54:569-579.
      17. Arkes HR, et al. Eliminating the hindsight bias. J Appl Psychol. 1988;73:305-307.
      18. Gino F, Shu LL, Bazerman MH. Nameless + harmless = blameless: when seemingly irrelevant factors influence judgment of (un)ethical behavior. Organ Behav Hum Decis Process. 2010;111:93-101.
      19. Henriksen K, Kaplan H. Hindsight bias, outcome knowledge and adaptive learning. Qual Saf Health Care. 2003;12 Suppl 2:ii46-ii50.
      20. Roese NJ, Vohs KD. Hindsight bias. Perspect Psychol Sci. 2012;7:411-426.
      21. Baybutt P. Cognitive biases in process hazard analysis. J Loss Prev Process Ind. 2016;43:372-377.
      22. Okes D. The human side of root cause analysis. Journal for Quality and Participation. 2008;31:20-29.
      23. Schneider W, Shiffrin RM. Controlled and automatic human information processing: I. Detection, search, and attention. Psychol Rev. 1977;84:1-66.
      24. Shiffrin RM, Schneider W. Controlled and automatic human information processing: II. Perceptual learning, automatic attending and a general theory. Psychol Rev. 1977;84:127-190.
      25. Abdi Z, Ravaghi H. Implementing root cause analysis in Iranian hospitals: challenges and benefits. Int J Health Plann Manage. 2017;32:147-162.
      26. Kelley HH. Attribution theory in social psychology. Nebr Symp Motiv. 1967;15:192-238.
      27. Heider F. The Psychology of Interpersonal Relations. Hillsdale, NJ: Lawrence Erlbaum Associates, 1958.
      28. Plous S. The Psychology of Judgment and Decision Making. Philadelphia: Temple University Press, 1993.
      29. Regan DT, Totten J. Empathy and attribution: turning observers into actors. J Pers Soc Psychol. 1975;32:850-856.
      30. Holden RJ, et al. SEIPS 2.0: a human factors framework for studying and improving the work of healthcare professionals and patients. Ergonomics. 2013;56:1669-1686.
      31. Murata A, Nakamura T, Karwowski W. Influence of cognitive biases in distorting decision making and leading to critical unfavorable incidents. Safety. 2015;1:44-58.
      32. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2:175-220.
      33. Kaba A, et al. Are we at risk of groupthink in our approach to teamwork interventions in health care? Med Educ. 2016;50:400-408.
      34. Viscusi WK, Phillips OR, Kroll S. Risky investment decisions: how are individuals influenced by their groups? J Risk Uncertain. 2011;43:81-106.
      35. Cleary M, Lees D, Sayers J. Leadership, thought diversity, and the influence of groupthink. Issues Ment Health Nurs. 2019;40:731-733.
      36. Janis IL. Groupthink. Psychol Today. 1971;5:43-44, 46, 74-76.
      37. McCauley C. The nature of social influence in groupthink: compliance and internalization. J Pers Soc Psychol. 1989;57:250-260.
      38. Őnday Ő. Human resource theory: from Hawthorne experiments of Mayo to groupthink of Janis. Global Journal of Human Resource Management. 2016;4:95-110.
      39. Griffin EA. Groupthink of Irving Janis. In: Griffin EA. A First Look at Communication Theory. New York: McGraw-Hill, 1991:235-246.
      40. Fernandez CP. Creating thought diversity: the antidote to group think. J Public Health Manag Pract. 2007;13:670-671.