Perspective

Cognitive and Human Factors in Expert Decision Making: Six Fallacies and the Eight Sources of Bias


Analytical Chemistry

Cite this: Anal. Chem. 2020, 92, 12, 7998–8004
https://doi.org/10.1021/acs.analchem.0c00704
Published June 8, 2020

Copyright © 2020 American Chemical Society. This publication is licensed under these Terms of Use.

Abstract


Fallacies about the nature of biases have overshadowed a proper cognitive understanding of biases and their sources, which in turn leads to ways to minimize their impact. Six such fallacies are presented: that bias is an ethical issue, that it applies only to “bad apples”, that experts are impartial and immune, that technology eliminates bias, the bias blind spot, and the illusion of control. Then, eight sources of bias are discussed and conceptualized within three categories: (A) factors that relate to the specific case and analysis, which include the data, reference materials, and contextual information; (B) factors that relate to the specific person doing the analysis, which include past experience base rates, organizational factors, education and training, and personal factors; and, lastly, (C) the cognitive architecture and human nature that impact all of us. These factors can impact what the data are (e.g., how data are sampled and collected, or what is considered noise and therefore disregarded), the actual results (e.g., decisions on testing strategies, how analysis is conducted, and when to stop testing), and the conclusions (e.g., interpretation of the results). The paper concludes with specific measures that can minimize these biases.

Scientists and experts are expected to make observations and draw impartial conclusions based on the relevant data. Therefore, we tend to focus on the data and methods used, as if they exist in isolation from the human decision maker. For example, a recent paper reviewing forensic DNA analysis (1) provides a thorough review of DNA methods, but does not give sufficient attention to the fact that DNA analysis hinges on the humans conducting it, and to how their biases may impact the DNA results. Indeed, a flurry of recent publications demonstrates the importance and influence of the human impact on DNA analysis. (2−4)
Such cognitive biases are not limited to DNA analysis, and have been demonstrated in many domains (e.g., fingerprinting and other forensic science disciplines, (5,6) forensic psychology, (7) as well as drug detection and discovery (8)). Biases can impact the actual observation and perception of the data, testing strategies, as well as how the results are then interpreted and how conclusions are reached. (6) Even one of the most objective domains, toxicology, has now been shown to be exposed to issues of cognitive bias, (9) and such biases have created error in real toxicology casework (e.g., determining a wrong testing strategy in a post-mortem toxicology case (10)).
Thus, the impact of bias is not limited to the important interpretation of the results, but it has a much broader and deeper scope as it impacts what the data are (e.g., what is collected, what is considered as noise and disregarded, etc.) and the results of the analyses themselves (e.g., testing strategy as to what tests and types of analysis to conduct, how they are carried out and by whom, and when to stop testing).
Many of the challenges in dealing with these biases arise from the nature of cognitive biases, their various sources, and fallacies about them, which are described in detail below.

Six Fallacies of Bias


Before delving into the eight sources of cognitive bias, it is important to present and dispel commonly held fallacies. If we do not recognize and set aside these fallacies, it becomes practically impossible to move forward, because these fallacies minimize (if not dismiss altogether) the very existence of the biases. The first step in dealing with biases is to acknowledge their existence and the impact they may have on hard-working, dedicated, and competent experts.

First Fallacy: Ethical Issues

Many incorrectly think these biases are an ethical issue of corrupt individuals or even malicious acts by unscrupulous people, whereas cognitive bias actually impacts honest and dedicated examiners. Time and again cognitive biases are presented under the umbrella of ethical issues, in books (11), conferences, and training (e.g., “ethics in cognitive bias” (12)). It is a fallacy that arises from a basic misunderstanding of what cognitive bias is all about. It is not a matter of dishonesty, intentional discrimination, or of a deliberate act arising from underlying ethical issues. (13) It is true that there are cases of intentional professional misconduct, e.g., when a chemist in a drug laboratory in Boston forged tests. (14,15) However, cognitive bias is not about such ethical issues of personal character, integrity, or intentional misconduct.

Second Fallacy: Bad Apples

Often when errors and biases are found, the experts involved are blamed for the errors, rather than acknowledging systemic issues, such as the human element in the analysis and the general susceptibility to bias. (16) Surely there are cases where error is due to examiners’ lack of competency, but these are relatively easy to detect and fix. The kinds of biases discussed here, implicit cognitive biases, are widespread and are not due to incompetence, and are therefore harder to detect.

Third Fallacy: Expert Immunity

There is a widely held but incorrect belief that experts are impartial and immune to biases. (17) However, the truth of the matter is that no one is immune to bias, not even experts. (18) In fact, in many ways, experts are more susceptible to certain biases. The very making of expertise creates and underpins many of the biases. (19) For example, experience and training make experts engage in more selective attention, use chunking and schemas (typical activities and their sequence), and rely on heuristics and expectations arising from past base rate experiences, utilizing a whole range of top-down cognitive processes that create a priori assumptions and expectations.
These cognitive processes enable experts to often make quick and accurate decisions. However, these very mechanisms also create bias that can lead them in the wrong direction. Regardless of the utility (and vulnerability) of such cognitive processing in experts, it does not make experts immune to bias, and indeed, expertise and experience may actually increase (or even cause) certain biases. Experts across domains are subject to cognitive vulnerabilities. (20)
Although experts tend to be very confident (sometimes even overconfident), more experienced experts can actually perform worse than novices. This has been demonstrated, for example, in environmental ecology, where data collection is critical and underpinned by the ability to correctly identify and collect samples, and novices actually outperform experts. (21)

Fourth Fallacy: Technological Protection

People think that the mere use of technology, instrumentation, automation, or artificial intelligence eliminates bias. These can reduce bias, but even when they are in use, human biases are still at play, because these systems are built, programmed, operated, and interpreted by humans. There is a danger that people will incorrectly believe that using technology is a guaranteed protection from being susceptible to and affected by bias. Furthermore, technology can even introduce biases, be it mass spectral library matching software, Automated Fingerprint Identification Systems (AFIS), or other technological devices. (22)

Fifth Fallacy: Bias Blind Spot

The experts themselves are often not aware of their biases and therefore hold a fallacy that they are not biased. One must remember that these are implicit cognitive biases, not intentional discriminatory types of biases. The bias blind spot (23) is well documented and has been demonstrated in a variety of domains, including forensic science (17) and forensic psychology. (24) While it is relatively easy to see bias in others, we are often blind to our own biases. Research has found that 70% of forensic scientists now acknowledge that cognitive bias is a cause for concern in forensic science as a whole, but only 52% think it is a concern in their own specific forensic domain, and just 25% think it is relevant to them personally, reflecting the hallmarks of the bias blind spot. (17,23)

Sixth Fallacy: Illusion of Control

Even when experts are made aware of and acknowledge their biases, they nevertheless think they can overcome them by mere willpower. (25) This is the illusion of control. Combating and countering these biases requires taking specific steps—willpower alone is inadequate to deal with the various manifestations of bias.
In fact, trying to deal with bias by the illusion of control may actually increase the bias, due to “ironic processing” or “ironic rebound”. (26) Hence, trying to minimize bias by willpower makes you think of it more and actually increases its effect. This is similar to a judge instructing jurors to disregard certain evidence. By doing so, the judge is actually making the jurors notice this evidence even more. (27)
The beliefs in such fallacies (see Table 1) prevent dealing with the biases because they dismiss their power and even their very existence. We need to acknowledge the impact of biases and understand their sources, so we can take appropriate measures, when needed and when possible, to combat their effects.
Table 1. Six Fallacies about Cognitive Bias Commonly Held by Experts
Fallacy | Incorrect belief
1. Ethical Issues | It only happens to corrupt and unscrupulous individuals; an issue of morals and personal integrity, a question of personal character.
2. Bad Apples | It is a question of competency and happens to experts who do not know how to do their job properly.
3. Expert Immunity | Experts are impartial and are not affected, because bias does not impact competent experts doing their job with integrity.
4. Technological Protection | Using technology, instrumentation, automation, or artificial intelligence guarantees protection from human biases.
5. Blind Spot | Other experts are affected by bias, but not me. I am not biased; it is the other experts who are biased.
6. Illusion of Control | I am aware that bias impacts me, and therefore, I can control and counter its effect. I can overcome bias by mere willpower.

Eight Sources of Bias


Cognitive biases are widespread, a ubiquitous phenomenon in many guises. (13) Figure 1 illustrates eight sources of bias in expert decision making. It shows that cognitive bias can arise from a variety of sources, which are categorized into three groups. Category A relates to the specific case—something about this case causes bias in how data are perceived, analyzed, and interpreted. Other sources of cognitive bias, Category B, have nothing to do with the specific case, but they arise from factors relating to the specific person doing the work—something about them (e.g., their experience, their personality, their working environment, their motivation, etc.) causes the bias. Further sources of cognitive bias, Category C, arise from human nature, the very cognitive architecture of the human brain that we all share, regardless of the specific case or the specific person doing the analysis.

Figure 1

Figure 1. Eight sources of bias that may cognitively contaminate sampling, observations, testing strategies, analysis, and conclusions, even by experts. They are organized in a taxonomy within three categories: starting off at the top with sources relating to the specific case and analysis (Category A), moving down to sources that relate to the specific person doing the analysis (Category B), and at the very bottom sources that relate to human nature (Category C).

It is important to understand these different sources of bias so we can more easily recognize and expose them, and most importantly, each of the different sources of biases requires specific countermeasures.

(1) The Data

The first source of cognitive bias within the factors that relate to the specific analysis is the data. How can data cause bias? Well, it depends on the data. Some data, such as fingermarks, do not, per se, cause bias, as they convey no information beyond the friction ridge impressions. However, with other types of data (such as in the analysis of voice, handwriting, blood spatter, and bitemarks), the data can contain potentially biasing information. For example, in voice analysis, the content, or even the tone and screaming, can reveal a brutal assault. Similarly, gang-rape mixture DNA evidence can evoke emotions that can impact decision making. (28−30)

(2) Reference Materials

Reference materials can bias how the data are perceived and interpreted. This applies to a variety of data, including DNA, fingerprinting, and any decisions that are based on making comparisons. If DNA evidence interpretation is influenced by the known reference material of the “target” suspect (i.e., their DNA profile) so as to better fit them, then the interpretation of the biological material from the crime scene is biased. (31)
Rather than the actual evidence being the driver of the decision making process—where evidence is interpreted based on the data it contains—the suspect’s profile is driving the decision making. Hence, rather than going from the evidence to the suspect (from data to theory), the reference materials cause examiners to go backward (circular reasoning) from the target/suspect to the evidence, thus biasing how the data are interpreted. (32)
Take the case of Kerry Robinson who was wrongly convicted of rape based on mistaken analysis of DNA evidence. (33) The two DNA examiners in the case made an error as they seem to have been working backward from Kerry Robinson’s known DNA profile to the latent evidence. Thus, the suspect “target” was driving the analysis rather than the evidence itself. He was exonerated and released from jail after serving 17 years. (34) Indeed, when the same DNA evidence was later analyzed properly, different results were obtained. (35)
The problem with “going backward”, circular reasoning, is not limited to DNA and can be found in other comparative domains, such as fingerprinting, handwriting, firearms, and others (for a review, see ref (5)), where the reference materials of the known suspect influence the interpretation of the data from the crime scene. This was exemplified when the FBI (as well as an expert hired by the defense) erroneously identified Mayfield as the Madrid bomber. (36) Again, the “target” (in this case, the fingerprints of the suspect) was driving the analysis, resulting in an erroneous identification. For example, in the analysis, a signal in the evidence was perceived as noise and disregarded because it did not match the target, a typical error emerging from going backward and letting the target or expected results drive the analysis.
Furthermore, this source of bias is not limited to circumstances that have a “target” suspect per se; it can also arise from pre-existing templates and patterns, such as in the interpretation of blood pattern analysis or a crime scene. It can even impact what color is observed. Hence, bias is not limited to the interpretation of the appearance of a color (e.g., a pink color in the Griess test, incorrectly interpreted as meaning the suspects handled dynamite; see below), but can even impact what color is observed in the first place when Munsell color charts are used as reference templates. (37)
This bias goes beyond impacting only the conclusion and interpretation of what the presence of the color means, because it can also bias and impact the observation of what the color itself is (for this important distinction between biases impacting observations vs conclusions, see the Hierarchy of Expert Performance (HEP)). (6)
These types of cognitive biases are therefore not limited to a target or reference, they can even be caused by having a theory, a chart, or a pattern in mind, and have been shown to impact a variety of expert domains. (38−40)
This bias can also occur, for example, in a messy mass chromatogram or infrared spectrum where an analyst might “pick out” the peaks they are looking for and ignore the others. This is not intentional and happens without awareness. The expectation biases cognitive resources and attention toward a certain stimulus or signal (while suppressing and ignoring others) (41) and impacts sensory response to predictable object stimuli throughout the ventral visual pathway. (42,43) It also biases perception sensitivity for those targets, (44) changing sensory representations in the visual cortex, (45) and hence impacts what we actually perceive and how. (46) Just as it can bias the visual search of a mass chromatogram or infrared spectrum, it can impact detection in radiography (38) and many other domains. In all of these examples, the human examiner is driven by a “target” they expect (or want) rather than by the actual data.

(3) Contextual Information

Experts are often exposed to irrelevant information. In the forensic domain, for example, such information may be that the suspect confessed to the crime, that they have been identified by eyewitnesses and other lines of evidence, or that the suspect has a criminal record. (47) Even knowing the name of the suspect may be suggestive of a specific race, evoking biases and stereotypes. These all cause expectations that can impact not only the interpretation of the results obtained from the analysis, but also the analysis itself, because the expectations impact the detection of what goes into the analysis as well as testing strategies. This source of bias is not derived from a target generated by the reference materials (see above) but from contextual information that can be irrelevant to the task carried out by the analyst.
In toxicology, for example, contextual information can bias testing strategies. Consider, for instance, a post-mortem case provided with contextual information such as “drug overdose” and/or that “the deceased was known to have a history of heroin use.” Such information can impact the established testing strategies, for example, going straight to the confirmation and quantification of a limited range of opiate-type drugs (morphine, codeine, 6-monoacetylmorphine, and other heroin markers) without running the other standard tests, such as an immunoassay, or looking for other opioids that may have been present (e.g., fentanyl). Hence, the contextual information caused a confirmation bias approach and deviation from the standard testing and screening protocols. This has indeed caused errors in toxicology testing cases. (10)
In the forensic context, for example, irrelevant contextual information has been shown to impact data collection at the crime scene, (48) as well as bias in laboratory DNA analysis (35) (for reviews, see refs (5,6)). Similarly, deviation from standard testing strategies, data collection, and sampling can be biased by contextual information when the laboratory does analyses knowing that a Drug Recognition Expert (DRE) evaluated a person as being impaired and under the influence of a stimulant, but there was no alcohol in their blood, or when a dog signals the presence of drugs, or any other contextual information that causes an expectation to what the results should be.
Contextual expectations not only impact data collection and testing strategies but also the interpretation and conclusions of the analysis. For example, “a poor-quality chromatographic or mass spectral match for a drug could be consciously or subconsciously “upgraded” to a positive result based on the expectation that the sample should be positive for that drug. Conversely, a similarly poor-quality match might be “downgraded” to negative if the analyst is not expecting the drug to be present, or its presence ‘does not fit the circumstances of the case’. In both situations, the context could give the illusion of a stronger basis for the decision than is warranted by the data” (ref (9), page 381). Cognitive biases arise when task-irrelevant context causes some aspect of analysis to be overweighted, underweighted, or neglected (e.g., not perceived, determined to be noise, an anomaly, or an outlier). This does not only happen in subjective judgements but can also bias even established procedures, (10) and criteria for accepting evidence and proper judgment. (49)
The problem with task irrelevant contextual information is that it can cause many kinds of biases that impact analysis in many different ways. Another impact of bias can be overlooking or underweighting the absence of data, not properly confirming results, or not considering alternatives. Consider, for example, how biasing terrorism contextual information can result in a false positive identification of dynamite. The Griess test, a presumptive color test for explosives, (50) was positive for nitroglycerine. However, the positive result could be gained from nitrocellulose in a range of innocent non-terrorism related products. Nevertheless, because of the contextual information, it was wrongly concluded that the appearance of a pink color meant the suspects handled dynamite, when actually the traces were generated from an old pack of playing cards that they had been shuffling. (51) The suspects were wrongfully convicted and were sentenced to life imprisonment, only to be exonerated and released from jail after serving 16 years following a review of the testing used to obtain their original convictions. (52) Such errors happen when analysts are biased to accept results that are expected or wanted, without carrying out proper confirmation or considering alternatives.
Another example is hair-strand drug and alcohol testing. Even though there is a known false-positive problem with immunoassays, and therefore results must be confirmed by another, more specific technique, these confirmations were not always properly carried out when results were in line with contextual information. (53)
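The confirmation requirement just described can be sketched as a simple reporting gate. The following is an illustrative sketch of the principle only, not a procedure from the paper; the function name and inputs are hypothetical. The key design point is that case context is simply not an input to the decision.

```python
# Illustrative sketch (hypothetical function, not from the paper): a reporting
# gate that never reports a presumptive screening hit as positive unless an
# independent, more specific confirmatory technique agrees.

def may_report_positive(presumptive_positive: bool, confirmed: bool) -> bool:
    """Report positive only if a presumptive screen (e.g., an immunoassay)
    is backed by an independent confirmatory result.

    Note that case context does not appear as a parameter: however strongly
    the circumstances suggest a drug should be present, the decision rests
    on the analytical data alone.
    """
    return presumptive_positive and confirmed

print(may_report_positive(True, False))  # False: a presumptive hit alone is not enough
print(may_report_positive(True, True))   # True: screen plus confirmation
```

Keeping context out of the function signature is the point: a protocol that cannot see the expectation cannot quietly "upgrade" a poor-quality match to fit it.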
It is important to emphasize that irrelevant contextual information biases scientists and experts, and it can do so at an unconscious level; they may not be aware of the impact. The expectation biases what and how information is represented and processed in the brain. (41−46) These biases impact experts and cannot be properly controlled by mere willpower (see the six bias fallacies above).

(4) Base Rate

An important asset that experts bring to their work is their experience from previous cases. However, such experience brings expectations to new cases that are not derived from the specific case or analysis at hand, but can nevertheless still impact its interpretation. Hence, the sampling and analysis being conducted are impacted by factors that have nothing to do with the case at hand, but rather by expected base rates generated from previous, unrelated cases, which influence how this case is conducted.
For example, when the cause of death is cerebral hypoxia caused by hanging, then the manner of death is most often suicide. In contrast, when cerebral hypoxia is caused by strangulation, then the manner of death is most often homicide. This is not necessarily the case since, although rarely, hanging can be a result of a homicide and strangulation can be a suicide. However, the base rate of associations between the cause and manner of death can bias the interpretation and determination of manner of death. Such base rate biases are common in many domains from medical diagnosis to security X-rays at airports. (54)
The biasing effects of base rates are not limited to how the results of the analysis are interpreted. They can impact other stages of the analysis, even the sampling and data collection, as well as the detection of relevant marks, items, or signals, or even verification. For example, low target prevalence base rate bias shows that if a target was rare or uncommon in past experience, then observers are more likely to miss it in the future. (55,56) Hence, even when observers are searching for an item, mark, or signal, base rate can bias their search: observers are more likely to miss relatively rare and uncommon items, signals, or objects even when they are present.
Base rate bias derives from expectations generated from past similar cases. (46) The issue is that this case is biased because its analysis is actually based on other cases. The crux of the bias is that perception and decisions are not based on the case itself. This type of bias is even more potent when the similarity to past cases is superficial and more in the eye of the beholder than in reality. (57)
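The pull of base rates can be made concrete with a small Bayesian sketch. The numbers below are hypothetical, chosen only for illustration: when the case-specific evidence is uninformative, a judgment driven by the prior simply reproduces the base rate, so the conclusion is carried entirely by past cases rather than by this case.

```python
# Hypothetical numbers for illustration only: a Bayesian update showing how a
# strong base-rate prior dominates when case-specific evidence is uninformative.

def posterior(prior: float, lik_h: float, lik_not_h: float) -> float:
    """P(H | E) for a binary hypothesis H via Bayes' theorem."""
    joint_h = prior * lik_h
    joint_not_h = (1.0 - prior) * lik_not_h
    return joint_h / (joint_h + joint_not_h)

# Suppose (hypothetically) that 90% of hangings in past cases were suicides,
# and that the evidence in this case is ambiguous: equally likely under either
# manner of death (likelihood ratio 1:1).
p = posterior(prior=0.90, lik_h=0.5, lik_not_h=0.5)
print(round(p, 2))  # 0.9: with uninformative evidence, the prior carries through,
                    # so the "analysis" merely restates experience from other cases
```

The danger the paper describes is precisely this: a confident conclusion that looks case-specific but is, in effect, only the base rate restated.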

(5) Organizational Factors

Organizational factors that can cause bias are many and varied, and have been well documented in a variety of domains. When it comes to DNA and other forensic evidence, where analysis and work is often conducted within the adversarial legal system, cognitive bias may emerge from an allegiance effect and myside bias. (58) Indeed, a study showed that when forensic experts are presented with identical data, they nevertheless reach conclusions biased toward the side that retained them (59)—an adversarial allegiance and myside bias. These are implicit biases, not explicit partiality when one side is openly favored over the other.
Many forensic science laboratories are part of law enforcement (or even part of the DA prosecution office). Such organizational influences have been recognized as biasing by the National Academy of Sciences Report, calling for “removing all public forensic laboratories and facilities from the administrative control of law enforcement agencies or prosecutors’ offices” (ref (60), Recommendation #4, page 24). The point is that organizational factors and the administration surrounding forensic science induces biases. (61)
The impact of organizational factors applies to any and every laboratory: laboratories work within a variety of contexts, structures, and frameworks that can bias their work. For example, laboratories have clear hierarchies and dependencies. If there is a senior person who “signs off” on reports or analyses, there can be a danger of “writing what that person wants to read” and a lack of challenge to their scientific decisions. Thus, science is muddled with managerial authority and other organizational pressures. (62,63)
Other organizational factors relate to time pressure, expectations to reach certain results, stress, budget controls, pressure to obtain publications and other targets, and a whole range of organizational factors that can impact the work carried out in laboratories and other professional settings.

(6) Education and Training

Education and training play an important role in how work is conducted, for example, in whether forensic examiners see their role more as supporting the police or as scientists. When approaching a case, training and education may instil the pursuit of a single hypothesis vs examining multiple hypotheses, considering alternative hypotheses (including scenarios proposed by the opposing side), conducting differential diagnosis, considering categorical decisions (such as “match” and “nonmatch”, often used in fingerprinting and firearms) vs using statistics and other methods to determine the strength of the evidence, etc. Digital forensics, firearms, fingerprinting, and many other forensic domains have actually grown out of police work with minimal-to-no proper education and training in science.

(7) Personal Factors

Many personal factors impact biases and decision making. These include motivation, personal ideology and beliefs. Furthermore, some people are more risk takers and others are more risk averse, and people also vary in their tolerance to ambiguity. (64) Other individual differences between people can bias results, for example, tests that use color can be biased because people differ in what color they perceive when looking at the same item. (37)
In areas where there is more objective quantification and instrumentation, these factors are minimized. However, in areas where the human examiner has a greater role in deciding how to collect, sample, and interpret the data, and where there is subjectivity in evaluating the data and conclusions, such personal factors play a greater role in how work is carried out.
Even when technology is used, human biases are still at play. Technology cannot be totally objective, as humans are involved in the construction and operation of the technology, as well as in calibrating it, maintaining it, interpreting the results, and deciding if and what action to take. (65) Indeed, ISO (International Organization for Standardization) standard 17025:2017 (general requirements for the competence of testing and calibration laboratories, which rely more on instrumentation and objective quantification) (66) now follows the standard for the more subjective laboratory domains, ISO 17020, and includes specific requirements for impartiality and freedom from bias. Hence, it acknowledges the role of the human examiner, even when quantification and instrumentation are used. It recognizes that even the use of instrumentation does not guarantee freedom from bias. (67)
Other personal factors that can cause bias in decisions include the need for closure that can result in premature decisions or opting to reach inconclusive decisions, (68) how people respond to stress and fatigue, personality, and a whole range of personal factors that can impact expert decision making. (69−71)

(8) Human and Cognitive Factors, and the Human Brain

The workings of our brain create architectural and capacity constraints that do not allow it to process all the incoming information. The brain therefore engages in a variety of processes (mainly known as “top-down”) to make sense of the world and data around us. The human mind is not a camera; the active nature of human cognition means that we do not see the world “as it is.”
Beyond the many cognitive processes and the wiring of the human brain that can cause biases, there are biasing effects related to social interaction, in-group and availability biases, processing fluency, and other biasing influences that impact all of us. (72−75)

Snowball and Cascade Bias


Bias does not impact only the individual in isolation or just one aspect of the work; often the bias cascades from one person to another, from one aspect of the work to another, influencing different elements of an investigation. As people and various aspects are influenced, they then influence others, turning from influenced to influencers, perpetuating the bias and impacting others. Thus, biases not only cascade but also gather momentum and snowball. (32)

Overcoming Bias


First, we need to acknowledge the existence of bias and move beyond believing the fallacies about its nature. When people have a bias blind spot or think that as experts they are immune to it or that by mere willpower they can overcome it (see the six bias fallacies in Table 1), then biases only perpetuate.
Second, as a general principle to combat bias, we need to take actions that will cause us to focus solely on the relevant data and not work backward. These need to be part of ongoing training and laboratory procedures. Accreditation to the appropriate standards and certification in the appropriate discipline may not solve the problems, but they do force laboratories to document procedures. External scrutiny can also be very helpful for illuminating areas of bias, especially given that ISO standards (e.g., 17020 and 17025 (66)) specifically require that steps are taken to make sure there is freedom from bias. (67)
Third, specifically, we can combat various sources of bias by the following:
(A) Using blinding and masking techniques that prevent exposure to task-irrelevant information. (76)
(B) Using methods, such as Linear Sequential Unmasking (LSU), to control the sequence, timing, and linearity of exposure to information, so as to minimize “going backward” and being biased by the reference materials. (32)
(C) Using case managers who screen and control what information is given to whom and when.
(D) Using blind, double-blind, and proper verifications when possible.
(E) Rather than having one “reference target” or hypothesis, using a “line up” of competing and alternative conclusions and hypotheses.
(F) Adopting a differential diagnosis approach, where all the different conclusions and their probabilities are presented, rather than a single conclusion. (77,78)
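Measures such as blinding, sequenced exposure, and case management are procedural, and a laboratory information system can help enforce them. The sketch below is a minimal illustration only: the field names, the release order, and the class are invented for the example (LSU itself is a broader methodology than this sketch captures). It withholds task-irrelevant fields entirely and releases the remaining information in a fixed sequence, so the analyst documents findings on the trace evidence before the reference material is unmasked.

```python
# Minimal sketch of case-manager-style information control: task-
# irrelevant fields are never shown, and the remaining information is
# released only in a fixed order (inspired by Linear Sequential
# Unmasking). Field names and ordering are invented examples.

TASK_IRRELEVANT = {"suspect_name", "detective_notes", "prior_convictions"}

# The trace evidence must be analyzed before the reference is unmasked.
RELEASE_ORDER = ["trace_evidence", "reference_sample"]

class CaseManager:
    def __init__(self, case_file):
        self._file = dict(case_file)
        self._stage = 0

    def next_release(self):
        """Release the next permitted item, and nothing else."""
        if self._stage >= len(RELEASE_ORDER):
            raise RuntimeError("all permitted information released")
        key = RELEASE_ORDER[self._stage]
        self._stage += 1
        return {key: self._file[key]}

    def masked_view(self):
        """Everything the analyst may ever see, irrelevant fields removed."""
        return {k: v for k, v in self._file.items()
                if k not in TASK_IRRELEVANT}

case = {
    "trace_evidence": "GC-MS run #117",
    "reference_sample": "buccal swab ref #9",
    "suspect_name": "withheld at source",
    "detective_notes": "believes suspect guilty",
}
mgr = CaseManager(case)
first = mgr.next_release()   # analyst documents findings on this first
second = mgr.next_release()  # only then is the reference unmasked
```

The point of the sketch is that the ordering and masking decisions are made once, by procedure, rather than left to the analyst’s discretion in each case.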

Summary and Conclusions


Biases, often without our awareness, impact how we sample and perceive data, how we decide on testing strategies, and how we interpret the results. People’s beliefs in various fallacies about bias prevent a proper understanding of it, which in turn precludes taking steps to fight it. Understanding and acknowledging biases enables discussion of their sources and the development of means to minimize their impact.
Six fallacies are presented (Table 1): widely held beliefs about bias that prevent its correct conceptualization. Then, sources of bias are presented (Figure 1), which emerge from the specific case being analyzed, from the specific person doing the analysis, and from basic human nature that impacts us all. By presenting these fallacies and sources of bias, the hope is that people will reflect on their own practices and consider whether cognitive biases may unwittingly influence their work. Discussions of bias often encounter defensive responses; few want to consider and acknowledge their own biases, let alone research them. However, it is essential to do so if we are to combat and minimize bias. It is the hope of the author that this paper contributes to that direction.

Author Information


Acknowledgments


I want to thank Hilary Hamnett, Nikolas P. Lemos, Joseph Almog, Roderick Kennedy, and anonymous reviewers for their helpful comments on an earlier version of this perspective.

References


This article references 78 other publications.

  1. McCord, B. R.; Gauthier, Q.; Cho, S.; Roig, M. N.; Gibson-Daw, G. C.; Young, B.; Taglia, F.; Zapico, S. C.; Mariot, R. F.; Lee, S. B.; Duncan, G. Anal. Chem. 2019, 91, 673–688, DOI: 10.1021/acs.analchem.8b05318
  2. Barrio, P. A.; Crespillo, M.; Luque, J. A.; Aler, M.; Baeza-Richer, C.; Baldassarri, L.; Carnevali, E.; Coufalova, P.; Flores, I.; Garcia, O.; Garcia, M. A.; Gonzalez, R.; Hernandez, A.; Ingles, V.; Luque, G. M.; Mosquera-Miguel, A.; Pedrosa, S.; Pontes, M. L.; Porto, M. J.; Posada, Y.; Ramella, M. I.; Ribeiro, T.; Riego, E.; Sala, A.; Saragoni, V. G.; Serrano, A.; Vannelli, S. Forensic Sci. Int.: Genet. 2018, 35, 156–163, DOI: 10.1016/j.fsigen.2018.05.005
  3. Butler, J. M.; Kline, M. C.; Coble, M. D. Forensic Sci. Int.: Genet. 2018, 37, 81–94, DOI: 10.1016/j.fsigen.2018.07.024
  4. Bright, J.-A.; Cheng, K.; Kerr, Z.; McGovern, C.; Kelly, H.; Moretti, T. R.; Smith, M. A.; Bieber, F. R.; Budowle, B.; Coble, M. D.; Alghafri, R.; Allen, P. S.; Barber, A.; Beamer, V.; Buettner, C.; Russell, M.; Gehrig, C.; Hicks, T.; Charak, J.; Cheong-Wing, K.; Ciecko, A.; Davis, C. T.; Donley, M.; Pedersen, N.; Gartside, B.; Granger, D.; Greer-Ritzheimer, M.; Reisinger, E.; Kennedy, J.; Grammer, E.; Kaplan, M.; Hansen, D.; Larsen, H. J.; Laureano, A.; Li, C.; Lien, E.; Lindberg, E.; Kelly, C.; Mallinder, B.; Malsom, S.; Yacovone-Margetts, A.; McWhorter, A.; Prajapati, S. M.; Powell, T.; Shutler, G.; Stevenson, K.; Stonehouse, A. R.; Smith, L.; Murakami, J.; Halsing, E.; Wright, D.; Clark, L.; Taylor, D. A.; Buckleton, J. Forensic Sci. Int.: Genet. 2019, 40, 1–8, DOI: 10.1016/j.fsigen.2019.01.006
  5. Cooper, G. S.; Meterko, V. Forensic Sci. Int. 2019, 297, 35–46, DOI: 10.1016/j.forsciint.2019.01.016
  6. Dror, I. E. J. Appl. Res. Mem. Cog. 2016, 5 (2), 121–127, DOI: 10.1016/j.jarmac.2016.03.001
  7. Dror, I. E.; Murrie, D. Psych. Pub. Policy Law 2018, 24 (1), 11–23, DOI: 10.1037/law0000140
  8. Segall, M.; Chadwick, A. Future Med. Chem. 2011, 3 (7), 771–774, DOI: 10.4155/fmc.11.33
  9. Hamnett, H.; Jack, R. Sci. Justice 2019, 59 (4), 380–389, DOI: 10.1016/j.scijus.2019.02.004
  10. Forensic Science Regulator. Contextual Bias in Forensic Toxicology, 2019. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/800561/Lessons_Learnt_May19__L-B03_-final.pdf (accessed June 2020).
  11. Downs, U., Swienton, A. R., Eds.; Ethics in Forensic Science; Academic Press: Waltham, MA, 2012; pp 1–441.
  12. Gannett, C. A130 Ethics in Forensic Science, 2020. https://cci.gosignmeup.com/public/Course/browse?courseid=2552 (accessed June 2020).
  13. Nickerson, R. S. Rev. Gen. Psych. 1998, 2, 175–220, DOI: 10.1037/1089-2680.2.2.175
  14. Bidgood, J. Chemist’s Misconduct Is Likely to Void 20,000 Massachusetts Drug Cases. New York Times, April 18, 2017.
  15. Mettler, K. How a Lab Chemist Went from ‘Superwoman’ to Disgraced Saboteur of More than 20,000 Drug Cases. Washington Post, April 21, 2017.
  16. Thompson, W. C. Southwestern Univ. Law Rev. 2009, 37, 971–994
  17. Kukucka, J.; Kassin, S.; Zapf, P.; Dror, I. E. J. Appl. Res. Mem. Cog. 2017, 6 (4), 452–459, DOI: 10.1016/j.jarmac.2017.09.001
  18. Dror, I. E.; Kukucka, J.; Kassin, S.; Zapf, P. J. Appl. Res. Mem. Cog. 2018, 7 (2), 316–317, DOI: 10.1016/j.jarmac.2018.03.005
  19. Dror, I. E. In The Paradoxical Brain; Kapur, N., Ed.; Cambridge University Press: Cambridge, UK, 2011; pp 177–188.
  20. Shanteau, J. In Advances in Design Research; Rohrmann, B., Beach, L. R., Vlek, C., Watson, S. R., Eds.; Elsevier: Amsterdam, 1989; pp 203–215.
  21. Soller, J. M.; Ausband, D. E.; Gunther, S. M. PLoS One 2020, 15 (3), e0229762, DOI: 10.1371/journal.pone.0229762
  22. Dror, I. E.; Wertheim, K.; Fraser-Mackenzie, P.; Walajtys, J. J. Forensic Sci. 2012, 57 (2), 343–352, DOI: 10.1111/j.1556-4029.2011.02013.x
  23. Pronin, E.; Lin, D. Y.; Ross, L. Person. Soc. Psych. Bull. 2002, 28, 369–381, DOI: 10.1177/0146167202286008
  24. Zapf, P.; Kukucka, J.; Kassin, S.; Dror, I. E. Psych. Pub. Policy Law 2018, 24 (1), 1–10, DOI: 10.1037/law0000153
  25. Thornton, J. I. J. Forensic Sci. 2010, 55 (6), 1663, DOI: 10.1111/j.1556-4029.2010.01497.x
  26. Wegner, D. M. Psych. Rev. 1994, 101, 34–52, DOI: 10.1037/0033-295X.101.1.34
  27. Steblay, N.; Hosch, H. M.; Culhane, S. E.; McWethy, A. Law Hum. Beh. 2006, 30, 469–492, DOI: 10.1007/s10979-006-9039-7
  28. Zajonc, R. B. Am. Psychol. 1980, 35, 151–175, DOI: 10.1037/0003-066X.35.2.151
  29. Finucane, M. L.; Alhakami, A.; Slovic, P.; Johnson, S. M. J. Behav. Decis. Making 2000, 13, 1–17, DOI: 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S
  30. Damasio, A. R. Descartes’ Error: Emotion, Reason, and the Human Brain; Penguin Books: New York, 2005; pp 1–312.
  31. Jeanguenat, A. M.; Budowle, B.; Dror, I. E. Sci. Justice 2017, 57 (6), 415–420, DOI: 10.1016/j.scijus.2017.07.005
  32. Dror, I. E. Science 2018, 360 (6386), 243, DOI: 10.1126/science.aat8443
  33. Starr, D. Forensics Gone Wrong: When DNA Snares the Innocent. Science 2016, DOI: 10.1126/science.aaf4160
  34. Hanna, J.; Valencia, N. DNA Analysis Clears Georgia Man Who Served 17 Years in Wrongful Rape Conviction. CNN, January 10, 2020.
  35. Dror, I. E.; Hampikian, G. Sci. Justice 2011, 51 (4), 204–208, DOI: 10.1016/j.scijus.2011.08.004
  36. A Review of the FBI’s Handling of the Brandon Mayfield Case; Office of the Inspector General, Oversight & Review Division, U.S. Department of Justice, 2006.
  37. Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Catena 2018, 171, 44–53, DOI: 10.1016/j.catena.2018.06.027
  38. Berlin, L. AJR, Am. J. Roentgenol. 2007, 189, 517–522, DOI: 10.2214/AJR.07.2209
  39. Kriegeskorte, N.; Simmons, W. K.; Bellgowan, P. S. F.; Baker, C. I. Nat. Neurosci. 2009, 12, 535–540, DOI: 10.1038/nn.2303
  40. Vul, E.; Kanwisher, N. In Foundational Issues for Human Brain Mapping; Hanson, S., Bunzl, M., Eds.; MIT Press: Cambridge, MA, 2010; pp 71–92.
  41. Richter, D.; Ekman, M.; de Lange, F. P. J. Neurosci. 2018, 38, 7452–7461, DOI: 10.1523/JNEUROSCI.3421-17.2018
  42. Luck, S. J.; Ford, M. A. Proc. Natl. Acad. Sci. U. S. A. 1998, 95 (3), 825–830, DOI: 10.1073/pnas.95.3.825
  43. Simons, D. J.; Chabris, C. F. Perception 1999, 28, 1059–1074, DOI: 10.1068/p281059
  44. Stein, T.; Peelen, M. V. J. Exp. Psychol. Gen. 2015, 144, 1089–1104, DOI: 10.1037/xge0000109
  45. Kok, P.; Brouwer, G. J.; van Gerven, M. A. J.; de Lange, F. P. J. Neurosci. 2013, 33, 16275–16284, DOI: 10.1523/JNEUROSCI.0742-13.2013
  46. de Lange, F. P.; Heilbron, M.; Kok, P. Trends Cognit. Sci. 2018, 22, 764–779, DOI: 10.1016/j.tics.2018.06.002
  47. Gardner, B. O.; Kelley, S.; Murrie, D. C.; Blaisdell, K. N. Forensic Sci. Int. 2019, 297, 236–242, DOI: 10.1016/j.forsciint.2019.01.048
  48. Eeden, C. A. J.; de Poot, C. J.; van Koppen, P. J. J. Forensic Sci. 2019, 64 (1), 120–126, DOI: 10.1111/1556-4029.13817
  49. Morewedge, C. K.; Kahneman, D. Trends Cognit. Sci. 2010, 14, 435–440, DOI: 10.1016/j.tics.2010.07.004
  50. Moorcroft, M.; Davis, J.; Compton, R. G. Talanta 2001, 54, 785–803, DOI: 10.1016/S0039-9140(01)00323-X
  51. Almog, J.; Zitrin, S. In Aspects of Explosive Detection; Marshal, M., Oxley, M., Eds.; Elsevier: Amsterdam, 2009; pp 47–48.
  52. Lissaman, C. Birmingham Pub Bombers Will Probably Never Be Found. BBC News, March 14, 2011.
  53. Lang, S. E. Report of the Motherisk Hair Analysis Independent Review; Ontario Ministry of the Attorney General, Canada, 2015.
  54. Egglin, T. K.; Feinstein, A. R. J. Am. Med. Assoc. 1996, 276 (21), 1752–1755, DOI: 10.1001/jama.1996.03540210060035
  55. Wolfe, J. M.; Horowitz, T. S.; Kenner, N. M. Nature 2005, 435, 439–440, DOI: 10.1038/435439a
  56. Wolfe, J. M.; Horowitz, T. S.; Van Wert, M. J.; Kenner, N. M.; Place, S. S.; Kibbi, N. J. Exp. Psychol. Gen. 2007, 136, 623–638, DOI: 10.1037/0096-3445.136.4.623
  57. Shafir, E. B.; Smith, E. E.; Osherson, D. N. Mem. Cognit. 1990, 18 (3), 229–239, DOI: 10.3758/BF03213877
  58. Simon, D.; Ahn, M.; Stenstrom, D. M.; Read, S. J. Psych. Pub. Policy Law 2020, DOI: 10.1037/law0000226
  59. Murrie, D. C.; Boccaccini, M. T.; Guarnera, L. A.; Rufino, K. A. Psychol. Sci. 2013, 24, 1889–1897, DOI: 10.1177/0956797613481812
  60. Strengthening Forensic Science in the United States: A Path Forward; National Academies Press: Washington, DC, 2009.
  61. Whitman, G.; Koppl, R. Law Prob. Risk 2010, 9, 69–90, DOI: 10.1093/lpr/mgp028
  62. Howard, J. Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine; Springer: New York, 2019.
  63. Cosby, K. S.; Croskerry, P. Acad. Emergency Med. 2004, 11, 1341–1345, DOI: 10.1197/j.aem.2004.07.005
  64. Saposnik, G.; Redelmeier, D.; Ruff, C. C.; Tobler, P. N. BMC Med. Inf. Decis. Making 2016, 16, 138, DOI: 10.1186/s12911-016-0377-1
  65. Dror, I. E.; Mnookin, J. Law Prob. Risk 2010, 9 (1), 47–67, DOI: 10.1093/lpr/mgp031
  66. General Requirements for the Competence of Testing and Calibration Laboratories, 3rd ed.; ISO/IEC 17025; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2017.
  67. Dror, I. E.; Pierce, M. L. J. Forensic Sci. 2020, 65 (3), 800–808, DOI: 10.1111/1556-4029.14265
  68. Dror, I. E.; Langenburg, G. J. Forensic Sci. 2019, 64 (1), 10–15, DOI: 10.1111/1556-4029.13854
  69. Gok, K.; Atsan, N. Intern. J. Business Soc. Res. 2016, 6 (3), 38–47, DOI: 10.18533/ijbsr.v6i3.936
  70. Neal, T. PLoS One 2016, 11 (4), e0154434, DOI: 10.1371/journal.pone.0154434
  71. Miller, A. K.; Rufino, K. A.; Boccaccini, M. T.; Jackson, R. L.; Murrie, D. C. Assessment 2011, 18 (2), 253–260, DOI: 10.1177/1073191111402460
  72. Griffin, D.; Ross, L. In Advances in Experimental Social Psychology; Zanna, M. P., Ed.; Academic Press: San Diego, CA, 1991; pp 319–359.
  73. Ross, L.; Greene, D.; House, P. J. Exp. Soc. Psych. 1977, 13, 279–301, DOI: 10.1016/0022-1031(77)90049-X
  74. Oppenheimer, D. M. Trends Cognit. Sci. 2008, 12, 237–241, DOI: 10.1016/j.tics.2008.02.014
  75. Goldstein, D. G.; Gigerenzer, G. Psychol. Rev. 2002, 109, 75–90, DOI: 10.1037/0033-295X.109.1.75
  76. Robertson, C., Kesselheim, A., Eds.; Blinding as a Solution to Bias: Strengthening Biomedical Science, Forensic Science, and Law; Academic Press: New York, 2016; pp 1–388.
  77. Maude, J. Diagnosis 2014, 1, 107–109, DOI: 10.1515/dx-2013-0009
  78. Barondess, J. A., Carpenter, C. C., Eds.; Differential Diagnosis; Lea & Febiger: Philadelphia, PA, 1994; pp 1–800.

  83. Hedayat Selim, Julia Korkman, Peter Nynäs, Elina Pirjatanniemi, Jan Antfolk. A review of psycho-legal issues in credibility assessments of asylum claims based on religion. Psychiatry, Psychology and Law 2023, 30 (6) , 760-788. https://doi.org/10.1080/13218719.2022.2116611
  84. Michelle D. Sullivan, William Pinson, Troy Eberhardt, John J. Ross,, Tyler W. Wood. Deposition order and physicochemical process visualization of ink intersections using X‐ray photoelectron spectroscopy imaging for forensic analysis. Surface and Interface Analysis 2023, 55 (11) , 808-821. https://doi.org/10.1002/sia.7246
  85. Itiel E. Dror. The most consistent finding in forensic science is inconsistency. Journal of Forensic Sciences 2023, 68 (6) , 1851-1855. https://doi.org/10.1111/1556-4029.15369
  86. William C. Thompson. Shifting decision thresholds can undermine the probative value and legal utility of forensic pattern-matching evidence. Proceedings of the National Academy of Sciences 2023, 120 (41) https://doi.org/10.1073/pnas.2301844120
  87. Christine J. Ko, Earl J. Glusac. Cognitive bias in pathology, as exemplified in dermatopathology. Human Pathology 2023, 140 , 267-275. https://doi.org/10.1016/j.humpath.2023.03.003
  88. Michael Odei Erdiaw-Kwasie, Matthew Abunyewah, Charles Baah. Corporate social responsibility (CSR) and cognitive bias: A systematic review and research direction. Resources Policy 2023, 86 , 104201. https://doi.org/10.1016/j.resourpol.2023.104201
  89. Mario S Staller, Swen Koerner. A case example of teaching reflective policing to police students. Teaching Public Administration 2023, 41 (3) , 351-366. https://doi.org/10.1177/01447394211067109
  90. Siegfried L. Sporer, Jaume Masip. Millennia of legal content criteria of lies and truths: wisdom or common-sense folly?. Frontiers in Psychology 2023, 14 https://doi.org/10.3389/fpsyg.2023.1219995
  91. Sally F. Kelty, Olivier Ribaux, James Robertson. Identifying the critical skillset of top crime scene examiners: Why this matters and why agencies should develop top performers. WIREs Forensic Science 2023, 5 (5) https://doi.org/10.1002/wfs2.1494
  92. Frances A. Whitehead, Mary R. Williams, Michael E. Sigman. Analyst and machine learning opinions in fire debris analysis. Forensic Chemistry 2023, 35 , 100517. https://doi.org/10.1016/j.forc.2023.100517
  93. Nathalie Bugeja, Cameron Oliver, Nicole McGrath, Jake McGuire, Chunhui Yan, Felicity Carlysle-Davies, Marc Reid. Teaching old presumptive tests new digital tricks with computer vision for forensic applications. Digital Discovery 2023, 2 (4) , 1143-1151. https://doi.org/10.1039/D3DD00066D
  94. Taro Shimizu, Itiel E Dror. History information management strategy for minimising biases and noise for improved medical diagnosis. BMJ Open Quality 2023, 12 (3) , e002367. https://doi.org/10.1136/bmjoq-2023-002367
  95. Jonathan W. Hak. “The pedagogical expert witness: teaching complex science in the courtroom”. Canadian Society of Forensic Science Journal 2023, 56 (3) , 182-189. https://doi.org/10.1080/00085030.2022.2135742
  96. Meghan Prusinowski, Evie Brooks, Cedric Neumann, Tatiana Trejos. Forensic interlaboratory evaluations of a systematic method for examining, documenting, and interpreting duct tape physical fits. Forensic Chemistry 2023, 34 , 100487. https://doi.org/10.1016/j.forc.2023.100487
  97. Radina Stoykova, Katrin Franke. Reliability validation enabling framework (RVEF) for digital forensics in criminal investigations. Forensic Science International: Digital Investigation 2023, 45 , 301554. https://doi.org/10.1016/j.fsidi.2023.301554
  98. Paulina Salazar-Aguilar, Carlos Zaror-Sánchez, Gabriel M. Fonseca. Forensic odontology: Wrong convictions, “bad apples” and “the innocence files”. Journal of Forensic and Legal Medicine 2023, 96 , 102528. https://doi.org/10.1016/j.jflm.2023.102528
  99. John Morgan. Wrongful convictions and claims of false or misleading forensic evidence. Journal of Forensic Sciences 2023, 68 (3) , 908-961. https://doi.org/10.1111/1556-4029.15233
  100. Sara L. Gleasman-DeSimone, . Identifying and Addressing Bias in Nursing Teaching: A Creative Controversy Essay. Nursing Forum 2023, 2023 , 1-6. https://doi.org/10.1155/2023/3459527
Load all citations



    Figure 1. Eight sources of bias that may cognitively contaminate sampling, observations, testing strategies, analysis, and conclusions, even by experts. They are organized in a taxonomy within three categories: starting off at the top with sources relating to the specific case and analysis (Category A), moving down to sources that relate to the specific person doing the analysis (Category B), and at the very bottom sources that relate to human nature (Category C).

References


    This article references 78 other publications.

    1. McCord, B. R.; Gauthier, Q.; Cho, S.; Roig, M. N.; Gibson-Daw, G. C.; Young, B.; Taglia, F.; Zapico, S. C.; Mariot, R. F.; Lee, S. B.; Duncan, G. Anal. Chem. 2019, 91, 673–688, DOI: 10.1021/acs.analchem.8b05318
    2. Barrio, P. A.; Crespillo, M.; Luque, J. A.; Aler, M.; Baeza-Richer, C.; Baldassarri, L.; Carnevali, E.; Coufalova, P.; Flores, I.; Garcia, O.; Garcia, M. A.; Gonzalez, R.; Hernandez, A.; Ingles, V.; Luque, G. M.; Mosquera-Miguel, A.; Pedrosa, S.; Pontes, M. L.; Porto, M. J.; Posada, Y.; Ramella, M. I.; Ribeiro, T.; Riego, E.; Sala, A.; Saragoni, V. G.; Serrano, A.; Vannelli, S. Forensic Sci. Int.: Genet. 2018, 35, 156–163, DOI: 10.1016/j.fsigen.2018.05.005
    3. Butler, J. M.; Kline, M. C.; Coble, M. D. Forensic Sci. Int.: Genet. 2018, 37, 81–94, DOI: 10.1016/j.fsigen.2018.07.024
    4. Bright, J.-A.; Cheng, K.; Kerr, Z.; McGovern, C.; Kelly, H.; Moretti, T. R.; Smith, M. A.; Bieber, F. R.; Budowle, B.; Coble, M. D.; Alghafri, R.; Allen, P. S.; Barber, A.; Beamer, V.; Buettner, C.; Russell, M.; Gehrig, C.; Hicks, T.; Charak, J.; Cheong-Wing, K.; Ciecko, A.; Davis, C. T.; Donley, M.; Pedersen, N.; Gartside, B.; Granger, D.; Greer-Ritzheimer, M.; Reisinger, E.; Kennedy, J.; Grammer, E.; Kaplan, M.; Hansen, D.; Larsen, H. J.; Laureano, A.; Li, C.; Lien, E.; Lindberg, E.; Kelly, C.; Mallinder, B.; Malsom, S.; Yacovone-Margetts, A.; McWhorter, A.; Prajapati, S. M.; Powell, T.; Shutler, G.; Stevenson, K.; Stonehouse, A. R.; Smith, L.; Murakami, J.; Halsing, E.; Wright, D.; Clark, L.; Taylor, D. A.; Buckleton, J. Forensic Sci. Int.: Genet. 2019, 40, 1–8, DOI: 10.1016/j.fsigen.2019.01.006
    5. Cooper, G. S.; Meterko, V. Forensic Sci. Int. 2019, 297, 35–46, DOI: 10.1016/j.forsciint.2019.01.016
    6. Dror, I. E. J. Appl. Res. Mem. Cog. 2016, 5 (2), 121–127, DOI: 10.1016/j.jarmac.2016.03.001
    7. Dror, I. E.; Murrie, D. Psych. Pub. Policy Law 2018, 24 (1), 11–23, DOI: 10.1037/law0000140
    8. Segall, M.; Chadwick, A. Future Med. Chem. 2011, 3 (7), 771–774, DOI: 10.4155/fmc.11.33
    9. Hamnett, H.; Jack, R. Sci. Justice 2019, 59 (4), 380–389, DOI: 10.1016/j.scijus.2019.02.004
    10. Forensic Science Regulator. Contextual Bias in Forensic Toxicology, 2019. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/800561/Lessons_Learnt_May19__L-B03_-final.pdf (accessed June 2020).
    11. Downs, U., Swienton, A. R., Eds.; Ethics in Forensic Science; Academic Press: Waltham, MA, 2012; pp 1–441.
    12. Gannett, C. A130 Ethics in Forensic Science, 2020. https://cci.gosignmeup.com/public/Course/browse?courseid=2552 (accessed June 2020).
    13. Nickerson, R. S. Rev. Gen. Psych. 1998, 2, 175–220, DOI: 10.1037/1089-2680.2.2.175
    14. Bidgood, J. Chemist’s Misconduct Is Likely to Void 20,000 Massachusetts Drug Cases. New York Times, April 18, 2017.
    15. Mettler, K. How a Lab Chemist Went from ‘Superwoman’ to Disgraced Saboteur of More than 20,000 Drug Cases. Washington Post, April 21, 2017.
    16. Thompson, W. C. Southwestern Univ. Law Rev. 2009, 37, 971–994
    17. Kukucka, J.; Kassin, S.; Zapf, P.; Dror, I. E. J. Appl. Res. Mem. Cog. 2017, 6 (4), 452–459, DOI: 10.1016/j.jarmac.2017.09.001
    18. Dror, I. E.; Kukucka, J.; Kassin, S.; Zapf, P. J. Appl. Res. Mem. Cog. 2018, 7 (2), 316–317, DOI: 10.1016/j.jarmac.2018.03.005
    19. Dror, I. E. In The Paradoxical Brain; Kapur, N., Ed.; Cambridge University Press: Cambridge, UK, 2011; pp 177–188.
    20. Shanteau, J. In Advances in Decision Research; Rohrmann, B., Beach, L. R., Vlek, C., Watson, S. R., Eds.; Elsevier: Amsterdam, 1989; pp 203–215.
    21. Soller, J. M.; Ausband, D. E.; Gunther, S. M. PLoS One 2020, 15 (3), e0229762, DOI: 10.1371/journal.pone.0229762
    22. Dror, I. E.; Wertheim, K.; Fraser-Mackenzie, P.; Walajtys, J. J. Forensic Sci. 2012, 57 (2), 343–352, DOI: 10.1111/j.1556-4029.2011.02013.x
    23. Pronin, E.; Lin, D. Y.; Ross, L. Person. Soc. Psych. Bull. 2002, 28, 369–381, DOI: 10.1177/0146167202286008
    24. Zapf, P.; Kukucka, J.; Kassin, S.; Dror, I. E. Psych. Pub. Policy Law 2018, 24 (1), 1–10, DOI: 10.1037/law0000153
    25. Thornton, J. I. J. Forensic Sci. 2010, 55 (6), 1663, DOI: 10.1111/j.1556-4029.2010.01497.x
    26. Wegner, D. M. Psych. Rev. 1994, 101, 34–52, DOI: 10.1037/0033-295X.101.1.34
    27. Steblay, N.; Hosch, H. M.; Culhane, S. E.; McWethy, A. Law Hum. Beh. 2006, 30, 469–492, DOI: 10.1007/s10979-006-9039-7
    28. Zajonc, R. B. Am. Psychol. 1980, 35, 151–175, DOI: 10.1037/0003-066X.35.2.151
    29. Finucane, M. L.; Alhakami, A.; Slovic, P.; Johnson, S. M. J. Behav. Decis. Making 2000, 13, 1–17, DOI: 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S
    30. Damasio, A. R. Descartes’ Error: Emotion, Reason, and the Human Brain; Penguin Books: New York, 2005; pp 1–312.
    31. Jeanguenat, A. M.; Budowle, B.; Dror, I. E. Sci. Justice 2017, 57 (6), 415–420, DOI: 10.1016/j.scijus.2017.07.005
    32. Dror, I. E. Science 2018, 360 (6386), 243, DOI: 10.1126/science.aat8443
    33. Starr, D. Forensics gone wrong: When DNA snares the innocent. Science 2016, DOI: 10.1126/science.aaf4160
    34. Hanna, J.; Valencia, N. DNA Analysis Clears Georgia Man Who Served 17 Years in Wrongful Rape Conviction. CNN, January 10, 2020.
    35. Dror, I. E.; Hampikian, G. Sci. Justice 2011, 51 (4), 204–208, DOI: 10.1016/j.scijus.2011.08.004
    36. A Review of the FBI’s Handling of the Brandon Mayfield Case; Office of the Inspector General, Oversight & Review Division, U.S. Department of Justice, 2006.
    37. Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Catena 2018, 171, 44–53, DOI: 10.1016/j.catena.2018.06.027
    38. Berlin, L. AJR, Am. J. Roentgenol. 2007, 189, 517–522, DOI: 10.2214/AJR.07.2209
    39. Kriegeskorte, N.; Simmons, W. K.; Bellgowan, P. S. F.; Baker, C. I. Nat. Neurosci. 2009, 12, 535–540, DOI: 10.1038/nn.2303
    40. Vul, E.; Kanwisher, N. In Foundational Issues for Human Brain Mapping; Hanson, S., Bunzl, M., Eds.; MIT Press: Cambridge, MA, 2010; pp 71–92.
    41. Richter, D.; Ekman, M.; de Lange, F. P. J. Neurosci. 2018, 38, 7452–7461, DOI: 10.1523/JNEUROSCI.3421-17.2018
    42. Luck, S. J.; Ford, M. A. Proc. Natl. Acad. Sci. U. S. A. 1998, 95 (3), 825–830, DOI: 10.1073/pnas.95.3.825
    43. Simons, D. J.; Chabris, C. F. Perception 1999, 28, 1059–1074, DOI: 10.1068/p281059
    44. Stein, T.; Peelen, M. V. J. Exp. Psychol. Gen. 2015, 144, 1089–1104, DOI: 10.1037/xge0000109
    45. Kok, P.; Brouwer, G. J.; van Gerven, M. A. J.; de Lange, F. P. J. Neurosci. 2013, 33, 16275–16284, DOI: 10.1523/JNEUROSCI.0742-13.2013
    46. de Lange, F. P.; Heilbron, M.; Kok, P. Trends Cognit. Sci. 2018, 22, 764–779, DOI: 10.1016/j.tics.2018.06.002
    47. Gardner, B. O.; Kelley, S.; Murrie, D. C.; Blaisdell, K. N. Forensic Sci. Int. 2019, 297, 236–242, DOI: 10.1016/j.forsciint.2019.01.048
    48. Eeden, C. A. J.; de Poot, C. J.; van Koppen, P. J. J. Forensic Sci. 2019, 64 (1), 120–126, DOI: 10.1111/1556-4029.13817
    49. Morewedge, C. K.; Kahneman, D. Trends Cognit. Sci. 2010, 14, 435–440, DOI: 10.1016/j.tics.2010.07.004
    50. Moorcroft, M.; Davis, J.; Compton, R. G. Talanta 2001, 54, 785–803, DOI: 10.1016/S0039-9140(01)00323-X
    51. Almog, J.; Zitrin, S. In Aspects of Explosive Detection; Marshal, M., Oxley, M., Eds.; Elsevier: Amsterdam, 2009; pp 47–48.
    52. Lissaman, C. Birmingham Pub Bombers Will Probably Never Be Found. BBC News, March 14, 2011.
    53. Lang, S. E. Report of the Motherisk Hair Analysis Independent Review; Ontario Ministry of the Attorney General, Canada, 2015.
    54. Egglin, T. K.; Feinstein, A. R. J. Am. Med. Assoc. 1996, 276 (21), 1752–1755, DOI: 10.1001/jama.1996.03540210060035
    55. Wolfe, J. M.; Horowitz, T. S.; Kenner, N. M. Nature 2005, 435, 439–440, DOI: 10.1038/435439a
    56. Wolfe, J. M.; Horowitz, T. S.; Van Wert, M. J.; Kenner, N. M.; Place, S. S.; Kibbi, N. J. Exp. Psychol. Gen. 2007, 136, 623–638, DOI: 10.1037/0096-3445.136.4.623
    57. Shafir, E. B.; Smith, E. E.; Osherson, D. N. Mem. Cognit. 1990, 18 (3), 229–239, DOI: 10.3758/BF03213877
    58. Simon, D.; Ahn, M.; Stenstrom, D. M.; Read, S. J. Psych. Public Policy Law 2020, DOI: 10.1037/law0000226
    59. Murrie, D. C.; Boccaccini, M. T.; Guarnera, L. A.; Rufino, K. A. Psych. Sci. 2013, 24, 1889–1897, DOI: 10.1177/0956797613481812
    60. Strengthening Forensic Science in the United States: A Path Forward; National Academies Press: Washington, DC, 2009.
    61. Whitman, G.; Koppl, R. Law Prob. Risk 2010, 9, 69–90, DOI: 10.1093/lpr/mgp028
    62. Howard, J. Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine; Springer: New York, 2019.
    63. Cosby, K. S.; Croskerry, P. Acad. Emergency Med. 2004, 11, 1341–1345, DOI: 10.1197/j.aem.2004.07.005
    64. Saposnik, G.; Redelmeier, D.; Ruff, C. C.; Tobler, P. N. BMC Med. Inf. Decis. Making 2016, 16, 138, DOI: 10.1186/s12911-016-0377-1
    65. Dror, I. E.; Mnookin, J. Law Prob. Risk 2010, 9 (1), 47–67, DOI: 10.1093/lpr/mgp031
    66. General Requirements for the Competence of Testing and Calibration Laboratories, 3rd ed.; ISO/IEC 17025; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2017.
    67. Dror, I. E.; Pierce, M. L. J. Forensic Sci. 2020, 65 (3), 800–808, DOI: 10.1111/1556-4029.14265
    68. Dror, I. E.; Langenburg, G. J. Forensic Sci. 2019, 64 (1), 10–15, DOI: 10.1111/1556-4029.13854
    69. Gok, K.; Atsan, N. Intern. J. Business Soc. Res. 2016, 6 (3), 38–47, DOI: 10.18533/ijbsr.v6i3.936
    70. Neal, T. PLoS One 2016, 11 (4), e0154434, DOI: 10.1371/journal.pone.0154434
    71. Miller, A. K.; Rufino, K. A.; Boccaccini, M. T.; Jackson, R. L.; Murrie, D. C. Assessment 2011, 18 (2), 253–260, DOI: 10.1177/1073191111402460
    72. Griffin, D.; Ross, L. In Advances in Experimental Social Psychology; Zanna, M. P., Ed.; Academic Press: San Diego, CA, 1991; pp 319–359.
    73. Ross, L.; Greene, D.; House, P. J. Exp. Soc. Psych. 1977, 13, 279–301, DOI: 10.1016/0022-1031(77)90049-X
    74. Oppenheimer, D. M. Trends Cognit. Sci. 2008, 12, 237–241, DOI: 10.1016/j.tics.2008.02.014
    75. Goldstein, D. G.; Gigerenzer, G. Psychol. Rev. 2002, 109, 75–90, DOI: 10.1037/0033-295X.109.1.75
    76. Robertson, C., Kesselheim, A., Eds.; Blinding as a Solution to Bias: Strengthening Biomedical Science, Forensic Science, and Law; Academic Press: New York, 2016; pp 1–388.
    77. Maude, J. Diagnosis 2014, 1, 107–109, DOI: 10.1515/dx-2013-0009
    78. Barondess, J. A., Carpenter, C. C., Eds.; Differential Diagnosis; Lea & Febiger: Philadelphia, PA, 1994; pp 1–800.