
Cognitive and Human Factors in Expert Decision Making: Six Fallacies and the Eight Sources of Bias

Cite this: Anal. Chem. 2020, 92, 12, 7998–8004
Publication Date (Web):June 8, 2020
https://doi.org/10.1021/acs.analchem.0c00704

Copyright © 2020 American Chemical Society. This publication is licensed under these Terms of Use.

  • Open Access
  • Editors' Choice


Abstract

Fallacies about the nature of biases have overshadowed a proper cognitive understanding of biases and their sources, an understanding that in turn leads to ways of minimizing their impact. Six such fallacies are presented: that bias is an ethical issue, that it only applies to “bad apples”, that experts are impartial and immune, that technology eliminates bias, the bias blind spot, and the illusion of control. Then, eight sources of bias are discussed and conceptualized within three categories: (A) factors that relate to the specific case and analysis, which include the data, reference materials, and contextual information; (B) factors that relate to the specific person doing the analysis, which include past experience base rates, organizational factors, education and training, and personal factors; and lastly, (C) the cognitive architecture and human nature that impact all of us. These factors can impact what the data are (e.g., how data are sampled and collected, or what is considered noise and therefore disregarded), the actual results (e.g., decisions on testing strategies, how analysis is conducted, and when to stop testing), and the conclusions (e.g., interpretation of the results). The paper concludes with specific measures that can minimize these biases.

Scientists and experts are expected to make observations and draw impartial conclusions based on the relevant data. Therefore, we tend to focus on the data and methods used, as if they exist in isolation from the human decision maker. For example, a recent paper reviews forensic DNA analysis (1) and provides a thorough review of the DNA methods, but does not give sufficient attention to the fact that DNA analysis hinges on the humans conducting it, and to how their biases may impact the DNA results. Indeed, a flurry of recent publications demonstrates the importance and influence of the human impact on DNA analysis. (2−4)
Such cognitive biases are not limited to DNA analysis and have been demonstrated in many domains (e.g., fingerprinting and other forensic science disciplines, (5,6) forensic psychology, (7) as well as drug detection and discovery (8)). Biases can impact the actual observation and perception of the data, testing strategies, as well as how the results are then interpreted and how conclusions are reached. (6) Even one of the most objective domains, toxicology, has now been shown to be susceptible to cognitive bias, (9) and such biases have caused errors in real toxicology casework (e.g., determining a wrong testing strategy in a post-mortem toxicology case (10)).
Thus, the impact of bias is not limited to the important interpretation of the results, but it has a much broader and deeper scope as it impacts what the data are (e.g., what is collected, what is considered as noise and disregarded, etc.) and the results of the analyses themselves (e.g., testing strategy as to what tests and types of analysis to conduct, how they are carried out and by whom, and when to stop testing).
Many of the challenges in dealing with these biases arise from the nature of cognitive biases, their various sources, and fallacies about them, which are described in detail below.

Six Fallacies of Bias


Before delving into the eight sources of cognitive biases, it is important to present and dispel commonly held fallacies. If we do not recognize and dispel these fallacies, it becomes practically impossible to move forward because these fallacies minimize (if not dismiss altogether) the very existence of the biases. The first step in dealing with biases is to acknowledge their existence and the impact they may have on hardworking, dedicated, and competent experts.

First Fallacy: Ethical Issues

Many incorrectly think that these biases are an ethical issue of corrupt individuals, or even malicious acts by unscrupulous people, whereas cognitive bias actually impacts honest and dedicated examiners. Time and again, cognitive biases are presented under the umbrella of ethical issues in books, (11) conferences, and training (e.g., “ethics in cognitive bias” (12)). This fallacy arises from a basic misunderstanding of what cognitive bias is all about. It is not a matter of dishonesty, intentional discrimination, or a deliberate act arising from underlying ethical issues. (13) It is true that there are cases of intentional professional misconduct, e.g., when a chemist in a drug laboratory in Boston forged tests. (14,15) However, cognitive bias is not about such ethical issues of personal character, integrity, or intentional misconduct.

Second Fallacy: Bad Apples

Often when errors and biases are found, the experts involved are blamed for the errors, rather than acknowledging any systemic issues, such as the human element in the analysis and the general susceptibility to bias. (16) Surely there are cases where error is due to examiners’ lack of competence, but these are relatively easy to detect and fix. The kind of biases discussed here, implicit cognitive biases, are widespread, are not due to incompetence, and are therefore harder to detect.

Third Fallacy: Expert Immunity

There is a widely held but incorrect belief that experts are impartial and immune to biases. (17) However, the truth of the matter is that no one is immune to bias, not even experts. (18) In fact, in many ways, experts are more susceptible to certain biases. The very making of expertise creates and underpins many of the biases. (19) For example, experience and training make experts engage in more selective attention, use chunking and schemas (typical activities and their sequence), and rely on heuristics and expectations arising from past base rate experiences, utilizing a whole range of top-down cognitive processes which create a priori assumptions and expectations.
These cognitive processes enable experts to often make quick and accurate decisions. However, these very mechanisms also create bias that can lead them in the wrong direction. Regardless of the utility (and vulnerabilities) of such cognitive processing in experts, it does not make experts immune to bias; indeed, expertise and experience may actually increase (or even cause) certain biases. Experts across domains are subject to cognitive vulnerabilities. (20)
Although experts tend to be very confident (sometimes even overconfident), more experienced experts can actually perform worse than novices. This has been demonstrated, for example, in environmental ecology, where data collection is critical and underpinned by the ability to correctly identify and collect samples, and where novices actually outperformed experts. (21)

Fourth Fallacy: Technological Protection

People think that the mere use of technology, instrumentation, automation, or artificial intelligence eliminates bias. These can reduce bias, but even when they are in use, human biases are still at play because these systems are built, programmed, operated, and interpreted by humans. There is a danger that people will incorrectly believe that using technology is a guaranteed protection from being susceptible to and affected by bias. Furthermore, technology can even introduce biases, be it mass spectral library matching software, Automated Fingerprint Identification Systems (AFIS), or other technological devices. (22)

Fifth Fallacy: Bias Blind Spot

Experts themselves are often not aware of their biases and therefore fall for the fallacy that they are not biased. One must remember that these are implicit cognitive biases, not intentional discriminatory biases. The bias blind spot (23) is well documented and has been demonstrated in a variety of domains, including forensic science (17) and forensic psychology. (24) While it is relatively easy to see bias in others, we are often blind to our own biases. Research has found that 70% of forensic scientists now acknowledge that cognitive bias is a cause for concern in forensic science as a whole, but only 52% think it is a concern in their own specific forensic domain, and just 25% think it is relevant to them personally, reflecting the hallmarks of the bias blind spot. (17,23)

Sixth Fallacy: Illusion of Control

Even when experts are made aware of and acknowledge their biases, they nevertheless think they can overcome them by mere willpower. (25) This is the illusion of control. Combating and countering these biases requires taking specific steps—willpower alone is inadequate to deal with the various manifestations of bias.
In fact, trying to deal with bias by the illusion of control may actually increase the bias, due to “ironic processing” or “ironic rebound”. (26) Hence, trying to minimize bias by willpower makes you think of it more and actually increases its effect. This is similar to a judge instructing jurors to disregard certain evidence. By doing so, the judge is actually making the jurors notice this evidence even more. (27)
The belief in such fallacies (see Table 1) prevents dealing with the biases because it dismisses their power and even their very existence. We need to acknowledge the impact of biases and understand their sources, so we can take appropriate measures when needed and when possible to combat their effects.
Table 1. Six Fallacies about Cognitive Bias Commonly Held by Experts
1. Ethical Issues: It only happens to corrupt and unscrupulous individuals; it is an issue of morals and personal integrity, a question of personal character.
2. Bad Apples: It is a question of competency and happens to experts who do not know how to do their job properly.
3. Expert Immunity: Experts are impartial and are not affected because bias does not impact competent experts doing their job with integrity.
4. Technological Protection: Using technology, instrumentation, automation, or artificial intelligence guarantees protection from human biases.
5. Blind Spot: Other experts are affected by bias, but not me; it is the other experts who are biased.
6. Illusion of Control: I am aware that bias impacts me, and therefore I can control and counter its effect; I can overcome bias by mere willpower.

Eight Sources of Bias


Cognitive biases are widespread, a ubiquitous phenomenon in many guises. (13) Figure 1 illustrates eight sources of bias in expert decision making. It shows that cognitive bias can arise from a variety of sources, which are categorized into three groups. Category A relates to the specific case—something about this case causes bias in how data are perceived, analyzed, and interpreted. Other sources of cognitive bias, Category B, have nothing to do with the specific case, but they arise from factors relating to the specific person doing the work—something about them (e.g., their experience, their personality, their working environment, their motivation, etc.) causes the bias. Further sources of cognitive bias, Category C, arise from human nature, the very cognitive architecture of the human brain that we all share, regardless of the specific case or the specific person doing the analysis.

Figure 1

Figure 1. Eight sources of bias that may cognitively contaminate sampling, observations, testing strategies, analysis, and conclusions, even by experts. They are organized in a taxonomy within three categories: starting off at the top with sources relating to the specific case and analysis (Category A), moving down to sources that relate to the specific person doing the analysis (Category B), and at the very bottom sources that relate to human nature (Category C).

It is important to understand these different sources of bias so we can more easily recognize and expose them, and most importantly, each of the different sources of biases requires specific countermeasures.

(1) The Data

The first source of cognitive bias within the factors that relate to the specific analysis is the data. How can data cause bias? Well, it depends on the data. Some data, such as fingermarks, do not, per se, cause bias, as they convey no information beyond the friction ridge impressions. However, with other types of data (such as in the analysis of voice, handwriting, blood spatter, and bitemarks), the data can contain potentially biasing information. For example, in voice analysis, the content, or even the tone and screaming, can reveal a brutal assault. Similarly, gang-rape mixture DNA evidence can evoke emotions which can impact decision making. (28−30)

(2) Reference Materials

Reference materials can bias how the data are perceived and interpreted. This applies to a variety of data, including DNA, fingerprinting, and any decisions that are based on making comparisons. If DNA evidence interpretation is influenced by the known reference material of the “target” suspect (i.e., their DNA profile) so as to better fit them, then the interpretation of the biological material from the crime scene is biased. (31)
Rather than the actual evidence being the driver of the decision making process—where evidence is interpreted based on the data it contains—the suspect’s profile is driving the decision making. Hence, rather than going from the evidence to the suspect (from data to theory), the reference materials cause examiners to go backward (circular reasoning) from the target/suspect to the evidence, thus biasing how the data are interpreted. (32)
Take the case of Kerry Robinson who was wrongly convicted of rape based on mistaken analysis of DNA evidence. (33) The two DNA examiners in the case made an error as they seem to have been working backward from Kerry Robinson’s known DNA profile to the latent evidence. Thus, the suspect “target” was driving the analysis rather than the evidence itself. He was exonerated and released from jail after serving 17 years. (34) Indeed, when the same DNA evidence was later analyzed properly, different results were obtained. (35)
The problem with “going backward”, circular reasoning, is not limited to DNA and can be found in other comparative domains, such as fingerprinting, handwriting, firearms, and other domains (for a review, see ref (5)), where the reference materials of the known suspect influence the interpretation of the data from the crime scene. This has been exemplified when the FBI (as well as an expert hired by the defense) erroneously identified Mayfield as the Madrid bomber. (36) Again, the “target” (in this case, the fingerprints of the suspect) was driving the analysis, resulting in an erroneous identification. For example, in the analysis, a signal in the evidence was perceived as noise and disregarded because it did not match the target—a typical error emerging from going backward and letting the target or expected results drive the analysis.
Furthermore, this source of bias is not limited to circumstances that have a “target” suspect per se, but can also arise from pre-existing templates and patterns, such as in the interpretation of blood pattern analysis or a crime scene. It can even impact what color is observed. Hence, bias is not limited to the interpretation of the appearance of a color (e.g., a pink color in the Griess test incorrectly interpreted as meaning the suspects handled dynamite; see below), but can even impact what color is observed in the first place when Munsell color charts are used as reference templates. (37)
This bias goes beyond impacting only the conclusion and interpretation of what the presence of the color means, because it can also bias and impact the observation of what the color itself is (for this important distinction between biases impacting observations vs conclusions, see the Hierarchy of Expert Performance (HEP)). (6)
These types of cognitive biases are therefore not limited to a target or reference; they can even be caused by having a theory, a chart, or a pattern in mind, and have been shown to impact a variety of expert domains. (38−40)
This bias can also occur, for example, in a messy mass chromatogram or infrared spectrum where an analyst might “pick out” the peaks they are looking for and ignore the others. This is not intentional and happens without awareness. The expectation biases cognitive resources and attention toward a certain stimulus or signal (while suppressing and ignoring others) (41) and impacts sensory response to predictable object stimuli throughout the ventral visual pathway. (42,43) It also biases perception sensitivity for those targets, (44) changing sensory representations in the visual cortex, (45) and hence impacts what we actually perceive and how. (46) Just as it can bias the visual search of a mass chromatogram or infrared spectrum, it can impact detection in radiography (38) and many other domains. In all of these examples, the human examiner is driven by a “target” they expect (or want) rather than by the actual data.

(3) Contextual Information

Experts are often exposed to irrelevant information. In the forensic domain, for example, such information may be that the suspect confessed to the crime, that they have been identified by eyewitnesses and other lines of evidence, or that the suspect has a criminal record. (47) Even knowing the name of the suspect may be suggestive of a specific race evoking biases and stereotypes. These all cause expectations that can impact not only the interpretation of the results obtained from the analysis, but also the analysis itself because the expectations impact the detection of what goes into the analysis as well as testing strategies. This source of bias is not derived from a target generated by the reference materials (see above) but from contextual information that can be irrelevant to the task carried out by the analyst.
In toxicology, for example, contextual information can bias testing strategies. Consider, for instance, when a post-mortem case is provided with contextual information, such as “drug overdose” and/or that “the deceased was known to have a history of heroin use.” Such information can impact the established testing strategies, such as going straight to the confirmation and quantification of a limited range of opiate-type drugs (morphine, codeine, 6-monoacetylmorphine, and other heroin markers), without running the other standard testing, such as an immunoassay or looking for other opioids that may have been present (e.g., fentanyl). Hence, the contextual information caused a confirmation bias approach and deviation from the standard testing and screening protocols. This has indeed caused errors in toxicology testing cases. (10)
In the forensic context, for example, irrelevant contextual information has been shown to impact data collection at the crime scene, (48) as well as to bias laboratory DNA analysis (35) (for reviews, see refs (5,6)). Similarly, standard testing strategies, data collection, and sampling can be biased by contextual information, for example, when the laboratory conducts analyses knowing that a Drug Recognition Expert (DRE) evaluated a person as being impaired and under the influence of a stimulant even though there was no alcohol in their blood, or when a dog signals the presence of drugs, or when any other contextual information causes an expectation of what the results should be.
Contextual expectations not only impact data collection and testing strategies but also the interpretation and conclusions of the analysis. For example, “a poor-quality chromatographic or mass spectral match for a drug could be consciously or subconsciously “upgraded” to a positive result based on the expectation that the sample should be positive for that drug. Conversely, a similarly poor-quality match might be “downgraded” to negative if the analyst is not expecting the drug to be present, or its presence ‘does not fit the circumstances of the case’. In both situations, the context could give the illusion of a stronger basis for the decision than is warranted by the data” (ref (9), page 381). Cognitive biases arise when task-irrelevant context causes some aspect of analysis to be overweighted, underweighted, or neglected (e.g., not perceived, determined to be noise, an anomaly, or an outlier). This does not only happen in subjective judgements but can also bias even established procedures, (10) and criteria for accepting evidence and proper judgment. (49)
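The safeguard implied by this passage is to fix the acceptance criterion before the context can act on it. The sketch below illustrates the idea; the threshold value, function name, and match-factor numbers are all hypothetical, invented purely for illustration:

```python
# Illustrative sketch (invented numbers): a pre-registered acceptance
# criterion leaves no room for context to "upgrade" or "downgrade" a match.
MATCH_THRESHOLD = 0.80  # hypothetical minimum spectral match factor

def context_free_call(match_factor):
    """Accept or reject a library match on the data alone."""
    return "positive" if match_factor >= MATCH_THRESHOLD else "negative"

# A poor-quality match stays negative regardless of what the case context
# suggests the sample "should" contain.
print(context_free_call(0.62))  # negative
print(context_free_call(0.91))  # positive
```

In practice the criterion itself must be validated, but the point stands: a decision rule set in advance, and applied blind to the case context, cannot be bent by expectation.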
The problem with task-irrelevant contextual information is that it can cause many kinds of biases that impact analysis in many different ways. Another impact of bias can be overlooking or underweighting the absence of data, not properly confirming results, or not considering alternatives. Consider, for example, how biasing terrorism-related contextual information can result in a false positive identification of dynamite. The Griess test, a presumptive color test for explosives, (50) was positive for nitroglycerine. However, a positive result can also be produced by nitrocellulose in a range of innocent, non-terrorism-related products. Nevertheless, because of the contextual information, it was wrongly concluded that the appearance of a pink color meant the suspects handled dynamite, when actually the traces were generated from an old pack of playing cards that they had been shuffling. (51) The suspects were wrongfully convicted and were sentenced to life imprisonment, only to be exonerated and released from jail after serving 16 years following a review of the testing used to obtain their original convictions. (52) Such errors happen when analysts are biased to accept results that are expected or wanted, without carrying out proper confirmation or considering alternatives.
Another example is hair-strand drug and alcohol testing. Even though there is a known false-positive problem with immunoassays, and therefore results must be confirmed by another, more specific technique, these confirmations were not always properly carried out when results were in line with contextual information. (53)
It is important to emphasize that task-irrelevant contextual information biases scientists and experts, and it can do so at an unconscious level; they may not be aware of the impact. The expectation biases what and how information is represented and processed in the brain. (41−46) These biases impact experts and cannot be properly controlled by mere willpower (see the six bias fallacies above).

(4) Base Rate

An important asset that experts bring to their work is their experience from previous cases. However, such experience brings expectations to new cases that are not derived from the specific case or analysis at hand, but can nevertheless still impact their interpretation. Hence, the sampling and analysis being conducted are impacted by factors that have nothing to do with the case at hand, but rather with expected base rates generated from previous, unrelated cases, which influence how this case is conducted.
For example, when the cause of death is cerebral hypoxia caused by hanging, then the manner of death is most often suicide. In contrast, when cerebral hypoxia is caused by strangulation, then the manner of death is most often homicide. This is not necessarily the case since, although rarely, hanging can be a result of a homicide and strangulation can be a suicide. However, the base rate of associations between the cause and manner of death can bias the interpretation and determination of manner of death. Such base rate biases are common in many domains from medical diagnosis to security X-rays at airports. (54)
The biasing effects of base rates are not limited to how the results of the analysis are interpreted. They can impact other stages of the analysis, even the sampling and data collection, as well as the detection of relevant marks, items, or signals, or even verification. For example, low target prevalence base rate bias shows that if an item was rare or uncommon in past experience, then observers are more likely to miss it in the future. (55,56) Hence, even when observers are searching for an item, mark, or signal, base rate can bias their search: observers are more likely to miss relatively rare and uncommon items, signals, or objects even when they are present.
Base rate bias derives from expectations generated from past similar cases. (46) The issue is that this case is biased because its analysis is actually based on other cases. The crux of the bias is that perception and decisions are not based on the case itself. This type of bias is even more potent when the similarity to past cases is superficial and more in the eye of the beholder than in reality. (57)
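The pull of the base rate in the hanging/strangulation example above can be made concrete with a Bayes' rule calculation. The numbers below are invented for illustration and are not from the paper or from casework statistics:

```python
# Hypothetical illustration of base-rate dominance (all numbers invented).
# Suppose 90% of hanging deaths in past casework were suicides (the base
# rate), and the case-specific evidence is only weakly diagnostic between
# suicide and homicide. Bayes' rule shows the prior doing most of the work.

def posterior(prior, likelihood_h1, likelihood_h2):
    """P(H1 | evidence) for two mutually exclusive, exhaustive hypotheses."""
    joint_h1 = prior * likelihood_h1
    joint_h2 = (1 - prior) * likelihood_h2
    return joint_h1 / (joint_h1 + joint_h2)

base_rate_suicide = 0.90         # prior from past hanging cases
p_evidence_if_suicide = 0.5      # weakly diagnostic case evidence
p_evidence_if_homicide = 0.4

p = posterior(base_rate_suicide, p_evidence_if_suicide, p_evidence_if_homicide)
print(f"P(suicide | evidence) = {p:.3f}")  # prints 0.918
```

Such a calculation is legitimate when the prior genuinely applies; the bias the paper describes arises when the examiner imports a base rate from superficially similar past cases instead of evaluating the case at hand.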

(5) Organizational Factors

Organizational factors that can cause bias are many and varied, and have been well documented in a variety of domains. When it comes to DNA and other forensic evidence, where analysis and work is often conducted within the adversarial legal system, cognitive bias may emerge from an allegiance effect and myside bias. (58) Indeed, a study showed that when forensic experts are presented with identical data, they nevertheless reach conclusions biased toward the side that retained them (59)—an adversarial allegiance and myside bias. These are implicit biases, not explicit partiality when one side is openly favored over the other.
Many forensic science laboratories are part of law enforcement (or even part of the DA prosecution office). Such organizational influences have been recognized as biasing by the National Academy of Sciences Report, calling for “removing all public forensic laboratories and facilities from the administrative control of law enforcement agencies or prosecutors’ offices” (ref (60), Recommendation #4, page 24). The point is that the organizational factors and administration surrounding forensic science induce biases. (61)
The impact of organizational factors applies to any and every laboratory—laboratories work within a variety of contexts, structures, and frameworks that can bias their work. For example, laboratories have clear hierarchies and dependencies. If a senior person “signs off” on reports or analyses, there is a danger of “writing what that person wants to read” and a lack of challenge to their scientific decisions. Thus, science is muddled with managerial authority and other organizational pressures. (62,63)
Other organizational factors relate to time pressure, expectations to reach certain results, stress, budget controls, pressure to obtain publications and other targets, and a whole range of organizational factors that can impact the work carried out in laboratories and other professional settings.

(6) Education and Training

Education and training play an important role in how work is conducted—for example, in whether forensic examiners see their role more as supporting the police or as scientists. When approaching a case, training and education may instil the pursuit of a single hypothesis vs examining multiple hypotheses, considering alternative hypotheses (including scenarios proposed by the opposing side), conducting differential diagnosis, considering categorical decisions (such as “match” and “nonmatch”, often used in fingerprinting and firearms) vs using statistics and other methods to determine the strength of the evidence, etc. Digital forensics, firearms, fingerprinting, and many forensic domains have actually grown out of police work with minimal-to-no proper education and training in science.

(7) Personal Factors

Many personal factors impact biases and decision making. These include motivation, personal ideology, and beliefs. Furthermore, some people are greater risk takers and others are more risk averse, and people also vary in their tolerance of ambiguity. (64) Other individual differences between people can bias results; for example, tests that use color can be biased because people differ in what color they perceive when looking at the same item. (37)
In areas where there is more objective quantification and instrumentation, these factors are minimized. However, in areas where the human examiner has a greater role in deciding how to collect, sample, and interpret the data, and where there is subjectivity in evaluating the data and conclusions, such personal factors play a greater role in how work is carried out.
Even when technology is used, human biases are still at play. Technology cannot be totally objective, as humans are involved in constructing and operating the technology, as well as calibrating it, maintaining it, interpreting the results, and deciding if and what action to take. (65) Indeed, ISO (International Organization for Standardization) standard 17025:2017 (general requirements for the competence of testing and calibration laboratories, which rely more on instrumentation and objective quantification) (66) now follows the standard for the more subjective laboratory domains, ISO 17020, and includes specific requirements for impartiality and freedom from bias. Hence, it acknowledges the role of the human examiner even when quantification and instrumentation are used. It recognizes that even the use of instrumentation does not guarantee freedom from bias. (67)
Other personal factors that can cause bias in decisions include the need for closure that can result in premature decisions or opting to reach inconclusive decisions, (68) how people respond to stress and fatigue, personality, and a whole range of personal factors that can impact expert decision making. (69−71)

(8) Human and Cognitive Factors, and the Human Brain

The workings of our brain create architectural and capacity constraints that do not allow it to process all the incoming information. The brain therefore engages in a variety of processes (mainly known as “top-down”) to make sense of the world and data around us. The human mind is not a camera; the active nature of human cognition means that we do not see the world “as it is.”
Beyond the many cognitive processes and the wiring of the human brain that can cause biases, there are biasing effects related to social interaction, in-group and availability biases, processing fluency, and other biasing influences that impact all of us. (72−75)

Snowball and Cascade Bias


Bias does not impact only the individual in isolation or just one aspect of the work; often the bias cascades from one person to another and from one aspect of the work to another, influencing different elements of an investigation. As people and various aspects are influenced, they then influence others, turning from influenced to influencers and perpetuating the bias. Thus, biases not only cascade but also gather momentum and snowball. (32)

Overcoming Bias


First, we need to acknowledge the existence of bias and move beyond believing the fallacies about its nature. When people have a bias blind spot or think that as experts they are immune to it or that by mere willpower they can overcome it (see the six bias fallacies in Table 1), then biases only perpetuate.
Second, as a general principle to combat bias, we need to take actions that will cause us to focus solely on the relevant data and not work backward. These need to be part of ongoing training and laboratory procedures. Accreditation to the appropriate standards and certification in the appropriate discipline may not solve the problems, but they do force laboratories to document procedures. External scrutiny can also be very helpful for illuminating areas of bias, especially given that ISO standards (e.g., 17020 and 17025 (66)) specifically require that steps are taken to ensure freedom from bias. (67)
Third, specifically, we can combat various sources of bias by the following:
(A)

Using blinding and masking techniques that prevent exposure to task-irrelevant information. (76)

(B)

Using methods, such as Linear Sequential Unmasking (LSU), to control the sequence, timing, and linearity of exposure to information, so as to minimize “going backward” and being biased by the reference materials. (32)

(C)

Using case managers who screen and control what information is given to whom, and when.

(D)

Using blind, double-blind, and proper verification procedures when possible.

(E)

Rather than having one “reference target” or hypothesis, using a “lineup” of competing and alternative conclusions and hypotheses.

(F)

Adopting a differential diagnosis approach, where all plausible conclusions and their probabilities are presented, rather than a single conclusion. (77,78)
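The information-management measures above (in particular, the case manager in C and the controlled sequence of exposure in B) amount to a simple workflow. The following Python sketch is purely illustrative, and the class and item names are hypothetical, not part of any cited LSU implementation: a hypothetical case manager filters out task-irrelevant context and releases information in a fixed order, so the evidence is documented before the reference material is seen.

```python
# Illustrative sketch only: a hypothetical "case manager" that withholds
# task-irrelevant information and releases case materials in a fixed
# sequence (evidence before reference), in the spirit of LSU.

class CaseManager:
    # Stages must be consumed in order, so the analyst cannot
    # "work backward" from the reference material.
    STAGES = ("evidence", "reference")

    def __init__(self):
        self._items = {stage: [] for stage in self.STAGES}
        self._released = set()

    def add(self, stage, item, task_relevant=True):
        if not task_relevant:
            return  # task-irrelevant information is never passed on
        self._items[stage].append(item)

    def release(self, stage):
        earlier = self.STAGES[: self.STAGES.index(stage)]
        missing = [s for s in earlier if s not in self._released]
        if missing:
            raise RuntimeError(f"must document {missing} before seeing {stage!r}")
        self._released.add(stage)
        return list(self._items[stage])


mgr = CaseManager()
mgr.add("evidence", "trace sample profile")
mgr.add("reference", "suspect reference profile")
mgr.add("evidence", "detective's hunch", task_relevant=False)  # filtered out

print(mgr.release("evidence"))   # ['trace sample profile']
print(mgr.release("reference"))  # ['suspect reference profile']
```

The key design point is that the ordering constraint is enforced by the workflow itself rather than left to the analyst's willpower, consistent with the illusion-of-control fallacy discussed above.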

Summary and Conclusions

Biases, often without our awareness, impact how we sample and perceive data, how we decide on testing strategies, and how we interpret results. People’s beliefs in various fallacies about bias prevent a proper understanding of bias, which in turn precludes taking steps to fight it. Understanding and acknowledging biases enables discussion of their sources and the development of means to minimize their impact.
Six fallacies are presented (Table 1): widely held beliefs about biases that prevent a correct conceptualization of bias. Then, sources of bias are presented (Figure 1), which emerge from the specific case being analyzed, the specific person doing the analysis, and basic human nature that impacts us all. By presenting these fallacies and sources of bias, the hope is that people will reflect on their own practices and consider whether cognitive biases may unwittingly influence their work. Discussions of bias often encounter defensive responses; few want to consider and acknowledge their own biases, let alone research them. However, it is essential to do so if we are to combat and minimize bias. It is the hope of the author that this paper contributes to that direction.

Acknowledgments

I want to thank Hilary Hamnett, Nikolas P. Lemos, Joseph Almog, Roderick Kennedy, and anonymous reviewers for their helpful comments on an earlier version of this perspective.

References

This article references 78 other publications.

  1. McCord, B. R.; Gauthier, Q.; Cho, S.; Roig, M. N.; Gibson-Daw, G. C.; Young, B.; Taglia, F.; Zapico, S. C.; Mariot, R. F.; Lee, S. B.; Duncan, G. Anal. Chem. 2019, 91, 673–688, DOI: 10.1021/acs.analchem.8b05318
  2. Barrio, P. A.; Crespillo, M.; Luque, J. A.; Aler, M.; Baeza-Richer, C.; Baldassarri, L.; Carnevali, E.; Coufalova, P.; Flores, I.; Garcia, O.; Garcia, M. A.; Gonzalez, R.; Hernandez, A.; Ingles, V.; Luque, G. M.; Mosquera-Miguel, A.; Pedrosa, S.; Pontes, M. L.; Porto, M. J.; Posada, Y.; Ramella, M. I.; Ribeiro, T.; Riego, E.; Sala, A.; Saragoni, V. G.; Serrano, A.; Vannelli, S. Forensic Sci. Int.: Genet. 2018, 35, 156–163, DOI: 10.1016/j.fsigen.2018.05.005
  3. Butler, J. M.; Kline, M. C.; Coble, M. D. Forensic Sci. Int.: Genet. 2018, 37, 81–94, DOI: 10.1016/j.fsigen.2018.07.024
  4. Bright, J.-A.; Cheng, K.; Kerr, Z.; McGovern, C.; Kelly, H.; Moretti, T. R.; Smith, M. A.; Bieber, F. R.; Budowle, B.; Coble, M. D.; Alghafri, R.; Allen, P. S.; Barber, A.; Beamer, V.; Buettner, C.; Russell, M.; Gehrig, C.; Hicks, T.; Charak, J.; Cheong-Wing, K.; Ciecko, A.; Davis, C. T.; Donley, M.; Pedersen, N.; Gartside, B.; Granger, D.; Greer-Ritzheimer, M.; Reisinger, E.; Kennedy, J.; Grammer, E.; Kaplan, M.; Hansen, D.; Larsen, H. J.; Laureano, A.; Li, C.; Lien, E.; Lindberg, E.; Kelly, C.; Mallinder, B.; Malsom, S.; Yacovone-Margetts, A.; McWhorter, A.; Prajapati, S. M.; Powell, T.; Shutler, G.; Stevenson, K.; Stonehouse, A. R.; Smith, L.; Murakami, J.; Halsing, E.; Wright, D.; Clark, L.; Taylor, D. A.; Buckleton, J. Forensic Sci. Int.: Genet. 2019, 40, 1–8, DOI: 10.1016/j.fsigen.2019.01.006
  5. Cooper, G. S.; Meterko, V. Forensic Sci. Int. 2019, 297, 35–46, DOI: 10.1016/j.forsciint.2019.01.016
  6. Dror, I. E. J. Appl. Res. Mem. Cogn. 2016, 5 (2), 121–127, DOI: 10.1016/j.jarmac.2016.03.001
  7. Dror, I. E.; Murrie, D. Psychol. Public Policy Law 2018, 24 (1), 11–23, DOI: 10.1037/law0000140
  8. Segall, M.; Chadwick, A. Future Med. Chem. 2011, 3 (7), 771–774, DOI: 10.4155/fmc.11.33
  9. Hamnett, H.; Jack, R. Sci. Justice 2019, 59 (4), 380–389, DOI: 10.1016/j.scijus.2019.02.004
  10. Forensic Science Regulator. Contextual Bias in Forensic Toxicology, 2019. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/800561/Lessons_Learnt_May19__L-B03_-final.pdf (accessed June 2020).
  11. Downs, U., Swienton, A. R., Eds. Ethics in Forensic Science; Academic Press: Waltham, MA, 2012; pp 1–441.
  12. Gannett, C. A130 Ethics in Forensic Science, 2020. https://cci.gosignmeup.com/public/Course/browse?courseid=2552 (accessed June 2020).
  13. Nickerson, R. S. Rev. Gen. Psychol. 1998, 2, 175–220, DOI: 10.1037/1089-2680.2.2.175
  14. Bidgood, J. Chemist’s Misconduct Is Likely to Void 20,000 Massachusetts Drug Cases. New York Times, April 18, 2017.
  15. Mettler, K. How a Lab Chemist Went from ‘Superwoman’ to Disgraced Saboteur of More than 20,000 Drug Cases. Washington Post, April 21, 2017.
  16. Thompson, W. C. Southwestern Univ. Law Rev. 2009, 37, 971–994.
  17. Kukucka, J.; Kassin, S.; Zapf, P.; Dror, I. E. J. Appl. Res. Mem. Cogn. 2017, 6 (4), 452–459, DOI: 10.1016/j.jarmac.2017.09.001
  18. Dror, I. E.; Kukucka, J.; Kassin, S.; Zapf, P. J. Appl. Res. Mem. Cogn. 2018, 7 (2), 316–317, DOI: 10.1016/j.jarmac.2018.03.005
  19. Dror, I. E. In The Paradoxical Brain; Kapur, N., Ed.; Cambridge University Press: Cambridge, UK, 2011; pp 177–188.
  20. Shanteau, J. In Advances in Decision Research; Rohrmann, B., Beach, L. R., Vlek, C., Watson, S. R., Eds.; Elsevier: Amsterdam, 1989; pp 203–215.
  21. Soller, J. M.; Ausband, D. E.; Gunther, S. M. PLoS One 2020, 15 (3), e0229762, DOI: 10.1371/journal.pone.0229762
  22. Dror, I. E.; Wertheim, K.; Fraser-Mackenzie, P.; Walajtys, J. J. Forensic Sci. 2012, 57 (2), 343–352, DOI: 10.1111/j.1556-4029.2011.02013.x
  23. Pronin, E.; Lin, D. Y.; Ross, L. Pers. Soc. Psychol. Bull. 2002, 28, 369–381, DOI: 10.1177/0146167202286008
  24. Zapf, P.; Kukucka, J.; Kassin, S.; Dror, I. E. Psychol. Public Policy Law 2018, 24 (1), 1–10, DOI: 10.1037/law0000153
  25. Thornton, J. I. J. Forensic Sci. 2010, 55 (6), 1663, DOI: 10.1111/j.1556-4029.2010.01497.x
  26. Wegner, D. M. Psychol. Rev. 1994, 101, 34–52, DOI: 10.1037/0033-295X.101.1.34
  27. Steblay, N.; Hosch, H. M.; Culhane, S. E.; McWethy, A. Law Hum. Behav. 2006, 30, 469–492, DOI: 10.1007/s10979-006-9039-7
  28. Zajonc, R. B. Am. Psychol. 1980, 35, 151–175, DOI: 10.1037/0003-066X.35.2.151
  29. Finucane, M. L.; Alhakami, A.; Slovic, P.; Johnson, S. M. J. Behav. Decis. Making 2000, 13, 1–17, DOI: 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S
  30. Damasio, A. R. Descartes’ Error: Emotion, Reason, and the Human Brain; Penguin Books: New York, 2005; pp 1–312.
  31. Jeanguenat, A. M.; Budowle, B.; Dror, I. E. Sci. Justice 2017, 57 (6), 415–420, DOI: 10.1016/j.scijus.2017.07.005
  32. Dror, I. E. Science 2018, 360 (6386), 243, DOI: 10.1126/science.aat8443
  33. Starr, D. Forensics Gone Wrong: When DNA Snares the Innocent. Science 2016, DOI: 10.1126/science.aaf4160
  34. Hanna, J.; Valencia, N. DNA Analysis Clears Georgia Man Who Served 17 Years in Wrongful Rape Conviction. CNN, January 10, 2020.
  35. Dror, I. E.; Hampikian, G. Sci. Justice 2011, 51 (4), 204–208, DOI: 10.1016/j.scijus.2011.08.004
  36. A Review of the FBI’s Handling of the Brandon Mayfield Case; Office of the Inspector General, Oversight & Review Division, U.S. Department of Justice, 2006.
  37. Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Catena 2018, 171, 44–53, DOI: 10.1016/j.catena.2018.06.027
  38. Berlin, L. AJR, Am. J. Roentgenol. 2007, 189, 517–522, DOI: 10.2214/AJR.07.2209
  39. Kriegeskorte, N.; Simmons, W. K.; Bellgowan, P. S. F.; Baker, C. I. Nat. Neurosci. 2009, 12, 535–540, DOI: 10.1038/nn.2303
  40. Vul, E.; Kanwisher, N. In Foundational Issues for Human Brain Mapping; Hanson, S., Bunzl, M., Eds.; MIT Press: Cambridge, MA, 2010; pp 71–92.
  41. Richter, D.; Ekman, M.; de Lange, F. P. J. Neurosci. 2018, 38, 7452–7461, DOI: 10.1523/JNEUROSCI.3421-17.2018
  42. Luck, S. J.; Ford, M. A. Proc. Natl. Acad. Sci. U. S. A. 1998, 95 (3), 825–830, DOI: 10.1073/pnas.95.3.825
  43. Simons, D. J.; Chabris, C. F. Perception 1999, 28, 1059–1074, DOI: 10.1068/p281059
  44. Stein, T.; Peelen, M. V. J. Exp. Psychol. Gen. 2015, 144, 1089–1104, DOI: 10.1037/xge0000109
  45. Kok, P.; Brouwer, G. J.; van Gerven, M. A. J.; de Lange, F. P. J. Neurosci. 2013, 33, 16275–16284, DOI: 10.1523/JNEUROSCI.0742-13.2013
  46. de Lange, F. P.; Heilbron, M.; Kok, P. Trends Cognit. Sci. 2018, 22, 764–779, DOI: 10.1016/j.tics.2018.06.002
  47. Gardner, B. O.; Kelley, S.; Murrie, D. C.; Blaisdell, K. N. Forensic Sci. Int. 2019, 297, 236–242, DOI: 10.1016/j.forsciint.2019.01.048
  48. van den Eeden, C. A. J.; de Poot, C. J.; van Koppen, P. J. J. Forensic Sci. 2019, 64 (1), 120–126, DOI: 10.1111/1556-4029.13817
  49. Morewedge, C. K.; Kahneman, D. Trends Cognit. Sci. 2010, 14, 435–440, DOI: 10.1016/j.tics.2010.07.004
  50. Moorcroft, M.; Davis, J.; Compton, R. G. Talanta 2001, 54, 785–803, DOI: 10.1016/S0039-9140(01)00323-X
  51. Almog, J.; Zitrin, S. In Aspects of Explosive Detection; Marshal, M., Oxley, M., Eds.; Elsevier: Amsterdam, 2009; pp 47–48.
  52. Lissaman, C. Birmingham Pub Bombers Will Probably Never Be Found. BBC News, March 14, 2011.
  53. Lang, S. E. Report of the Motherisk Hair Analysis Independent Review; Ontario Ministry of the Attorney General, Canada, 2015.
  54. Egglin, T. K.; Feinstein, A. R. J. Am. Med. Assoc. 1996, 276 (21), 1752–1755, DOI: 10.1001/jama.1996.03540210060035
  55. Wolfe, J. M.; Horowitz, T. S.; Kenner, N. M. Nature 2005, 435, 439–440, DOI: 10.1038/435439a
  56. Wolfe, J. M.; Horowitz, T. S.; Van Wert, M. J.; Kenner, N. M.; Place, S. S.; Kibbi, N. J. Exp. Psychol. Gen. 2007, 136, 623–638, DOI: 10.1037/0096-3445.136.4.623
  57. Shafir, E. B.; Smith, E. E.; Osherson, D. N. Mem. Cognit. 1990, 18 (3), 229–239, DOI: 10.3758/BF03213877
  58. Simon, D.; Ahn, M.; Stenstrom, D. M.; Read, S. J. Psychol. Public Policy Law 2020, DOI: 10.1037/law0000226
  59. Murrie, D. C.; Boccaccini, M. T.; Guarnera, L. A.; Rufino, K. A. Psychol. Sci. 2013, 24, 1889–1897, DOI: 10.1177/0956797613481812
  60. Strengthening Forensic Science in the United States: A Path Forward; National Academies Press: Washington, DC, 2009.
  61. Whitman, G.; Koppl, R. Law Probab. Risk 2010, 9, 69–90, DOI: 10.1093/lpr/mgp028
  62. Howard, J. Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine; Springer: New York, 2019.
  63. Cosby, K. S.; Croskerry, P. Acad. Emerg. Med. 2004, 11, 1341–1345, DOI: 10.1197/j.aem.2004.07.005
  64. Saposnik, G.; Redelmeier, D.; Ruff, C. C.; Tobler, P. N. BMC Med. Inform. Decis. Mak. 2016, 16, 138, DOI: 10.1186/s12911-016-0377-1
  65. Dror, I. E.; Mnookin, J. Law Probab. Risk 2010, 9 (1), 47–67, DOI: 10.1093/lpr/mgp031
  66. General Requirements for the Competence of Testing and Calibration Laboratories, 3rd ed.; ISO/IEC 17025; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2017.
  67. Dror, I. E.; Pierce, M. L. J. Forensic Sci. 2020, 65 (3), 800–808, DOI: 10.1111/1556-4029.14265
  68. Dror, I. E.; Langenburg, G. J. Forensic Sci. 2019, 64 (1), 10–15, DOI: 10.1111/1556-4029.13854
  69. Gok, K.; Atsan, N. Int. J. Business Soc. Res. 2016, 6 (3), 38–47, DOI: 10.18533/ijbsr.v6i3.936
  70. Neal, T. PLoS One 2016, 11 (4), e0154434, DOI: 10.1371/journal.pone.0154434
  71. Miller, A. K.; Rufino, K. A.; Boccaccini, M. T.; Jackson, R. L.; Murrie, D. C. Assessment 2011, 18 (2), 253–260, DOI: 10.1177/1073191111402460
  72. Griffin, D.; Ross, L. In Advances in Experimental Social Psychology; Zanna, M. P., Ed.; Academic Press: San Diego, CA, 1991; pp 319–359.
  73. Ross, L.; Greene, D.; House, P. J. Exp. Soc. Psychol. 1977, 13, 279–301, DOI: 10.1016/0022-1031(77)90049-X
  74. Oppenheimer, D. M. Trends Cognit. Sci. 2008, 12, 237–241, DOI: 10.1016/j.tics.2008.02.014
  75. Goldstein, D. G.; Gigerenzer, G. Psychol. Rev. 2002, 109, 75–90, DOI: 10.1037/0033-295X.109.1.75
  76. Robertson, C., Kesselheim, A., Eds. Blinding as a Solution to Bias: Strengthening Biomedical Science, Forensic Science, and Law; Academic Press: New York, 2016; pp 1–388.
  77. Maude, J. Diagnosis 2014, 1, 107–109, DOI: 10.1515/dx-2013-0009
  78. Barondess, J. A., Carpenter, C. C., Eds. Differential Diagnosis; Lea & Febiger: Philadelphia, PA, 1994; pp 1–800.

  86. Itiel E. Dror, Dwayne A. Wolf, Garrett Phillips, Si Gao, Yijiong Yang, Stacy A. Drake. Contextual information in medicolegal death investigation decision-making: Manner of death determination for cases of a single gunshot wound. Forensic Science International: Synergy 2022, 5 , 100285. https://doi.org/10.1016/j.fsisyn.2022.100285
  87. Olof Svensson, Peter Andiné, Sara Bromander, Karl Ask, Ann-Sophie Lindqvist Bagge, Malin Hildebrand Karlén. The decision-making process in Swedish forensic psychiatric investigations. International Journal of Law and Psychiatry 2022, 80 , 101709. https://doi.org/10.1016/j.ijlp.2021.101709
  88. Stephanie Hartley, Allysha Powanda Winburn, Itiel E. Dror. Metric forensic anthropology decisions: Reliability and biasability of sectioning‐point‐based sex estimates. Journal of Forensic Sciences 2022, 67 (1) , 68-79. https://doi.org/10.1111/1556-4029.14931
  89. Gerald Young, Jane Goodman-Delahunty. Revisiting Daubert: Judicial Gatekeeping and Expert Ethics in Court. Psychological Injury and Law 2021, 14 (4) , 304-315. https://doi.org/10.1007/s12207-021-09428-8
  90. María Manuela Moreno-Fernández, Fernando Blanco, Helena Matute. The tendency to stop collecting information is linked to illusions of causality. Scientific Reports 2021, 11 (1) https://doi.org/10.1038/s41598-021-82075-w
  91. Mohammed A. Almazrouei, Ruth M. Morgan, Itiel E. Dror. Stress and support in the workplace: The perspective of forensic examiners. Forensic Science International: Mind and Law 2021, 2 , 100059. https://doi.org/10.1016/j.fsiml.2021.100059
  92. Itiel E. Dror, Judy Melinek, Jonathan L. Arden, Jeff Kukucka, Sarah Hawkins, Joye Carter, Daniel S. Atherton. Authors’ Response to Speth et al Commentary on. Journal of Forensic Sciences 2021, 66 (6) , 2580-2581. https://doi.org/10.1111/1556-4029.14845
  93. Itiel E. Dror, Judy Melinek, Jonathan L. Arden, Jeff Kukucka, Sarah Hawkins, Joye Carter, Daniel S. Atherton. Authors’ Response to Gill et al Response. Journal of Forensic Sciences 2021, 66 (6) , 2559-2560. https://doi.org/10.1111/1556-4029.14846
  94. Itiel E. Dror, Judy Melinek, Jonathan L. Arden, Jeff Kukucka, Sarah Hawkins, Joye Carter, Daniel S. Atherton. Authors’ Response to Obenson Commentary on. Journal of Forensic Sciences 2021, 66 (6) , 2585-2586. https://doi.org/10.1111/1556-4029.14847
  95. Itiel E. Dror, Judy Melinek, Jonathan L. Arden, Jeff Kukucka, Sarah Hawkins, Joye Carter, Daniel S. Atherton. Authors’ Response to Graber Commentary on. Journal of Forensic Sciences 2021, 66 (6) , 2575-2576. https://doi.org/10.1111/1556-4029.14848
  96. Itiel E. Dror, Judy Melinek, Jonathan L. Arden, Jeff Kukucka, Sarah Hawkins, Joye Carter, Daniel S. Atherton. Authors’ Response to Gill et al Commentary on. Journal of Forensic Sciences 2021, 66 (6) , 2555-2556. https://doi.org/10.1111/1556-4029.14850
  97. Douglas H. Ubelaker. Research Integrity in Forensic Anthropology. Forensic Sciences Research 2021, 6 (4) , 285-291. https://doi.org/10.1080/20961790.2021.1963515
  98. Paul Andrewes, Shannon Bullock, Robyn Turnbull, Tim Coolbear. Chemical instrumental analysis versus human evaluation to measure sensory properties of dairy products: What is fit for purpose?. International Dairy Journal 2021, 121 , 105098. https://doi.org/10.1016/j.idairyj.2021.105098
  99. Sher-Lin Chiam, Itiel E. Dror, Christian D. Huber, Denice Higgins. The biasing impact of irrelevant contextual information on forensic odontology radiograph matching decisions. Forensic Science International 2021, 327 , 110997. https://doi.org/10.1016/j.forsciint.2021.110997
  100. , , Roberto Puch-Solis, Susan Pope. Interpretation of DNA data within the context of UK forensic science — evaluation. Emerging Topics in Life Sciences 2021, 5 (3) , 405-413. https://doi.org/10.1042/ETLS20200340
Load all citations
  • Abstract

    Figure 1. Eight sources of bias that may cognitively contaminate sampling, observations, testing strategies, analysis, and conclusions, even by experts. They are organized in a taxonomy of three categories: at the top, sources relating to the specific case and analysis (Category A); below them, sources relating to the specific person doing the analysis (Category B); and at the bottom, sources relating to human nature (Category C).
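The taxonomy described in the Figure 1 caption can be captured in a simple data structure. The following is an illustrative sketch (not from the paper itself), using only the category and source names given in the abstract:

```python
# Illustrative encoding of the Figure 1 taxonomy (not supplied by the paper):
# three categories containing eight sources of bias in total, named as in the
# abstract.
BIAS_SOURCES = {
    "A: case and analysis": [
        "the data",
        "reference materials",
        "contextual information",
    ],
    "B: person doing the analysis": [
        "past experience and base rates",
        "organizational factors",
        "education and training",
        "personal factors",
    ],
    "C: human nature": [
        "cognitive architecture and human nature",
    ],
}

# Sanity check: three categories, eight sources in total.
assert len(BIAS_SOURCES) == 3
assert sum(len(sources) for sources in BIAS_SOURCES.values()) == 8
```

Such a structure makes the paper's grouping explicit: Categories A and B vary case by case and examiner by examiner, whereas Category C applies to everyone.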

  • References

    This article references 78 other publications.

    1. McCord, B. R.; Gauthier, Q.; Cho, S.; Roig, M. N.; Gibson-Daw, G. C.; Young, B.; Taglia, F.; Zapico, S. C.; Mariot, R. F.; Lee, S. B.; Duncan, G. Anal. Chem. 2019, 91, 673–688, DOI: 10.1021/acs.analchem.8b05318
    2. Barrio, P. A.; Crespillo, M.; Luque, J. A.; Aler, M.; Baeza-Richer, C.; Baldassarri, L.; Carnevali, E.; Coufalova, P.; Flores, I.; Garcia, O.; Garcia, M. A.; Gonzalez, R.; Hernandez, A.; Ingles, V.; Luque, G. M.; Mosquera-Miguel, A.; Pedrosa, S.; Pontes, M. L.; Porto, M. J.; Posada, Y.; Ramella, M. I.; Ribeiro, T.; Riego, E.; Sala, A.; Saragoni, V. G.; Serrano, A.; Vannelli, S. Forensic Sci. Int.: Genet. 2018, 35, 156–163, DOI: 10.1016/j.fsigen.2018.05.005
    3. Butler, J. M.; Kline, M. C.; Coble, M. D. Forensic Sci. Int.: Genet. 2018, 37, 81–94, DOI: 10.1016/j.fsigen.2018.07.024
    4. Bright, J.-A.; Cheng, K.; Kerr, Z.; McGovern, C.; Kelly, H.; Moretti, T. R.; Smith, M. A.; Bieber, F. R.; Budowle, B.; Coble, M. D.; Alghafri, R.; Allen, P. S.; Barber, A.; Beamer, V.; Buettner, C.; Russell, M.; Gehrig, C.; Hicks, T.; Charak, J.; Cheong-Wing, K.; Ciecko, A.; Davis, C. T.; Donley, M.; Pedersen, N.; Gartside, B.; Granger, D.; Greer-Ritzheimer, M.; Reisinger, E.; Kennedy, J.; Grammer, E.; Kaplan, M.; Hansen, D.; Larsen, H. J.; Laureano, A.; Li, C.; Lien, E.; Lindberg, E.; Kelly, C.; Mallinder, B.; Malsom, S.; Yacovone-Margetts, A.; McWhorter, A.; Prajapati, S. M.; Powell, T.; Shutler, G.; Stevenson, K.; Stonehouse, A. R.; Smith, L.; Murakami, J.; Halsing, E.; Wright, D.; Clark, L.; Taylor, D. A.; Buckleton, J. Forensic Sci. Int.: Genet. 2019, 40, 1–8, DOI: 10.1016/j.fsigen.2019.01.006
    5. Cooper, G. S.; Meterko, V. Forensic Sci. Int. 2019, 297, 35–46, DOI: 10.1016/j.forsciint.2019.01.016
    6. Dror, I. E. J. Appl. Res. Mem. Cog. 2016, 5 (2), 121–127, DOI: 10.1016/j.jarmac.2016.03.001
    7. Dror, I. E.; Murrie, D. Psych. Pub. Policy Law 2018, 24 (1), 11–23, DOI: 10.1037/law0000140
    8. Segall, M.; Chadwick, A. Future Med. Chem. 2011, 3 (7), 771–774, DOI: 10.4155/fmc.11.33
    9. Hamnett, H.; Jack, R. Sci. Justice 2019, 59 (4), 380–389, DOI: 10.1016/j.scijus.2019.02.004
    10. Forensic Science Regulator. Contextual Bias in Forensic Toxicology, 2019. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/800561/Lessons_Learnt_May19__L-B03_-final.pdf (accessed June 2020).
    11. Downs, U., Swienton, A. R., Eds. Ethics in Forensic Science; Academic Press: Waltham, MA, 2012; pp 1–441.
    12. Gannett, C. A130 Ethics in Forensic Science, 2020. https://cci.gosignmeup.com/public/Course/browse?courseid=2552 (accessed June 2020).
    13. Nickerson, R. S. Rev. Gen. Psych. 1998, 2, 175–220, DOI: 10.1037/1089-2680.2.2.175
    14. Bidgood, J. Chemist's Misconduct Is Likely to Void 20,000 Massachusetts Drug Cases. New York Times, April 18, 2017.
    15. Mettler, K. How a Lab Chemist Went from 'Superwoman' to Disgraced Saboteur of More than 20,000 Drug Cases. Washington Post, April 21, 2017.
    16. Thompson, W. C. Southwestern Univ. Law Rev. 2009, 37, 971–994.
    17. Kukucka, J.; Kassin, S.; Zapf, P.; Dror, I. E. J. Appl. Res. Mem. Cog. 2017, 6 (4), 452–459, DOI: 10.1016/j.jarmac.2017.09.001
    18. Dror, I. E.; Kukucka, J.; Kassin, S.; Zapf, P. J. Appl. Res. Mem. Cog. 2018, 7 (2), 316–317, DOI: 10.1016/j.jarmac.2018.03.005
    19. Dror, I. E. In The Paradoxical Brain; Kapur, N., Ed.; Cambridge University Press: Cambridge, UK, 2011; pp 177–188.
    20. Shanteau, J. In Advances in Design Research; Rohrmann, B., Beach, L. R., Vlek, C., Watson, S. R., Eds.; Elsevier: Amsterdam, 1989; pp 203–215.
    21. Soller, J. M.; Ausband, D. E.; Gunther, S. M. PLoS One 2020, 15 (3), e0229762, DOI: 10.1371/journal.pone.0229762
    22. Dror, I. E.; Wertheim, K.; Fraser-Mackenzie, P.; Walajtys, J. J. Forensic Sci. 2012, 57 (2), 343–352, DOI: 10.1111/j.1556-4029.2011.02013.x
    23. Pronin, E.; Lin, D. Y.; Ross, L. Person. Soc. Psych. Bull. 2002, 28, 369–381, DOI: 10.1177/0146167202286008
    24. Zapf, P.; Kukucka, J.; Kassin, S.; Dror, I. E. Psych. Pub. Policy Law 2018, 24 (1), 1–10, DOI: 10.1037/law0000153
    25. Thornton, J. I. J. Forensic Sci. 2010, 55 (6), 1663, DOI: 10.1111/j.1556-4029.2010.01497.x
    26. Wegner, D. M. Psych. Rev. 1994, 101, 34–52, DOI: 10.1037/0033-295X.101.1.34
    27. Steblay, N.; Hosch, H. M.; Culhane, S. E.; McWethy, A. Law Hum. Beh. 2006, 30, 469–492, DOI: 10.1007/s10979-006-9039-7
    28. Zajonc, R. B. Am. Psychol. 1980, 35, 151–175, DOI: 10.1037/0003-066X.35.2.151
    29. Finucane, M. L.; Alhakami, A.; Slovic, P.; Johnson, S. M. J. Behav. Decis. Making 2000, 13, 1–17, DOI: 10.1002/(SICI)1099-0771(200001/03)13:1<1::AID-BDM333>3.0.CO;2-S
    30. Damasio, A. R. Descartes' Error: Emotion, Reason, and the Human Brain; Penguin Books: New York, 2005; pp 1–312.
    31. Jeanguenat, A. M.; Budowle, B.; Dror, I. E. Sci. Justice 2017, 57 (6), 415–420, DOI: 10.1016/j.scijus.2017.07.005
    32. Dror, I. E. Science 2018, 360 (6386), 243, DOI: 10.1126/science.aat8443
    33. Starr, D. Forensics Gone Wrong: When DNA Snares the Innocent. Science, 2016, DOI: 10.1126/science.aaf4160
    34. Hanna, J.; Valencia, N. DNA Analysis Clears Georgia Man Who Served 17 Years in Wrongful Rape Conviction. CNN, January 10, 2020.
    35. Dror, I. E.; Hampikian, G. Sci. Justice 2011, 51 (4), 204–208, DOI: 10.1016/j.scijus.2011.08.004
    36. A Review of the FBI's Handling of the Brandon Mayfield Case; Office of the Inspector General, Oversight & Review Division, U.S. Department of Justice, 2006.
    37. Marqués-Mateu, Á.; Moreno-Ramón, H.; Balasch, S.; Ibáñez-Asensio, S. Catena 2018, 171, 44–53, DOI: 10.1016/j.catena.2018.06.027
    38. Berlin, L. AJR, Am. J. Roentgenol. 2007, 189, 517–522, DOI: 10.2214/AJR.07.2209
    39. Kriegeskorte, N.; Simmons, W. K.; Bellgowan, P. S. F.; Baker, C. I. Nat. Neurosci. 2009, 12, 535–540, DOI: 10.1038/nn.2303
    40. Vul, E.; Kanwisher, N. In Foundational Issues for Human Brain Mapping; Hanson, S., Bunzl, M., Eds.; MIT Press: Cambridge, MA, 2010; pp 71–92.
    41. Richter, D.; Ekman, M.; de Lange, F. P. J. Neurosci. 2018, 38, 7452–7461, DOI: 10.1523/JNEUROSCI.3421-17.2018
    42. Luck, S. J.; Ford, M. A. Proc. Natl. Acad. Sci. U. S. A. 1998, 95 (3), 825–830, DOI: 10.1073/pnas.95.3.825
    43. Simons, D. J.; Chabris, C. F. Perception 1999, 28, 1059–1074, DOI: 10.1068/p281059
    44. Stein, T.; Peelen, M. V. J. Exp. Psychol. Gen. 2015, 144, 1089–1104, DOI: 10.1037/xge0000109
    45. Kok, P.; Brouwer, G. J.; van Gerven, M. A. J.; de Lange, F. P. J. Neurosci. 2013, 33, 16275–16284, DOI: 10.1523/JNEUROSCI.0742-13.2013
    46. de Lange, F. P.; Heilbron, M.; Kok, P. Trends Cognit. Sci. 2018, 22, 764–779, DOI: 10.1016/j.tics.2018.06.002
    47. Gardner, B. O.; Kelley, S.; Murrie, D. C.; Blaisdell, K. N. Forensic Sci. Int. 2019, 297, 236–242, DOI: 10.1016/j.forsciint.2019.01.048
    48. Eeden, C. A. J.; de Poot, C. J.; van Koppen, P. J. J. Forensic Sci. 2019, 64 (1), 120–126, DOI: 10.1111/1556-4029.13817
    49. Morewedge, C. K.; Kahneman, D. Trends Cognit. Sci. 2010, 14, 435–440, DOI: 10.1016/j.tics.2010.07.004
    50. Moorcroft, M.; Davis, J.; Compton, R. G. Talanta 2001, 54, 785–803, DOI: 10.1016/S0039-9140(01)00323-X
    51. Almog, J.; Zitrin, S. In Aspects of Explosive Detection; Marshal, M., Oxley, M., Eds.; Elsevier: Amsterdam, 2009; pp 47–48.
    52. Lissaman, C. Birmingham Pub Bombers Will Probably Never Be Found. BBC News, March 14, 2011.
    53. Lang, S. E. Report of the Motherisk Hair Analysis Independent Review; Ontario Ministry of the Attorney General, Canada, 2015.
    54. Egglin, T. K.; Feinstein, A. R. J. Am. Med. Assoc. 1996, 276 (21), 1752–1755, DOI: 10.1001/jama.1996.03540210060035
    55. Wolfe, J. M.; Horowitz, T. S.; Kenner, N. M. Nature 2005, 435, 439–440, DOI: 10.1038/435439a
    56. Wolfe, J. M.; Horowitz, T. S.; Van Wert, M. J.; Kenner, N. M.; Place, S. S.; Kibbi, N. J. Exp. Psychol. Gen. 2007, 136, 623–638, DOI: 10.1037/0096-3445.136.4.623
    57. Shafir, E. B.; Smith, E. E.; Osherson, D. N. Mem. Cognit. 1990, 18 (3), 229–239, DOI: 10.3758/BF03213877
    58. Simon, D.; Ahn, M.; Stenstrom, D. M.; Read, S. J. Psych. Public Policy Law 2020, DOI: 10.1037/law0000226
    59. Murrie, D. C.; Boccaccini, M. T.; Guarnera, L. A.; Rufino, K. A. Psych. Sci. 2013, 24, 1889–1897, DOI: 10.1177/0956797613481812
    60. Strengthening Forensic Science in the United States: A Path Forward; National Academies Press: Washington, DC, 2009.
    61. Whitman, G.; Koppl, R. Law Prob. Risk 2010, 9, 69–90, DOI: 10.1093/lpr/mgp028
    62. Howard, J. Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine; Springer: New York, 2019.
    63. Cosby, K. S.; Croskerry, P. Acad. Emergency Med. 2004, 11, 1341–1345, DOI: 10.1197/j.aem.2004.07.005
    64. Saposnik, G.; Redelmeier, D.; Ruff, C. C.; Tobler, P. N. BMC Med. Inf. Decis. Making 2016, 16, 138, DOI: 10.1186/s12911-016-0377-1
    65. Dror, I. E.; Mnookin, J. Law Prob. Risk 2010, 9 (1), 47–67, DOI: 10.1093/lpr/mgp031
    66. General Requirements for the Competence of Testing and Calibration Laboratories, 3rd ed.; ISO/IEC 17025; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2017.
    67. Dror, I. E.; Pierce, M. L. J. Forensic Sci. 2020, 65 (3), 800–808, DOI: 10.1111/1556-4029.14265
    68. Dror, I. E.; Langenburg, G. J. Forensic Sci. 2019, 64 (1), 10–15, DOI: 10.1111/1556-4029.13854
    69. Gok, K.; Atsan, N. Intern. J. Business Soc. Res. 2016, 6 (3), 38–47, DOI: 10.18533/ijbsr.v6i3.936
    70. Neal, T. PLoS One 2016, 11 (4), e0154434, DOI: 10.1371/journal.pone.0154434
    71. Miller, A. K.; Rufino, K. A.; Boccaccini, M. T.; Jackson, R. L.; Murrie, D. C. Assessment 2011, 18 (2), 253–260, DOI: 10.1177/1073191111402460
    72. Griffin, D.; Ross, L. In Advances in Experimental Social Psychology; Zanna, M. P., Ed.; Academic Press: San Diego, CA, 1991; pp 319–359.
    73. Ross, L.; Greene, D.; House, P. J. Exp. Soc. Psych. 1977, 13, 279–301, DOI: 10.1016/0022-1031(77)90049-X
    74. Oppenheimer, D. M. Trends Cognit. Sci. 2008, 12, 237–241, DOI: 10.1016/j.tics.2008.02.014
    75. Goldstein, D. G.; Gigerenzer, G. Psychol. Rev. 2002, 109, 75–90, DOI: 10.1037/0033-295X.109.1.75
    76. Robertson, C., Kesselheim, A., Eds. Blinding as a Solution to Bias: Strengthening Biomedical Science, Forensic Science, and Law; Academic Press: New York, 2016; pp 1–388.
    77. Maude, J. Diagnosis 2014, 1, 107–109, DOI: 10.1515/dx-2013-0009
    78. Barondess, J. A., Carpenter, C. C., Eds. Differential Diagnosis; Lea & Febiger: Philadelphia, PA, 1994; pp 1–800.
