21 Mar

Confirmation Bias, Ethics, and Mistakes in Forensics

“The eyes are not responsible when the mind does the seeing.”

– Publilius Syrus 1


Confirmation bias occurs when people notice more of, give extra emphasis to, or intentionally look for evidence that would validate their existing beliefs and expectations, while excusing or completely ignoring evidence that could refute those beliefs. As such, it can be seen as a type of bias in gathering and analyzing evidence. 1 Although some might disagree, this type of bias does not spare scientists who pride themselves on their objectivity. 2

Scientists and researchers have recognized for centuries that bias influences human thought and behavior. In 1620, the philosopher Francis Bacon observed that once people adopt an opinion, they will look for anything to support and agree with that opinion. Bacon also noted that it is a “peculiar human tendency” to be more moved by positives than by negatives. 3 In 1852, the journalist Charles Mackay wrote, “When men wish to construct or support a theory, how they torture facts into their service!” 3 In spite of all this earlier work on bias, however, a study completed in 1960 by psychologist Peter C. Wason is considered by most to mark the beginning of the modern work on confirmation bias.

“Confirmation bias is perhaps the best known and most widely accepted notion of inferential error to come out of the literature on human reasoning.” 4

To begin this study of biases, ethics, and mistakes, there are a few questions that need to be asked.

1. Should irrelevant information or opinions about a case be shared with the analyst prior to examining the evidence? Does confirmation bias limit itself only to verifications?

2. In reviewing other analysts’ work, is there an analyst who is correct 99.9% of the time? Is there an analyst with consistent errors? Is their work reviewed the same way?

3. Would there be more time spent on a verification from another agency, a coworker, or a supervisor? Are the same criteria for verifying used for everyone?

4. When a print is identified, is there ever a question of identity prior to turning the case in for verification? What if, after matching the latent print to the suspect, it was found that the DNA in the case did not match the suspect? Could that information make an analyst question his or her results? Could that information make an analyst change the results?

5. What if another examiner with more experience asked for a tough latent print to be verified? Surely, a more experienced examiner would not make a mistake. If there were questions about the print or the identification, would they be asked? What if an acceptable answer were not given? Would the identification be verified anyway? What if pressure were applied to verify?

6. What if an identification is found in a serial rapist case and the police need the match as probable cause to arrest? Suppose the identification that needs to be verified was made at 5 pm on a Friday afternoon, and the verifying analyst has unchangeable 5:30 pm plans that he or she has been waiting on all year. Does he or she feel pressure? Is the analyst who is asking for verification always right? If the latent is tough, is the time taken for verification, or is the analyst’s word taken for it?

These questions should have helped get the reader on track to see how easily bias, carelessness, and ethical lapses can enter the decision-making process. Bias is everywhere; it is the very fabric in which most people clothe themselves daily. It is in politics, science, medicine, media, research, and almost everything that requires thought. Once this bias is recognized, awareness of it makes us start to second-guess ourselves, not just in forensics but also in everyday life.

Everyone is biased to some extent, some of us more than others. There are several ways that bias can enter our lives. For example, we read an article and form our opinion on the topic of that article. How do we know the author of the article is telling the truth? It should be evident that we cannot research every subject and know the truth on everything in our lives.

Perhaps Rutherford D. Rogers said it best: “We’re drowning in information and starving for knowledge.”

For this reason, we want to believe what people tell us. We almost find it necessary to accept a person’s word, because we are overrun with information on a daily basis. However, in order to take a person’s word, we have to trust that person to begin with. All of us must trust to a certain extent because we know we cannot do it all ourselves. We have to focus on the importance of knowing the truth and search for it ourselves.

Becoming emotionally involved in a case can also allow bias to enter the analysis of the scene or evidence. The more our emotions are involved with a belief, the easier it is for us to disregard details and opinions that may tend to challenge that belief. 5 An example would be a crime scene where one becomes emotionally involved because of the information obtained from the investigating officer and then uses that information to determine how the crime was committed, what needs to be collected, and what needs to be processed.

Knowing what confirmation bias is and how it can affect one’s objectivity should prompt a rethinking of how day-to-day activities are conducted. Laziness is another way bias can affect our opinions. Laziness enters forensics when we allow other people to do our thinking for us, resulting in a loss of objectivity and of the desire to learn, thereby steering our preference toward supporting rather than refuting. We should not support the identification of a print without challenging the validity of the opinion; an opinion on a verification should never be formed by taking the word of the original examiner.

As Arthur Bloch stated, “Don’t let your conclusion be the place where you got tired of thinking.”

The last way that bias can affect our opinions is by grouping with people who think like us or hold the same beliefs. People tend to group with others who share the same beliefs, because associating with people who do not would require thinking of a way to defend those beliefs. For this reason, it is better to spend more time with people who challenge our beliefs or opinions, because this tends to keep a person thinking, processing information instead of merely accepting it. This is an important part of learning to overcome bias: when a person seeks out all possibilities, it is easier for that person to give an unbiased opinion.

Why does confirmation bias occur?

Confirmation bias occurs when we lose our ability to be objective. The reason that confirmation bias is so common is that, mentally, it is easier to deal with. 6 Studies of social judgment show that when people favor a certain belief, they tend to seek out evidence and interpret information in ways that follow their beliefs, giving positive evidence more weight than it deserves. 3 On the other hand, they do not look for, or even reject, information that would disprove their beliefs, giving less weight to negative evidence. This does not mean that we completely ignore negative information, but it does mean that we give it less weight than positive information. This is usually accomplished by leaving out, altering, or diluting any of the negative observations. 3 This is exactly what happened in the case of Robert Millikan, who won the Nobel Prize in Physics for his research on finding “the electric charge of a single electron.” Millikan reported only a little more than half (58) of his (107) observations, excluding from the publication the observations that did not fit his hypothesis. 7
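The distorting effect of that kind of selective reporting can be illustrated with a small simulation. The numbers below are synthetic, not Millikan’s actual measurements; the “true” value, the noise level, and the hypothesized value are invented purely for illustration. The sketch keeps only the 58 of 107 observations that sit closest to the expected value, mimicking the exclusion of runs that “did not fit.”

```python
import random
import statistics

random.seed(42)

# Hypothetical measurements of a quantity whose true value is 1.60,
# with normally distributed noise (illustrative only; not Millikan's data).
observations = [random.gauss(1.60, 0.05) for _ in range(107)]

hypothesis = 1.63  # the value the researcher expects to find (invented)

# Unbiased estimate: use every observation.
all_mean = statistics.mean(observations)

# Biased estimate: keep only the 58 observations closest to the hypothesis,
# mimicking the exclusion of "bad" runs that don't fit.
kept = sorted(observations, key=lambda x: abs(x - hypothesis))[:58]
kept_mean = statistics.mean(kept)

print(f"mean of all 107 observations:  {all_mean:.4f}")
print(f"mean of 58 'kept' observations: {kept_mean:.4f}")
print(f"scatter (stdev) all: {statistics.stdev(observations):.4f}, "
      f"kept: {statistics.stdev(kept):.4f}")
```

The selectively trimmed sample not only lands closer to the hypothesized value than the full data set does, it also shows less scatter, so the biased result looks *more* precise than the honest one.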

If we can only see one possible explanation of an event, then we tend not to interpret data as supportive of any other alternate explanation. There are others who tend to be so strongly committed to their position that they even disregard interpretations or explanations of others. 3 However, it is important to note that if our conclusions are based on solid evidence and objective experiments, then our tendencies to overweigh evidence based on our personal beliefs should not affect us as a general rule. 6 But, if we start overlooking evidence that refutes our conclusions, then we lose our objectivity and cross over to subjectivity, based on our preconceived beliefs.

Most people would admit that they do not like to be wrong. It is part of our human nature to argue in favor of our beliefs, even when confronted with contradictory evidence. Evidence that confirms our theories is typically easier to deal with cognitively, which is why we prefer supporting evidence over evidence that may refute our claims. It is easier to think of a reason to support our claims than to think of a reason that might contradict them, mainly because it is difficult to think of a reason why we might be wrong. 5 Bias can penetrate our objective thoughts and challenge or even change our conclusions. 5

The effects of confirmation bias

The main difference between confirmation bias and other biases is that confirmation bias consistently keeps us floundering in deceit by preventing us from seeing the truth. 5 We might ask ourselves, what is the truth? Is the truth what we see or what we believe we saw, what we hear or what we thought we heard, what we read or how we interpret what we read? The truth is what it is, but it has to be sought out.

Those who take things at face value, without checking on the validity, are setting themselves up for disaster. When one objectively assesses evidence that leads to an unprejudiced conclusion, as opposed to constructing a case to rationalize a previously drawn assumption, an obvious difference can be seen. In the first instance, one takes a holistic view of the evidence and arrives at a conclusion that is based on an objective evaluation. In the second, one is selective with the evidence that is gathered and discards other evidence that seems to disagree with the supported position. 3 This is not to suggest that someone would intentionally mistreat evidence; one may interpret or select evidence along with one’s beliefs without necessarily being aware of a bias. 3 This would be consistent with an investigator’s not collecting evidence at the crime scene because it does not fit his/her theory on how the crime happened. However, some research has discounted the likelihood of people intentionally seeking to prove rather than falsify their hypotheses. 4 There is evidence to support that confirmation bias does not arise from a longing to confirm, but rather from people not thinking in openly negative terms. The basis of this phenomenon has been argued as cognitive breakdown and not motivation: “Subjects confirm, not because they want to, but because they cannot think of the way to falsify. The cognitive failure is caused by a form of selective processing which is very fundamental indeed in cognition – a bias to think about positive rather than negative information.” 4

Types of Errors

To understand confirmation bias, we must identify the three basic types of errors that are found in forensic science:

1. Ethics violation

A. Fabricated prints

B. Dry benching (estimating results without completing an examination)

C. Intentional erroneous results

D. Covering up mistakes

2. Honest errors

A. Lack of training and mentoring

B. Feeling pressure to complete work or being overwhelmed with work

C. Administrative errors or complacency in one’s work

3. Biased oversight

It should be noted that bias and honest miscalculations can sometimes overlap. An example would be an interpretation of a distorted latent print that is wrong both because of a lack of training and because the officer informed the examiner that the victim had already visually identified the suspect as the perpetrator. Bias can also be interconnected with ethical violations; however, honest mistakes and ethical violations should never overlap.

Types of Bias in the Workplace

There is a great deal of empirical evidence to support the fact that confirmation bias exists and that it appears in many forms. 3 Understanding how bias can enter the thought process is vital, so before we can talk about how to avoid bias, we must first understand the levels at which it can occur.

1. Observer or Expectancy Effect : This is when a person anticipates the outcome from an initial observation or has preconceived expectations about the outcome of an analysis. This type of bias allows the anticipation to drive the analysis toward the desired result. An example would be expecting an identification because the officer stated that the suspect was caught on tape and the officer just needs to have the print identified. This can also be seen when working crime scenes: if the initial contact at the scene tells the investigating officer that it is a suicide, then the investigating officer might anticipate finding a suicide. However, we cannot allow our expectations to form our conclusions. Previous experimental studies on hair analysis found that error rates were much larger when the examiners were given suggestive information about the scene and the suspect’s presence. Similar errors should be expected from fingerprint examiners given suggestive information. 8

2. Selective Attention : This is the ability to give attention to items of interest and disregard contradictory information. The human cognitive system allows flexibility in where we focus our attention. In sports, it is like “being in the zone”, which means that we are so in tune to what we are doing that everything else in the peripheral is nonexistent. Prior expectations, along with selective attention, can push us into that “zone” by priming our mind to look for information that will confirm our preexisting beliefs and disregard all other information. This cognitive bias is stronger than one might believe. It can overpower the correct conclusion in an ambiguous case; however, it will not affect the conclusion in an unambiguous case. An example of selective attention in a latent print case would be using the matching details but filtering out the questionable areas in a close call. A crime scene example would be where the investigating officer collects evidence that supports his or her theory and disregards the evidence that does not.

3. Role Effect : This is the reason two people can enter a crime scene and see two different things. The selection of data by those two people and what they will remember will vary because of their roles at the scene. An example of this would be a dentist and barber entering a room full of people. The dentist would focus on or notice everyone’s teeth; the barber would focus on or notice everyone’s hair. Whatever role we play is where our primary focus will be.

4. Conformity effect : This is the act of conforming to the opinions, beliefs and behavior of one’s peers. For example, if several of your peers say it is so, then it must be so.

5. Need-Determined Perception : This is caused by having a strong motivation to reach a desired result. This motivation could come from wanting to help solve crimes. It is part of our human nature to try to help when we can; however, making things fit is not helping. We must remain vigilant in trying to win the battle of good against evil, but we have to be careful not to sink to the level of that which we detest. An example would be wanting to help solve a high-profile case so badly that we look at the known print to find out what we need to be looking for in the latent print. Looking at the known print first can clutter our minds with information that may help us to see the preferred outcome.

6. Positivity bias : When there is a lack of compelling evidence, people are more prone to believe that a statement or conclusion is true rather than to believe that it is false. We tend to affirm the things that we want to be true. This is a prime example of why analysts need to be careful with verifications.

7. Primacy effect : When a person gathers information over a period of time and then comes to a conclusion, it is the information gathered earlier in the process (rather than later) that is likely to carry more weight in the decision-making. There is a tendency to form opinions early and then evaluate any subsequent information against the previously formed opinion. An example would be reading or listening to any information provided by the investigating officer about the suspect and/or scene prior to working the case. The information that is received in the beginning can drive the outcome or conclusion. Once the mind is primed, the eyes will fill in the details to see whatever it is that we want to see.

8. Overconfidence : No matter how much contradictory evidence is presented, an overconfident person will always argue that he or she is right. The more intelligent a person is, the easier it is to defend biases and beliefs. 9

The Desire to Believe

The previously mentioned types of bias are real and can be found all around us. These biases are, however, bounded by reason. Forensically, the desire to believe may come from items of evidence being presented in a suggestive fashion by an investigator or other analysts. 8 It is these suggestive comments that can bias an examination or analysis and ultimately taint the conclusions. But there is a limit to how much information can influence our decisions. As individuals, we are free to believe whatever we want; however, the significance we attach to our beliefs can determine how much suggestion we allow. Additionally, “what we believe must appear to us believable.” 3 We are not going to believe something that is not believable. The evidence being interpreted would have to be very similar or provide a reason to believe that our opinions or conclusions are correct. We can pick and choose the evidence that we collect, and we can even add weight to what we find, but in spite of this, we cannot totally ignore opposing evidence of which we are aware. 3 “When the questioned and known impressions are very similar, it is often easy to be convinced that the questioned impression could have been, or was in fact, made by the known tire.” 10

Avoiding confirmation bias and other errors

A complete knowledge of any science should include knowledge of how mistakes can occur and how those mistakes can be revealed and dealt with. 11 The key to avoiding bias and errors is awareness. The following eight steps will help guard against possible errors.

1. Seek training and participate in proficiency testing : In any area of expertise, there is a degree of variation in the training, experience, and knowledge of each expert. 12 This includes forensics. Every latent print has to be evaluated on the quality and quantity of visual information to determine whether it is comparable. Thus, the training and experience of the examiner become the vital element in the identification process. 13 Methods in scientific reasoning must be taught if we are to avoid bias. 14 Analysts in training need to be taught how to handle certain data; scientific models provide the structure. 11 The bottom line is that our analyses and conclusions should be reproducible, verifiable, and open to all who want to scrutinize them. Wertheim summarized Judge Pollak’s second ruling as, “The reliability of the examiner through training, experience, and testing is the key to the reliability of the evidence they present in court.” 15

2. Accept bias : Biases are part of our psychological makeup. Understanding this is paramount to having an opportunity to correct them. Past research has shown that biases are harder for older, more qualified analysts to overcome because “…their commitment to their theories grow strong…” 16 When we recognize that we can subconsciously add weight to evidence to make it fit or explain away differences, we will have a better chance of changing our way of thinking. 5 The greatest regulators of bias are an individual’s integrity and objectivity. 23 Each analyst must determine what his or her biases could be and then devise a way to correct them.

3. Limit daily pressures : The greater the pressure, the greater the chance for mistakes. An examiner should never allow someone to watch over his or her shoulder as a comparison is conducted; demands for instant results should never be indulged. Pressure usually arises during the verification process, especially when results are needed quickly in a high-profile case. However, pressure can push toward negative results as well (i.e., under pressure, it could be easier to accept a negative conclusion).

4. Remain objective : Objectivity is paramount in the comparison process. The analysis is objective, the comparison takes into account objective observations, and the evaluation is based on the subjective information that has been gathered during the comparison process. 17 Any information that might bias the comparison process should be removed. “It takes a conscious effort by the analyst to remain impartial…” 18

5. Seek to disprove : We only stand to gain in our scientific understanding by consistently trying to prove our opinions false. 19 All possibilities need to be sought out, not just the ones that support our beliefs. Search for the negative and seek to falsify, and if the information turns out to be reliable, then the opinion can be based on sound reasoning. 3 “This search for inconsistencies is so important that it cannot be overstated. Only by adopting this approach during the comparison process can the examiner exercise a truly unbiased attitude.” 10

6. Limit outside influence : Even though confidence improves with given information, accuracy does not improve with confidence. When evidence is given to forensic scientists, any information that does not pertain directly to the evidence could introduce bias and pose a danger to the objectivity and accuracy of the scientists’ findings. It should be noted, however, that having irrelevant information does not constitute bias; using that information to support or justify a conclusion does. “The examiner must depend wholly upon what is seen, leaving out of consideration all suggestions or hints from interested parties…Where the expert has no knowledge of the moral evidence or aspects of the case…there is nothing to mislead him…” 20

7. Use scientific protocols (ACE-V) : Counting points can lead an examiner to overlook more important factors in the comparison process that ensure reliability. 17 By using the ACE-V methodology, the comparison process includes a holistic evaluation of all the available ridge detail.

8. Limit overconfidence : The statement “it could never happen to me” is troubling, because biases are real and everybody has them. While some harbor a hidden fear that seeking consultation at any stage of the analysis, comparison, or evaluation is an admission of weakness, the reverse is actually the truth. No one person knows everything or has so much knowledge and experience that he or she is never in a position of requiring some form of consultation. Those holding that opinion demonstrate the very attitude that prevented this science from reaching its potential years ago. Experts must be free to discuss issues with their peers, confirm the presence of specific shapes or features that they see, and then form their own opinions. The comparison is objective; others must be capable of seeing the physical attributes that one sees. If one feels his or her objectivity has been compromised by the consultation, one should ask a third party to carry out the verification. 21

Cole states that, since 1981, five members of the IAI have had their certifications revoked because of erroneous identifications. 22 These figures could be twisted to show that, on average, one examiner every five years loses certification because of an erroneous identification. However, these figures actually stem from only two erroneous identifications: three analysts in one case and two in another. The problem with these numbers is that they include only the erroneous identifications and verifications of certified examiners, and the certified examiners are supposed to be the competent ones. Believing that one has obtained certification and is now beyond mistakes would be a grave mistake. Certification does not make the examiner infallible; it tests competency and excellence in the field of latent prints.
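The “one examiner every five years” reading is simple division, and the rough arithmetic below (assuming an endpoint of about 2006, the date of this article) shows how the same five revocations support a very different per-case figure once the clustering is taken into account:

```python
# Rough arithmetic behind the "one every five years" reading of Cole's
# figure: 5 revocations since 1981, counted through roughly 2006 (assumed).
years = 2006 - 1981            # ~25-year window
revocations = 5
years_per_revocation = years / revocations
print(years_per_revocation)    # -> 5.0

# But those 5 revocations came from only 2 erroneous identifications
# (3 analysts in one case, 2 in another), so the per-case rate differs:
erroneous_cases = 2
years_per_case = years / erroneous_cases
print(years_per_case)          # -> 12.5
```

The same raw count yields one revocation per five years or one erroneous identification per twelve and a half years, depending on what is counted, which is exactly the sense in which such figures can be “twisted.”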


It is precisely because of mistakes, ethics, and biases that science in general incorporates the standard of outside verification and the testing of scientific conclusions. These types of errors reveal why what most see as aggressive opposition from other analysts plays such a critical role in forensic science. If we cannot presume that everyone will effectively test their own conclusions, we can usually presume that our peers will. 5 It should be part of our professional responsibility to challenge all findings and to reveal any erroneous or biased findings. For those analysts who accept this responsibility, their integrity and objectivity should never be in question. This is not to say that there should be a witch hunt, but rather an objective look at all of the evidence in any given case to determine the truth.

Confirmation bias is not the sole reason for mistakes. However, confirmation bias is real, and it can play a significant part in evaluating and examining evidence from the crime scene to the laboratory. 3 We have to constantly remind ourselves that biases exist and aggressively seek out data contrary to our beliefs. Because this is not a natural thought process, it would appear that the average person will always have bias. 6 It is the responsibility of the analyst to remain objective, seek quality training, and, above all, keep his or her integrity. 23

A pilot study on confirmation bias will be conducted at the University of Southern Mississippi over the next few years under close supervision and guidelines. This preliminary research will be presented at the IAI conference in Boston, Massachusetts, July 2-7, 2006.

For further information, please contact:

Jon S. Byrd, CLPE

Criminalist II / Latent Print Examiner

Colorado Bureau of Investigation

301 South Nevada Ave.

Montrose, CO 81401

Phone: (970) 249-8621

Fax: (970) 249-6308


1. “Confirmation Bias,” Wikipedia, The Free Encyclopedia, Wikipedia.com

2. Kowit, Steve, “The Mass Suicide of the Xhosa: A Study in Collective Self-Deception,” Skeptic, Vol. 11, No. 1, 2004, www.skeptic.com

3. Nickerson, Raymond, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises,” Review of General Psychology, Vol. 2, No. 2, 1998, 175-220.

4. Evans, J. St. B. T., Bias in Human Reasoning: Causes and Consequences, Hillsdale, NJ: Erlbaum, 1989.

5. Cline, A., “Flaws in Reasoning and Arguments: Confirmation Bias.”

6. “Confirmation Bias,” The Skeptic’s Dictionary, www.skepdic.com

7. Henrion, M., and Fischhoff, B., “Assessing Uncertainty in Physical Constants,” American Journal of Physics, Vol. 54, 1986, 791-798.

8. Steele, Lisa J., “The Defense Challenge to Fingerprints,” Criminal Law Bulletin, Vol. 40, No. 3, 2004, 213-240.

9. “Confirmation Bias,” Disenchanted Dictionary, www.disenchanted.com

10. Nause, Lawren, S/Sgt., Forensic Tire Impression Identification, Royal Canadian Police Research Centre, 2001.

11. Allchin, Douglas, “Error and the Nature of Science,” 2004.

12. Plumtree, Wayne, “Expert Opinion – Fact or Fiction? Responsibilities of the Expert Witness,” SCAFO, Vol. 10, Issue 2, 1994.

13. Leo, William, “Identification Standards – The Quest for Excellence,” California Identification Digest, Vol. 12, No. 1, Jan/Feb 1996.

14. Ulrich, Clare, “Dissecting the Process of Reasoning,” Human Ecology, 32 (2), October 2004, 15-19.

15. Wertheim, P., “Faulty Forensics,” National Public Radio broadcast, June 10.

16. Sackerman, J., “Cognitive Process and the Suppression of Sound Scientific Ideas,” 1997.

17. German, Ed, “Latent Print Examination,” onin.com

18. Bevel, Tom; Gardner, Ross, Bloodstain Pattern Analysis with an Introduction to Crime Scene Reconstruction, 2nd edition, CRC Press, 2002.

19. Grobman, K. H., DevelopmentalPsychology.org, devpsy.org

20. Proia, Michael, “Development of a Model for Minimizing the Effects of Confirmation Bias on Fingerprint Identification Results,” May 1, 2005, FSC 491.

21. Ashbaugh, David R., Quantitative-Qualitative Friction Ridge Analysis: An Introduction to Basic and Advanced Ridgeology, CRC Press, 1999.

22. Cole, Simon A., Suspect Identities: A History of Fingerprinting and Criminal Identification, Harvard University Press, 2001.

23. Monsoor, Doug, “Do the Facts Tell the Whole Story?” Identification Canada, Vol. 18, No. 3, Nov/Dec 1995.
