The Human Touch
It's all right to be wrong.
The FBI Laboratory's Latent Print Unit is a jewel of the American criminal justice system, contributing to the identification and conviction of violent offenders nationwide. Its fingerprint examiners are top-notch and its database of more than 36 million sets of prints is a godsend to investigators.
But in 2004 the lab made a regrettable error; examiners wrongly identified an innocent Portland, Ore., lawyer, Brandon Mayfield, as the match to a print found on a bag of detonators linked to the March 11, 2004, terrorist train bombings in Madrid that killed 191 people and injured more than 1,400.
The lab had checks built into its identification system to ensure accurate matches. An initial examiner's conclusion that Mayfield's print was a match was verified by a second examiner, a supervisor and even a court-appointed independent examiner.
The first clue that the lab was wrong came from the Spanish National Police, who alerted the FBI that their examiners didn't think Mayfield's print was a match. The FBI dispatched an examiner to Spain to encourage the police to re-examine their conclusion. The police agreed to do so. By then FBI lab examiners knew that investigators had discovered that Mayfield was a Muslim and had represented a convicted terrorist in court. One examiner conceded to the inspector general that if Mayfield had been someone ordinary, like a "Maytag repairman," rather than a Muslim lawyer who knew a terrorist, then the FBI lab might have been more skeptical of its conclusion and listened to the Spanish police.
FBI lab officials not only questioned the police department's conflicting evidence, they also maintained that they were "absolutely confident" in their own findings, even before meeting with the police to learn more. The FBI lab should have questioned itself, too.
A few weeks after the meeting in Spain, the FBI arrested Mayfield and put him in jail. Two weeks later, the Spanish police contacted the FBI again, this time to say the print matched those of an Algerian man. After reviewing the Algerian's prints, the FBI lab retracted its conclusion, and Mayfield was exonerated.
The two sets of prints were remarkably similar. Nonetheless, the Justice Department Office of Inspector General concluded in a recent report that "overconfidence" was a cause of the misidentification, which was supposed to carry "100 percent certainty."
The inspector general found that the lab had used "circular reasoning," or reasoning "backward from features that were visible in the known prints of Mayfield. . . . Having found as many as 10 points of unusual similarity, the FBI examiners began to 'find' additional features in [the bag print] that were not really there."
To its credit, the lab has since instituted new procedures to double-check its findings. What officials recognized is that they are not infallible. When mistakes occur, managers commonly make things worse by taking actions to ensure that a similar mistake "never happens again." But infallibility is an impossible goal. What is possible is to create ways to catch mistakes before they lead to consequences like those that befell Mayfield. It's also possible to create a culture in which employees can be confident in their conclusions, yet still willing to revisit those decisions if evidence suggests they were wrong.
The goal is not to eliminate mistakes, but to acknowledge a basic truth of life: Mistakes happen.