The First Criminal Trial That Used Fingerprints as Evidence
Thomas Jennings used a freshly painted railing to flee a murder scene but unwittingly left behind something that would change detective work forever
Just after 2 a.m. on the night of September 19, 1910, Clarence Hiller woke to the screams of his wife and daughter in their home at 1837 West 104th Street in Chicago. After a spate of robberies, residents of this South Side neighborhood were already on edge. Hiller, a railroad clerk, raced to confront the intruder. In the ensuing scuffle, the two men fell down the staircase. Hiller’s daughter, Clarice, later recalled hearing three shots, followed by her mother screaming upstairs. Neighbors came running, but the man had fled the home, leaving a dying Hiller by his front door.
The unknown assailant didn’t make it far. Thomas Jennings – an African-American man who had been paroled six weeks earlier – was stopped a half-mile away wearing a torn and bloodied coat and carrying a revolver. But it was what he left behind that would be the focal point of his trial: a fingerprint on a freshly painted railing that he had used to hoist himself through a window at the Hiller house. Police photographed the print and cut away the railing itself, claiming it would prove the identity of the burglar. In the eyes of the court, they were right; Hiller’s murder would lead to the first conviction based on fingerprint evidence in a criminal trial in the United States. At times controversial, this method of solving cases endures more than a century later.
Not only has fingerprinting had staying power in the legal system, but the underlying method is fundamentally the same as when it was first introduced to American police departments. Prints are still evaluated based on the same descriptions of arches, loops and whorls written by Sir Francis Galton in the late 19th century. Further, the basic technique of collecting and comparing prints remains remarkably similar to what was applied to that rudimentary set of prints discovered at the Hiller home.
Jennings’ defense attorneys raised questions about this new—and little understood—technique, as well as whether such evidence could even be legally introduced in court (the first time it was used in Britain, they claimed, a special law was needed to make such evidence legal). The defense team even solicited prints from the public in an effort to find a match and disprove the theory that fingerprints were never repeated. A courtroom demonstration, however, backfired badly: Defense attorney W.G. Anderson’s print was clearly visible after he challenged experts to lift the impression from a piece of paper that he had touched.
This made a distinct impression on the jury as well; they voted unanimously to convict Jennings, who was sentenced to hang. The Decatur Herald called it “the first conviction on finger-printing evidence in the history of this country,” adding with dramatic flourish that “the murderer of Hiller wrote his signature when he rested his hand upon the freshly painted railing at the Hiller home.”
It’s unclear what degree Jennings’s race played a part in his trial. News reports at the time didn’t sensationalize race in their coverage, or even mention Jennings’s race. Yet it’s not hard to envision that a jury, presented with an unfamiliar technique, would have been more skeptical of it had the defendant been white.
The concept of identifying people by unique fingerprints, first laid out 18 years earlier in Europe, even had its origin in pseudoscientific racial beliefs. It was thoroughly studied and chronicled in Galton’s 1892 tome Finger Prints. (A cousin of Darwin, Galton had long pursued a series of experiments hoping to tie myriad personal and intellectual characteristics to physical traits and heredity.) Galton, who had also studied anthropometry in an effort to deduce the meaning behind physical measurements, did not find any major difference between races in his exhaustive collection of prints for research, though not for lack of effort. He wrote in Finger Prints that “it seemed reasonable to expect to find racial differences in finger marks, the inquiries were continued in varied ways until hard fact had made hope no longer justifiable.”
As journalist Ava Kofman recently outlined in the Public Domain Review, Galton’s pursuit of fingerprint science meshed well with colonialist ideology of the time. “Fingerprints were originally introduced for Europeans to distinguish between the otherwise indistinguishable mass of extra-European peoples, who themselves produced ‘indecipherable’ fingerprints,” she wrote. Later in his career, according to Kofman, Galton would engage in quantifying racial differences, inventing “scientific,” numerical measurements to categorize humans by race.
Nonetheless, the system Galton outlined for identifying unique characteristics proved effective and caught on quickly. Police in the United States were just beginning to emulate their European colleagues and started to gather prints for the purpose of identification in the early 20th century. During the 1904 World’s Fair in St. Louis, Scotland Yard sent representatives to host an exhibit demonstrating the technique, which was growing in popularity in British courts. Even Mark Twain was caught up in the speculation of how fingerprints could be used to apprehend criminals, placing “the assassin’s natal autograph” – which is to say the “blood-stained finger-prints” found on a knife – at the center of the dramatic courtroom finale in his novel Pudd’nhead Wilson, published years before the Jennings case.
After Jennings’ conviction, however, his lawyers mounted a challenge to the notion that such a newfangled and little-understood technique could be admitted in court. After more than a year in the appeals process, on December 21, 1911, the Illinois Supreme Court upheld the conviction in People v. Jennings; the sentence was carried out soon after. The court cited prior cases in Britain and published studies on the subject to lend credibility to fingerprinting. Several witnesses in the Jennings trial, it pointed out, had been trained by the venerable Scotland Yard. “This method of identification is in such general and common use that the courts cannot refuse to take judicial cognizance of it,” the ruling stated.
Fingerprinting had thereby been “proclaimed by the Supreme Court of Illinois to be sufficient basis for a verdict of death by hanging,” the Chicago Tribune reported, and it was the beginning of a shift toward the largely unquestioned use of fingerprint evidence in courtrooms across the United States. “The Jennings case really is the earliest case – earliest published case – in which you’ll find any discussion of fingerprint evidence,” says Simon A. Cole, author of Suspect Identities: A History of Fingerprinting and Criminal Identification and professor of criminology, law and society at the University of California, Irvine School of Social Ecology. “So, in that sense it really is a precedent for the whole country.”
People v. Jennings further specified that fingerprint evidence was something the average juror would need expert interpretation to understand. “Expert testimony is admissible when the subject matter of the inquiry is of such a character that only persons of skill and experience are capable of forming a correct judgment as to any facts connected therewith.” The inclusion of this statement was crucial in legal terms: some level of human judgment and interpretation was a given, built into the courtroom process when fingerprint evidence was presented to a jury. The degree of subjectivity that represents, and what potential room for error – however small – is acceptable, is still actively debated more than a century later.
Beginning with the Jennings trial, two fundamental questions have formed the basis of any challenge to fingerprint evidence’s admissibility in court. Is the technique itself sound (the primary issue when it was first introduced)? And how accurate is the evidence when interpreted and applied to a specific case? “The uniqueness of fingerprints is really kind of beside the point of the accuracy of the identification,” says Cole. “The best way to understand that is to think about eyewitness identification – nobody disputes that all human faces are in some sense unique, even those of identical twins, but nobody reasons from that that eyewitness identification must be 100 percent accurate.” Juries like the one that convicted Jennings were initially focused on whether prints were repeated, “whereas really what we need to know is can people match them accurately.”
It is this gray area that defense attorneys seize on in thorny legal cases. Following the 1993 Supreme Court ruling in Daubert v. Merrell Dow Pharmaceuticals, Inc., judges were required to apply what is known as the Daubert standard to determine if a witness’s testimony can be considered scientific. This is based on a list of factors, including how the technique itself has been tested, error rates and what regulations govern its usage. These standards were more stringent than what had previously been required, putting the onus on judges to determine what could be considered by a jury as scientific evidence.
Fingerprinting techniques came under marked public scrutiny in 2004 when an Oregon lawyer named Brandon Mayfield was arrested in connection with a terrorist attack on a commuter train in Madrid based on a mistaken match of a partial print gathered at the scene. The FBI later publicly apologized to Mayfield, but such high-profile incidents inevitably raise questions about whether other mistakes have gone unnoticed, and they fuel skeptics and lawyers who contest the often presumed infallibility of such evidence.
As part of a broader re-examination of forensics that had come to be widely accepted over the years, the National Academy of Sciences released a report in 2009 that addressed some of these shortcomings, acknowledging that “not all fingerprint evidence is equally good, because the true value of the evidence is determined by the quality of the latent fingerprint image. These disparities between and within the forensic science disciplines highlight a major problem in the forensic science community: The simple reality is that the interpretation of forensic evidence is not always based on scientific studies to determine its validity.”
Fingerprint examiners rely on years of experience, testing and verification by a second examiner to bolster the reliability of their determination. Echoing the reasoning in the People v. Jennings ruling, fingerprint examiner William Leo writes that “the purpose of the expert witness in the legal system is to interpret information and form a conclusion that a jury of lay persons would be incapable of doing…A fingerprint examiner’s conclusion is not based upon a personal opinion, but rather on an evaluation of the detail present using the knowledge and skills acquired through training, education and expertise.”
“You’ll probably find for the most part that most people are in agreement that most of the time if you have a decent print of some size that is of decent quality, you can make an identification in some reasonable percentage of cases,” says David A. Harris, professor of law at the University of Pittsburgh and author of Failed Evidence: Why Law Enforcement Resists Science. “Where things have begun to come into question in the last 20 years is the way that those identifications have been done, the certainty with which they have been presented, the terminology around that and just a general harder look at all the forensic sciences.”
When it comes to fingerprint evidence, uncertainty has not been eliminated, but is now more likely to be acknowledged and addressed. And despite greater skepticism in recent decades and the more stringent caveats introduced by Daubert, courts have not significantly curtailed the use of fingerprint evidence, nor the reliance on examiners to interpret this evidence for the jury.
“A hundred years is kind of an impressive run,” says Cole. “There are some reasons for that – I think the fingerprint patterns are very information rich, you can see that there’s a lot of information packed into a small area.” When Thomas Jennings placed his hand on a porch railing in the middle of the night, he unwittingly introduced that valuable information into American courtrooms, influencing the outcome of innumerable cases for more than a century and counting.