Fingerprints were used for identification in ancient China and Babylonia to mark business deals and correspondence. But despite centuries of study, their value as a crime-solving tool wasn't embraced until the 1880s, and it wasn't until 1892, in Argentina, that they nailed their first murderer.
A horrible crime scene greeted local investigators in the beach town of Necochea, where Francesca Rojas lived with her young son and daughter. The 27-year-old’s throat was slashed, but she was still alive; the children had been stabbed to death. Rojas blamed a would-be suitor for the attack, and though the suspect was held and tortured for over a week — including spending a night tied to the corpses of the tiny victims — he refused to confess. (In fact, he had an alibi.) Nearly a month passed before the Rojas case attracted the attention of police in the nearby city of La Plata, including an inspector who’d been trained by Juan Vucetich, one of fingerprinting’s earliest champions.
Vucetich had extensively studied both the Bertillon system (which used a series of body measurements to identify criminals; it's the reason mug shots are taken in profile and front-facing) and "Galton's Details," which identified the unique characteristics of individual fingerprints. The Rojas case, solved when a bloody thumbprint found at the crime scene was traced not to the accused suitor but to Francesca Rojas herself, who soon confessed, convinced him that fingerprints were the better crime-solving method, and he continued to refine the technique throughout his career, publishing the influential Dactiloscopía Comparada ("Comparative Dactyloscopy") in 1904.
Fingerprint identification gained traction around the world with the turn of the 20th century, and was first used to solve a British murder case in 1905. It was a gruesome one: married shopkeepers Thomas and Ann Farrow were found beaten to death, with no discernible clues other than an empty cash box. When Scotland Yard searched for prints, they found one that matched neither of the Farrows. Though the rogue print wasn't in their ever-growing files (the image below offers an idea of what Scotland Yard's individual fingerprint sheets looked like some 50 years later), a witness recalled seeing two men lurking around on the day of the murders: ne'er-do-well brothers Alfred and Albert Stratton.
Though the witness proved unreliable on the stand, the thumbprint that positively identified Alfred Stratton became the proverbial nail in the coffin. The defense, meanwhile, was hampered by its own unreliable witness: a fingerprint expert who lost credibility when it emerged that he'd offered his services to both sides and ended up testifying for whichever paid the higher price. Still, the print was a match, and though fingerprint science was a new concept to the members of the jury, the prosecution offered a convincing crash course:
Detective Inspector Collins spoke as an expert witness, explaining how fingerprinting worked and informing the jury that of the 800,000+ individual digit impressions held on file by Scotland Yard, he had never found two impressions to appear the same. He produced enlarged images of the thumbprints and identified the points of similarities.
That, combined with some damning circumstantial testimony from the brothers’ girlfriends, was enough to convict Alfred and Albert. Both were hanged soon after.
In America, fingerprints were a little more familiar to the public by the time they actually helped solve a murder; they had been used for years by both the Federal Bureau of Prisons and the US Army. The case in question became an important legal benchmark for future cases involving fingerprint evidence. It was 1910, and Illinois man Clarence B. Hiller was shot to death after struggling with an intruder who'd broken into the house where his family, including four children, was sleeping. The unknown killer had entered through a freshly painted kitchen window ... leaving four clearly embedded prints behind.
The prints were matched to those of Thomas Jennings, recently paroled after serving time for burglary, who was nabbed near the Hiller home while carrying a loaded gun and sporting injuries consistent with a fight. (The ensuing People v. Jennings featured three fingerprint experts who had been schooled by Scotland Yard detectives when they were in St. Louis attending the 1904 World's Fair.) Jennings was convicted and sentenced to death; in rejecting his appeal, which questioned the legitimacy of fingerprint evidence, the Supreme Court of Illinois ruled that "standard authorities on scientific subjects discuss the use of fingerprints as a system of identification, concluding that experience has shown it to be reliable."
Thereafter, fingerprint evidence was no longer viewed as a novelty. In 1924, the FBI began collecting fingerprints through its Identification Division, amassing over 200 million cards. The Bureau went digital in 1991 with the Automated Fingerprint Identification System, and upgraded to the Integrated AFIS system in 1999.
Since then, the technology has grown more sophisticated with each new generation, and though other crime-solving methods have followed suit (imagine if Argentine criminologist Vucetich had had access to DNA testing!), fingerprints are still widely relied upon as an infallible identification technique, even in realms beyond law enforcement. In the words of Apple, describing its Touch ID feature to unlock iPhones: "Your fingerprint is the perfect password. You always have it with you. And no one can ever guess what it is."
Top photo: Keystone/Getty Images; lower photo: Hulton Archive/Getty Images