

How rare is your fingerprint?

U. BUFFALO (US) — A computer scientist has figured out a way to determine how rare a fingerprint is—and how likely it is to belong to a particular crime suspect.

“When we look at DNA, we can say that the likelihood that another person might have the same DNA pattern as that found at a crime scene is one in 24 million,” says Sargur Srihari, a professor at the University at Buffalo.

“Unfortunately, with fingerprint evidence no such probability statement can be made. Our research provides the first systematic approach for computing the rarity of fingerprints in a scientifically robust and reliable manner.”

Srihari’s approach is the first computational method for determining the rarity of fingerprints. He presented the findings earlier this week at the Neural Information Processing Systems (NIPS) conference in Vancouver and discussed the work in a recent article in IEEE Spectrum.

By combining machine learning with the automated extraction of specific patterns or features in a fingerprint, and then comparing those features against large databases of random fingerprints, Srihari and co-researchers can estimate the probability that a specific fingerprint would randomly match another in a database of a given size.
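The dependence on database size can be illustrated with a simple back-of-the-envelope calculation (this is an intuition-building sketch with hypothetical numbers, not the researchers' actual model): if the probability that one random print coincidentally matches a given feature configuration is p, then the chance of at least one coincidental match among N random prints is 1 − (1 − p)^N.

```python
# Illustrative sketch: probability of at least one coincidental match
# in a database of N random prints, given a per-print match probability p.
# (Hypothetical values; not the study's model or its numbers.)

def random_match_probability(p: float, n: int) -> float:
    """P(at least one of n independent random prints matches) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

# Suppose a configuration has a one-in-a-million coincidental-match chance
p = 1e-6
for n in (1_000, 100_000, 10_000_000):
    print(f"database of {n:>10,} prints: {random_match_probability(p, n):.4f}")
```

The same feature configuration that is compelling evidence against a database of a thousand prints becomes far weaker against a database of millions, which is why a rarity statement must account for database size.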

“Current procedures for forensics do not provide a measured accuracy for fingerprint analysis,” says Srihari, who in 2001 provided the first scientific evidence that fingerprints truly are unique.

The research lays the groundwork for the development of computational systems that could, for the first time, quickly and objectively reveal just how meaningful the fingerprint evidence in a given case is.

The research directly addresses some of the profound shortfalls identified by the National Academy of Sciences’ Committee on Identifying the Needs of the Forensic Science Community, on which Srihari served with other national experts from 2007 to 2009. Some of the committee’s recommendations dealt specifically with fingerprints, including the need for baseline standards to be used with computer algorithms to map, record, and recognize features in fingerprint images.

Part of the challenge stems from the intrinsic nature of fingerprint evidence, he says: latent prints are invisible to the naked eye and have to be lifted using either powder or ultraviolet illumination.

According to Srihari, two types of uncertainty are involved in fingerprint analysis: the similarity between two fingerprints, and the rarity of a given configuration of ridge patterns.

“Human examiners describe the results of their analyses in one of three ways: likely to confirm identity, called individualization; unlikely to confirm identity, called exclusion; or inconclusive,” he says. “A probability statement as to how rare a specific fingerprint is would be a dramatic improvement in the way that such evidence is currently described to juries.”

Forensic analysis depends on something called a likelihood ratio: the probability that the evidence found at the scene and the known data—for example, a suspect’s fingerprint—come from the same source, divided by the probability that they come from different sources.
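In schematic form, the ratio is a single division of two probabilities. The sketch below uses made-up numbers purely to show how the ratio is read; it is not the study's computation.

```python
# Schematic likelihood-ratio calculation (illustrative values only).
# LR = P(evidence | same source) / P(evidence | different sources)

def likelihood_ratio(p_same: float, p_diff: float) -> float:
    """Ratio of the evidence's probability under the two competing hypotheses."""
    if p_diff == 0:
        raise ValueError("P(evidence | different sources) must be nonzero")
    return p_same / p_diff

# Hypothetical: the observed ridge features are quite probable if the prints
# share a source, and rare if they do not.
lr = likelihood_ratio(p_same=0.8, p_diff=0.0001)
print(f"likelihood ratio: {lr:,.0f}")
```

A ratio far above 1 supports the same-source hypothesis, a ratio far below 1 supports different sources, and a ratio near 1 is inconclusive; rarity enters through the denominator, since a rarer ridge configuration makes a coincidental different-source match less probable.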

The new method uses machine learning—a type of artificial intelligence in which machines learn from examples using statistics and probability—to predict the core point: the point, usually near the center of the fingerprint, around which the ridges flow.

“In forensic analysis, the fingerprints are usually incomplete,” Srihari explains. “Thus a guess has to be made as to which part of the finger it came from. Our approach allows us to predict the core point and thus orient the print for further analysis.”

The research was supported by a grant from the U.S. Department of Justice.
