An efficient framework for automated latent fingerprint recognition
Abstract
Latent fingerprints, which are imperative for forensic investigations, are seldom lifted perfectly. These unintentional impressions left at crime scenes are mostly partial, with insufficient features for automatic recognition and analysis. Furthermore, existing acquisition approaches rely on single-shot, touch-based capture, in which reagents are physically applied to crucial evidence for examination. This thesis presents an Automated Patch-based Latent Fingerprint Recognition (AP-LFR) system for reliable recognition from partial samples. Experiments were conducted on samples digitally captured with touchless Reflected Ultra Violet Imaging System (RUVIS) equipment, which can lift multiple instances of evidence at high resolution. The proposed patch estimation algorithm identifies features, in place of manual minutiae matching, to estimate the optimal patch size. Classical augmentations were applied to simulate prints from a realistic crime scene, and GAN-based augmentations to support deep feature extraction. A Patch-based Latent Fingerprint database (PLF-RUVIS-DB) of 9000 partial samples was thus created from the initial 370 complete samples. The recognition capability on partial samples was then evaluated across different shallow and deep learning models, among which the VGG16 and ResNet50 architectures performed best. After fine-tuning, the configured model achieved a maximum accuracy of 96% with ResNet50 as the backbone architecture and a multiclass SVM as the subject classifier. Weighted average fusion further improved the accuracy by approximately 2%. Existing patch-based recognition approaches report accuracies between 68% and 84% on different benchmark datasets. In contrast, the proposed model achieved 98% accuracy on the RUVIS dataset and 96% on the standard NIST SD27 dataset, indicating better generalizability.
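The weighted average fusion step mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function name, the two-backbone setup, and all score and weight values are hypothetical, assuming each backbone outputs a normalized per-class score vector and each model is weighted (e.g. by its validation accuracy) before the scores are averaged.

```python
import numpy as np

def weighted_average_fusion(score_vectors, weights):
    """Fuse per-model class-score vectors by a weighted average.

    score_vectors: list of 1-D arrays, one per model, each holding
                   normalized class scores for a probe patch.
    weights:       per-model weights (hypothetical, e.g. validation
                   accuracy); renormalized to sum to 1 before fusing.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # renormalize so the fused scores remain a distribution
    return sum(wi * np.asarray(s, dtype=float)
               for wi, s in zip(w, score_vectors))

# Hypothetical class scores from two backbones (e.g. VGG16 and ResNet50)
vgg_scores = np.array([0.2, 0.7, 0.1])
resnet_scores = np.array([0.1, 0.8, 0.1])

fused = weighted_average_fusion([vgg_scores, resnet_scores],
                                weights=[0.48, 0.52])
predicted_class = int(np.argmax(fused))
```

Because both inputs favour the same class here, fusion simply reinforces that decision; its benefit in practice comes from cases where the backbones disagree and the weighting arbitrates between them.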