Noncontact Fingerprint Algorithms Competition Overview
1. Register by the timeline to participate in the competition (indicate if you wish to be an anonymous participant).
2. Sign and submit the Database License Agreement.
3. Follow the instructions for submitting software.
4. The LivDet team will evaluate the submitted algorithm only if all submission criteria are met.
5. Performance of each submitted algorithm will be assessed.
6. Results will be presented at IJCB 2023.
7. LivDet 2023 algorithm output liveness scores will be made available to the algorithm submitter after the competition.
The LivDet-2023 Noncontact Fingerprint Algorithms Competition dataset will be available after the IJCB 2023 conference. The competition report is currently under review at IJCB. To request the dataset, please contact livdet@gmail.com
Release of Training Dataset
Participants can use any publicly or proprietarily available datasets to train their algorithms. We will share the training dataset with participants to provide a comprehensive understanding of the live and known spoof samples that will be used in the test dataset.
Each participant must sign the database release agreement to receive the training dataset:
Participants shall request and send the signed noncontact fingerprint dataset release agreement to: purnaps@Clarkson.edu
The training dataset will contain live and spoof images of a single fingertip, collected using various smartphone back cameras.
The known spoof data categories are:
- Ecoflex
- Woodglue
Submission Instructions for Noncontact-based Fingerprint Algorithms
Each submitted algorithm must meet the following requirements:
1. Participants will send their noncontact fingerprint algorithm to the email address: purnaps@clarkson.edu
2. Installation Requirements:
- Windows 11, 64-bit OS:
  - .exe or similar executable only, for the Windows OS.
- Ubuntu OS (version 20.04 only):
  - Necessary software packages, versions, and installation instructions for Ubuntu.
- All necessary software modules required for installation must be provided.
- Algorithms should be able to process common image formats (e.g., .png, .jpg, .jpeg, .bmp).
- Only CPU versions of the algorithms will be accepted for evaluation.
The algorithm output file format must be a .txt file, with a score saved after each single fingertip image (live or spoof) presented to the algorithm has been processed.
- The output score for the processed image (liveness score only) is the posterior probability of the live class given the image, or a degree of liveness normalized to the range 0 to 100 (100 is the maximum degree of liveness; 0 means the image is fake). If the algorithm is unable to process the image, the corresponding output must be -1000 (failure to enroll).
- The image processing duration (in seconds): the time required for the data capture subsystem and comparison subsystem to acquire and process a sample, inclusive of PAD subsystem processing time.
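For concreteness, the expected per-image behavior can be illustrated with a minimal Python sketch. The script name, the predict_liveness() placeholder, and the tab-separated output layout are illustrative assumptions; only the 0-100 score range, the -1000 failure code, and the .txt output file are specified above.

    # Minimal sketch of a submission-style command-line entry point.
    # Hypothetical names: run_pad.py, predict_liveness(); only the score
    # range, the -1000 failure code, and the .txt output are required.
    import sys
    import time

    from PIL import Image  # reads .png, .jpg, .jpeg, .bmp

    def predict_liveness(image):
        """Participant's PAD model goes here.

        Must return a liveness score in [0, 100]
        (100 = maximum degree of liveness, 0 = fake).
        """
        raise NotImplementedError

    def process_image(image_path):
        start = time.time()
        try:
            image = Image.open(image_path).convert("RGB")  # any input size
            score = predict_liveness(image)
        except Exception:
            score = -1000  # failure to enroll
        return score, time.time() - start

    if __name__ == "__main__":
        # Usage: python run_pad.py output.txt image1.png image2.jpg ...
        output_path, image_paths = sys.argv[1], sys.argv[2:]
        with open(output_path, "w") as out:
            for path in image_paths:
                score, duration = process_image(path)
                out.write(f"{path}\t{score}\t{duration:.3f}\n")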
Performance Evaluation of Noncontact Fingerprint Algorithms
Laboratory staff will evaluate each submitted algorithm with one single fingertip image (live or spoof) at a time and collect the corresponding performance scores.
The submitted algorithms must be able to process RGB single fingertip images of varying sizes.
The parameters adopted for the performance evaluation of each successfully submitted algorithm will be the following:
- Attack presentation classification error rate* (APCER):
proportion of attack presentations using the same PAI species incorrectly classified as bona fide presentations at the PAD subsystem in a specific scenario
- Bona fide presentation classification error rate (BPCER):
proportion of bona fide presentations incorrectly classified as presentation attacks at the PAD subsystem in a specific scenario
- Attack presentation non-response rate (APNRR):
proportion of attack presentations using the same PAI species that cause no response at the PAD subsystem or data capture subsystem
- Weighted Average of APCER (APCERaverage): average of APCER across all PAIs, weighted by the sample counts in each PAI category
- Average Classification Error Rate (ACER): average of APCERaverage and BPCER. (Only for the purpose of competition ranking)
- Algorithm processing duration (A-PD): duration required for the algorithm to acquire and process a sample, inclusive of PAD subsystem processing duration
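As a point of reference, the sketch below shows how these rates could be computed from the collected liveness scores at the threshold of 50 mentioned under Definitions. The evaluate() helper, its input layout, and the treatment of failed bona fide samples are assumptions, not part of the official evaluation protocol.

    # Sketch of the evaluation metrics above (illustrative only).
    from collections import defaultdict

    THRESHOLD = 50        # scores >= 50 are classified as bona fide (live)
    FAILURE_CODE = -1000  # non-response / failure to enroll

    def evaluate(samples):
        """samples: list of (label, pai_species, score) tuples.

        label is "bona_fide" or "attack"; pai_species is None for bona
        fide presentations; score is in [0, 100] or -1000 on failure.
        """
        bona_fide = [s for lbl, _, s in samples if lbl == "bona_fide"]
        attacks = defaultdict(list)
        for lbl, pai, s in samples:
            if lbl == "attack":
                attacks[pai].append(s)
        n_attacks = sum(len(scores) for scores in attacks.values())

        # BPCER: bona fide presentations classified as attacks
        # (failed bona fide samples, score -1000, count as errors here)
        bpcer = sum(s < THRESHOLD for s in bona_fide) / len(bona_fide)

        # APCER per PAI species: attacks classified as bona fide
        apcer = {pai: sum(s >= THRESHOLD for s in scores) / len(scores)
                 for pai, scores in attacks.items()}

        # APNRR: attack presentations that produced no response
        apnrr = sum(s == FAILURE_CODE
                    for scores in attacks.values() for s in scores) / n_attacks

        # APCERaverage: APCER weighted by the sample count of each PAI
        apcer_average = sum(apcer[pai] * len(attacks[pai])
                            for pai in attacks) / n_attacks

        # ACER: average of APCERaverage and BPCER (used only for ranking)
        acer = (apcer_average + bpcer) / 2
        return {"BPCER": bpcer, "APCER": apcer, "APNRR": apnrr,
                "APCERaverage": apcer_average, "ACER": acer}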
Definitions:
- Presentation Attack (PA)**: presentation to the biometric data capture subsystem with the goal of interfering with the operation of the biometric system
- Presentation Attack Instrument (PAI): biometric characteristic or object used in a presentation attack
- *Rates are computed assuming a decision threshold of 50. Quality and match scores will be computed from the collected images to help maintain fairness across the submitted systems.
- **Here, PAI species are the known and unknown spoof recipes that will be used in this competition.
Declaration of the Winner
- The winner of the algorithm competition category will be determined by the minimum overall classification error (ACER). One winner will be awarded.