We are now accepting submissions for the DFW2019 competition! Deadline: July 20, 2019!

With recent advancements in deep learning, the capabilities of automatic face recognition have increased significantly. However, face recognition in unconstrained environments with non-cooperative users is still a research challenge, pertinent for users such as law enforcement agencies. While several covariates such as pose, expression, illumination, aging, and low resolution have received significant attention, “disguise” is still considered an arduous covariate of face recognition. Disguise as a covariate involves both intentional and unintentional changes to a face through which one can either obfuscate his/her identity or impersonate someone else’s identity. The problem is further exacerbated in unconstrained or “in the wild” scenarios. However, disguise in the wild has not been studied in a comprehensive manner, primarily due to the unavailability of such a database. As part of the International Workshop on Disguised Faces in the Wild at ICCV2019, a competition is being held in which participants are asked to report their results on the Disguised Faces in the Wild (DFW) 2019 database.

Researchers who submit to the competition are highly encouraged to submit their paper to the DFW Workshop@ICCV2019 as well! The DFW 2019 dataset, protocols, and instructions for the competition will be released soon. Please fill out this form to get all the updates!

DFW2019 Competition Details

The DFW2019 competition builds upon the DFW2018 competition and encourages researchers to develop algorithms robust to disguise variations. To this end, we have prepared a novel DFW2019 dataset, which will form the test set for this competition. The training partition of the DFW2018 dataset can be used as the training set, while the test partition of the DFW2018 dataset will form the validation set. That is, for the DFW2019 competition:

  • Training Set: Training partition of DFW2018 dataset
  • Validation Set: Testing partition of DFW2018 dataset
  • Testing Set: DFW2019 dataset

The competition is now live! Please register below to participate and get details regarding obtaining the datasets:

DFW2019 Dataset

The DFW2019 dataset contains over 3800 images of 600 subjects, encompassing different disguise variations, including variations due to bridal make-up and plastic surgery. The dataset contains anonymized images named 0.jpg to 3839.jpg. For this competition, the submitted algorithms will be evaluated on the DFW2019 dataset. The dataset is available as a password-protected zip file along with the other files. The password is provided to participants after they fill out the registration form.

DFW2018 Dataset

The DFW2018 dataset contains over 11,000 images of 1,000 subjects. The dataset follows a pre-defined protocol, in which images of 400 subjects form the training set and the remaining 600 subjects constitute the test set. For this competition, the training set corresponds to the training partition of the DFW2018 dataset, and the validation set corresponds to the test partition of the DFW2018 dataset.
The dataset contains a folder for each subject, consisting of the subject's normal, validation, disguised, and impersonator images. The nomenclature of the dataset is as follows (a small parsing sketch follows the list below):

  • Subject normal images are named as xxx.jpg. For example: Anna_Nicole.jpg
  • Subject validation images follow the xxx_a.jpg naming format. For example: Anna_Nicole_a.jpg
  • Disguised images are named as xxx_h_0xx.jpg. For example: Anna_Nicole_h_001.jpg
  • Impersonator images are named as xxx_I_0xx.jpg. For example: Anna_Nicole_I_001.jpg
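
To make the naming convention concrete, below is a minimal Python sketch of how these filenames might be parsed into a subject name and an image type. The function name and the returned type labels are our own illustrative choices and are not part of the dataset release.

```python
import re

def parse_dfw2018_filename(filename):
    """Return (subject, image_type) for a DFW2018 image filename.

    Illustrative helper only; assumes the naming convention listed above.
    """
    stem = filename[:-len(".jpg")] if filename.endswith(".jpg") else filename
    m = re.match(r"^(?P<subject>.+?)_(?P<tag>a|h|I)(?:_(?P<idx>\d+))?$", stem)
    if m is None:
        return stem, "normal"                 # e.g. Anna_Nicole.jpg
    subject, tag = m.group("subject"), m.group("tag")
    if tag == "a":
        return subject, "validation"          # e.g. Anna_Nicole_a.jpg
    if tag == "h":
        return subject, "disguised"           # e.g. Anna_Nicole_h_001.jpg
    return subject, "impersonator"            # e.g. Anna_Nicole_I_001.jpg
```

For example, parse_dfw2018_filename("Anna_Nicole_h_001.jpg") would return ("Anna_Nicole", "disguised").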

Participants are encouraged to refer to the following two papers for the DFW2018 dataset:

  • M. Singh, R. Singh, M. Vatsa, N. Ratha, and R. Chellappa, Recognizing Disguised Faces in the Wild, IEEE Transactions on Biometrics, Behavior, and Identity Science, Volume 1, No. 2, Pages 97-108, 2019.
  • V. Kushwaha, M. Singh, R. Singh, M. Vatsa, N. Ratha, and R. Chellappa, Disguised Faces in the Wild, IEEE International Conference on Computer Vision and Pattern Recognition Workshop on Disguised Faces in the Wild, 2018.

The dataset is available as a password-protected zip file along with the other files. The password is provided to participants after they fill out the registration form.

Submission

At the time of submission, participants will be required to submit the following:

  • Participants are required to generate similarity scores (a larger value indicates greater similarity) from their biometric matchers. If a participant's matcher produces dissimilarity scores instead, the scores should be converted (for example, by negation) so that larger values indicate greater similarity. Participants have been provided with the testing set, from which they are required to generate and submit similarity matrices of size 3840x3840, the size of the testing data. The ordering of test images is the same across rows and columns. The (i,j) entry of a similarity matrix is the similarity score generated by the algorithm when image i of the testing set is matched against image j as the probe sample. Entry (i,i) corresponds to matching an image with itself. A minimal sketch of assembling such a matrix is given after this list.
  • Participants are also required to submit the score matrix on the validation partition (7771x7771) for the overall protocol. The ordering should be exactly the same as the order given in the text file containing subject names.
  • Participants are required to submit the matrices along with the companion data for the corresponding 1,000-point ROC curves. The match scores computed with validation and disguised images will comprise the genuine scores, while impostor scores will include match scores generated from impersonator images as well as cross-subject match scores (see the score-splitting sketch after this list).
  • While it is not mandatory, we also encourage participants to submit their models/executables/APIs for verification of the results.
  • Participants can choose to remain anonymous in the analysis and report. Participants must explicitly make this request; by default, results will be associated with participants. If you wish to keep your submission anonymous, kindly send an e-mail to maneets@iiitd.ac.in and rsingh@iiitd.ac.in with the subject line "[DFW] Request for Anonymity".
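
As a rough guide, the following Python sketch shows one way the 3840x3840 similarity matrix could be assembled. It assumes a feature extractor followed by cosine similarity as the matcher; extract_feature and build_similarity_matrix are hypothetical names rather than anything provided by the organizers, and a real submission would substitute the participant's own matcher.

```python
import numpy as np

def build_similarity_matrix(image_paths, extract_feature):
    """Assemble an N x N similarity matrix (larger value = more similar).

    image_paths must follow the official test-set ordering (0.jpg ... 3839.jpg);
    extract_feature is a stand-in for the participant's own feature extractor.
    """
    feats = np.stack([extract_feature(p) for p in image_paths])     # (N, d) features
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)    # L2-normalise rows
    sim = feats @ feats.T                                           # cosine similarity
    return sim.astype(np.float32)

# If a matcher returns distances (dissimilarities) instead, negate them so that
# larger values still indicate greater similarity, e.g.:
#   sim = -distance_matrix
```

The same routine applies to the 7771x7771 validation matrix, provided the images are ordered exactly as in the text file of subject names.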
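
For the ROC companion data, the sketch below illustrates one reading of the genuine/impostor definitions above: same-subject scores involving validation or disguised images are treated as genuine, while scores involving impersonator images or different subjects are treated as impostor. The function names and label arrays are hypothetical, and scikit-learn's roc_curve is used only as an example of computing the curve.

```python
import numpy as np
from sklearn.metrics import roc_curve

def genuine_impostor_scores(sim, subjects, types):
    """Split a score matrix into genuine and impostor scores.

    subjects[i] is the subject label and types[i] the image type
    ("normal", "validation", "disguised", "impersonator") of image i,
    e.g. obtained by parsing the DFW2018 filenames.
    """
    genuine, impostor = [], []
    n = sim.shape[0]
    for i in range(n):
        for j in range(i + 1, n):                          # skip (i, i): same image
            same_subject = subjects[i] == subjects[j]
            has_impersonator = "impersonator" in (types[i], types[j])
            if same_subject and not has_impersonator:
                genuine.append(sim[i, j])                   # validation/disguised matches
            else:
                impostor.append(sim[i, j])                  # impersonator or cross-subject
    return np.asarray(genuine), np.asarray(impostor)

def roc_points(genuine, impostor):
    """Compute ROC operating points from genuine/impostor score lists."""
    labels = np.concatenate([np.ones(len(genuine)), np.zeros(len(impostor))])
    scores = np.concatenate([genuine, impostor])
    fpr, tpr, _ = roc_curve(labels, scores)
    return fpr, tpr
```

The resulting (fpr, tpr) points can then be subsampled or interpolated down to the 1,000 points requested for the ROC curves.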

Evaluations

Evaluations will be performed on the DFW2019 dataset for the following four protocols:

  • Impersonation
  • Obfuscation
  • Plastic Surgery
  • Overall

Important Dates

  • Result submission to organizers: July 10, 2019, extended to July 20, 2019
  • Result declaration and invitation to top-performing teams for paper submission: July 15, 2019, extended to July 22, 2019
  • Paper submission for DFW2019 workshop: July 31, 2019
  • Camera-ready deadline: August 30, 2019