Humans are the dominant objects of interest in the vast networks of surveillance cameras deployed in buildings, airports, streets, and elsewhere. The captured images and videos enable applications such as face recognition, person re-identification, and crowd and behavior analysis. Certain applications, such as face recognition, rely on the specific facial characteristics associated with identifiable information, which is typical of biometrics problems. For a wide range of other applications, however, access to identifiable facial information is unnecessary, yet the recording of faces raises privacy concerns among the general public. One solution to this concern is face de-identification, which aims to eliminate identifiable information by modifying a face image while preserving data utility, a counter problem to biometrics.

We recognize the need to de-identify a face image while preserving a large set of facial attributes. The proposed approach jointly models face de-identification and attribute preservation in a unified optimization framework. Specifically, a face image is represented by the shape and appearance parameters of an Active Appearance Model (AAM). We select the k images whose attributes are most similar to those of a test image, then formulate an objective function and use gradient descent to learn the optimal weights for fusing the k images. As shown in Fig. 1, given a test image, our method changes the identity of the subject while preserving as many attributes as possible.
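The attribute-based selection step can be sketched as follows. This is an illustrative implementation under assumed conventions (binary attribute vectors, similarity measured as the number of matching attributes); the function name and data layout are hypothetical, not from the paper.

```python
import numpy as np

def top_k_by_attributes(test_attrs, gallery_attrs, k):
    """Return indices of the k gallery images whose binary attribute
    vectors agree with the test image on the most attributes."""
    # Count matching attributes between the test image and each gallery image.
    matches = (gallery_attrs == test_attrs).sum(axis=1)
    # Indices of the k largest match counts, most similar first.
    return np.argsort(-matches)[:k]

# Toy example: 5 gallery images, 4 binary attributes each.
gallery = np.array([[1, 0, 1, 1],
                    [0, 0, 1, 0],
                    [1, 1, 1, 1],
                    [0, 1, 0, 0],
                    [1, 0, 1, 0]])
test = np.array([1, 0, 1, 1])
print(top_k_by_attributes(test, gallery, k=2))
```

In practice the attributes would come from the learned attribute classifiers applied to the gallery set, and ties in the match count could be broken by classifier confidence rather than index order.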


Figure 1: The proposed APFD method can change the identity of a face image while preserving facial attributes.

As shown in Fig. 2, the proposed method consists of two parts. First, an AAM, a set of facial attribute classifiers, and a face verification classifier are learned. We apply the model and classifiers to the gallery set to compute the attributes and the shape and appearance parameters of all images. Second, given a test image from the same set, we find the top k images that share the largest number of similar attributes with the test image and retrieve their shape and appearance parameters. Similar to the k-Same-M algorithm, we synthesize a new image by linearly weighting the shape and appearance parameters of the k images. Instead of applying constant weights, we formulate an objective function to estimate the optimal weights such that the de-identified image and the original image share as many attributes as possible while being classified as two different subjects.
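The weighted fusion and weight estimation can be sketched as below. This is a minimal stand-in, not the paper's method: the real objective combines attribute-classifier and face-verification scores, whereas this toy loss only pushes the fused parameters at least a margin away from the test image's parameters (identity change) while keeping them near the k-image mean (a crude proxy for attribute preservation). All names and the loss are assumptions for illustration.

```python
import numpy as np

def fuse(params, w):
    """Linearly combine the AAM shape/appearance parameters of the
    k selected images with weights w (w sums to 1)."""
    return w @ params  # (k,) @ (k, d) -> (d,)

def learn_weights(params, test_params, steps=200, lr=0.1, margin=1.0):
    """Toy gradient-descent sketch of the weight estimation."""
    k = params.shape[0]
    w = np.full(k, 1.0 / k)               # start from uniform weights, as in k-Same-M
    mean = params.mean(axis=0)
    for _ in range(steps):
        fused = w @ params
        diff_id = fused - test_params     # want ||diff_id|| >= margin (new identity)
        diff_at = fused - mean            # want fused near the k-image mean
        grad = 2 * params @ diff_at
        if np.linalg.norm(diff_id) < margin:
            grad -= 2 * params @ diff_id  # hinge term: push away from the test image
        w -= lr * grad
        w = np.clip(w, 0.0, None)
        w /= w.sum()                      # project back onto the simplex
    return w

# Toy example: k = 3 neighbors with 2-D parameter vectors.
params = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
test_params = np.array([1.0, 0.0])
w = learn_weights(params, test_params)
deidentified = fuse(params, w)
```

The simplex projection (clip and renormalize) keeps the result a convex combination of the k images, which is what makes the fused parameters decode to a plausible face under the AAM.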


Figure 2: Overview of the proposed Attribute-Preserved Face De-identification (APFD) algorithm.

APFD Dataset

You can download the 200-image gallery/test dataset from here.

If you use the APFD dataset, please cite the paper:

Publications

  • Attribute Preserved Face De-identification
    Amin Jourabloo, Xi Yin, Xiaoming Liu
    Proc. 8th IAPR International Conference on Biometrics (ICB 2015), Phuket, Thailand, May 2015 (equal contribution by first two authors)
    Bibtex | PDF
  • @inproceedings{attribute-preserved-face-de-identification,
      author    = {Amin Jourabloo and Xi Yin and Xiaoming Liu},
      title     = {Attribute Preserved Face De-identification},
      booktitle = {Proc. 8th IAPR International Conference on Biometrics},
      address   = {Phuket, Thailand},
      month     = {May},
      year      = {2015},
    }