
Machine learning research into facial recognition and processing (e.g. emotion detection) often makes use of datasets containing large numbers of facial images. These are usually anonymised, so that the only other information present is a file name such as 00001.jpg. How does the GDPR affect the storage, processing, and sharing of such data?

For the sake of argument, and to clarify the position, I would like to set aside the provisions within the GDPR that allow processing for scientific research. I am also working under the assumption that there is no other metadata, such as a spreadsheet linking image filenames to the people depicted.

GDPR makes clear (Recital 26) that anonymous data is not covered by the regulations:

“… The principles of data protection should [therefore] not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.”

Data protection considerations would be significantly reduced if these datasets were considered out of scope of the GDPR. Is it possible, however, for a face image in isolation to be considered truly anonymised? It would always be feasible to link it to a real person, because most people are capable of recognising faces. Given the image, and the person standing in a crowd, the average viewer could likely identify the depicted person.

4Oh4

4 Answers


Your data is not anonymous, since the individual can be identified from the picture of their face.

It would be anonymous if the face were blurred and other potentially identifying information removed. Of course, that would defeat your purpose. Please note that, in any case, anonymisation techniques are themselves a form of personal-data processing that requires a legal ground, and achieving real anonymisation is not a trivial matter (see the Article 29 Working Party's Opinion 0829/14/EN WP216 on the subject).
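
To make this concrete, here is a minimal sketch of face blurring, assuming OpenCV and its bundled Haar cascade; the helper name blur_faces is my own. As WP216 stresses, blurring alone does not guarantee robust anonymisation:

```python
# Minimal face-blurring sketch (assumes opencv-python is installed).
import cv2

def blur_faces(in_path: str, out_path: str) -> None:
    image = cv2.imread(in_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Heavy Gaussian blur over the detected face region (kernel must be odd).
        image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    # cv2.imwrite writes raw pixels without EXIF, removing another
    # potential identification vector.
    cv2.imwrite(out_path, image)
```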

1.

The face of a person contains biometric information, which Article 4(14) defines, among the other types of personal data regulated by the GDPR, as: "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data".

The very purpose of facial recognition software is to perform specific technical processing of a person's facial features in order to achieve the unique identification of that person from those biometric features.
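
As an illustration of what such "specific technical processing" looks like in practice, here is a hedged sketch using the open-source face_recognition library (the filenames are hypothetical, and each image is assumed to contain exactly one detectable face):

```python
# Sketch of biometric verification: derive a 128-dimensional encoding from
# each facial image and compare encodings to confirm whether two images
# show the same natural person.
import face_recognition

known = face_recognition.load_image_file("00001.jpg")
candidate = face_recognition.load_image_file("unknown.jpg")

known_enc = face_recognition.face_encodings(known)[0]
candidate_enc = face_recognition.face_encodings(candidate)[0]

# True if the encodings fall within the library's default distance
# tolerance (0.6), i.e. it judges them to depict the same person.
match = face_recognition.compare_faces([known_enc], candidate_enc)[0]
print("Same person:", match)
```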

Article 9 of the GDPR lists biometric data among the special categories whose processing is prohibited, unless one of the exceptions in paragraph 2 applies.

There are ten exceptions, including: consent of the data subject, the employment context, personal data made public by the data subject, and scientific or historical research (each exception has its own conditions).

If your application uses facial images for the unique identification of a person based on these biometric features, you should check that you comply with one of the exceptions stated in Article 9(2).

2.

Conversely, if your processing is not about the unique identification of a person based on biometric features, but only about emotion recognition (which you briefly mentioned at the beginning of your post), it could be considered not to fall under the requirements of Article 9.

That would still be processing of personal information, but it would fall under the normal Article 6 requirements.
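
As a sketch of the distinction, an emotion-recognition pipeline might extract only an emotion label and retain no identity template. This assumes the open-source deepface package, whose return format has varied across versions:

```python
# Emotion recognition without unique identification: only an emotion label
# is extracted, which is the distinction the Article 9 analysis turns on.
from deepface import DeepFace

result = DeepFace.analyze(img_path="00001.jpg", actions=["emotion"])
# Recent deepface versions return a list with one entry per detected face.
print(result[0]["dominant_emotion"])
```

Note that even this pipeline still processes personal data, so the Article 6 requirements above still apply.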

Tardis

Something is personal data if there is an organisation that can link the data to a natural person, even if you cannot simply ask that organisation to do so for you. For example, an IP address is personal data, because an ISP can match an IP address with a household.

As far as I know, both Google and Facebook are able to identify the person in a picture based on other pictures of the same person. That makes a picture of a face personal data, even if all other identifying information has been stripped from it.
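
To illustrate why such linking is feasible, here is a sketch of one-to-many identification against a labelled reference collection, again assuming the face_recognition library; the reference photos and names are hypothetical (large platforms hold such collections at vast scale):

```python
# Re-identifying an "anonymous" dataset image against labelled references.
import face_recognition
import numpy as np

reference = {
    "alice": face_recognition.load_image_file("ref/alice.jpg"),
    "bob": face_recognition.load_image_file("ref/bob.jpg"),
}
names = list(reference)
encodings = [face_recognition.face_encodings(img)[0] for img in reference.values()]

# The "anonymous" image: no metadata, just pixels and a number.
probe = face_recognition.face_encodings(
    face_recognition.load_image_file("00001.jpg")
)[0]

distances = face_recognition.face_distance(encodings, probe)
best = int(np.argmin(distances))
if distances[best] < 0.6:  # the library's customary tolerance
    print(f"00001.jpg is probably {names[best]}")
```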

Because a face can reveal racial or ethnic origin, Article 9 GDPR applies:

  1. Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited.

But there are exceptions:

  2. Paragraph 1 shall not apply if one of the following applies:

    (a) the data subject has given explicit consent to the processing of those personal data for one or more specified purposes, except where Union or Member State law provide that the prohibition referred to in paragraph 1 may not be lifted by the data subject;

    [...]

Of course, this consent cannot be stored together with the image, as that would deanonymise it, which would defeat its purpose.

Maybe it would be sufficient if a trusted third party, such as the photographer, could confirm that the data subject has given consent for certain uses. You would have to make sure the image is not used in any other way, as the consent must be very specific.
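
One hypothetical way to realise this: the third party keeps a consent registry keyed by a content hash of each image, so the dataset itself ships with no consent records attached. The record format here is purely illustrative, not anything the GDPR prescribes:

```python
# Hypothetical trusted-third-party consent registry. The hash links an
# image to a consent record, not to the data subject's identity.
import hashlib

def image_fingerprint(path: str) -> str:
    """Content hash used as the registry key."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Held only by the trusted third party, never distributed with the images.
consent_registry = {
    image_fingerprint("00001.jpg"): {
        "purposes": ["emotion recognition research"],  # consent must be specific
        "given_on": "2018-05-25",
    }
}

def consent_covers(path: str, purpose: str) -> bool:
    """Check whether a given use of an image is within the recorded consent."""
    record = consent_registry.get(image_fingerprint(path))
    return record is not None and purpose in record["purposes"]
```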

Otherwise you have to look at the other exceptions in Article 9(2). If none of them applies, the processing is not possible.

wimh

Your face is pretty much the fundamental identifying feature of a person. Not only can people unconsciously recognise faces they know; algorithms are getting ever better at it and, when tied to a database, "know" far more people than any individual does.

While we don’t yet know how the courts will deal with this, I think it is naive to believe photos of individuals will not be captured by GDPR.

Dale M

While a court could narrow the application of "identifiable," a prudent person would, at this early stage, interpret the term broadly. "Anonymous information" is

information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.

Unless the image is altered so the person it depicts is no longer identifiable, the image is not "anonymous information."

Another answer asserts that "identifiable" implies "by an organization," but I see no basis for that in the regulation.

phoog