
In recent years a lot of privacy-aware analytics solutions have started popping up. These generally work on a very similar principle (see e.g. Fathom's algorithm):

  1. Create a fingerprint of the user using a hash of some combination of their IP address, User-Agent, possibly other personally identifiable information (PII), and a rotating salt. For the purpose of this question, assume that this creates an identifier that cannot be reversed back into PII.
  2. Use this identifier to determine whether the user has visited before, by storing it in a database along with the most recently visited page and time of visit.
  3. Aggregate all analytics to such a degree that no individual user can be identified.

This conveniently gets around the issue of storing PII, since all stored information is either strongly anonymized (and minimal) or aggregated.
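
For concreteness, here is a rough sketch of what step 1 might look like. The exact inputs, separator, and hash function are my own assumptions (not the actual code of Fathom, Plausible, or any other vendor), but the principle is the same:

```python
import hashlib
import secrets

# Illustrative sketch only -- not any vendor's actual implementation.
# The salt rotates (e.g. daily), so the same visitor hashes to a different
# identifier in each salt period and identifiers cannot be linked across periods.
daily_salt = secrets.token_hex(16)

def visitor_id(ip: str, user_agent: str, site: str, salt: str) -> str:
    """Step 1: hash of IP + User-Agent + site + rotating salt."""
    raw = f"{salt}|{site}|{ip}|{user_agent}".encode("utf-8")
    return hashlib.sha256(raw).hexdigest()

# Step 2: look this identifier up in a database to decide whether the visit
# is new or returning; step 3 then keeps only aggregate counts.
vid = visitor_id("203.0.113.7", "Mozilla/5.0 ...", "example.com", daily_salt)
```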

Some of these solutions advertise "No need for cookie banners!" (see e.g. Plausible's landing page), implying that the need for informed consent is bypassed by the anonymization. Is there any legal basis for this? For the purpose of this question, assume GDPR isn't relevant (since no PII is stored), and only compliance with the ePrivacy Directive is in question.


From my own reading, the simple act of accessing the PII in the first place, regardless of later anonymization, requires consent. As Article 5(3) of the Directive puts it:

Member States shall ensure that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC, inter alia, about the purposes of the processing. (emphasis mine)

As argued in a 2014 opinion on fingerprinting, this also covers manufacturer-stored information such as User-Agents. A 2014 opinion on anonymization argued that anonymization is a post-processing step, so consent is still required to obtain the relevant information in the first place. In my opinion this means that, despite the strong anonymization and despite no PII being stored, these privacy-aware analytics solutions still require a cookie banner (as much as I hate to say it).

The reason I'm asking this question regardless is that the claims of "no need for cookie banners" appear to be based on legal advice, having been checked by a legal team. Since I have no experience with law, this makes me think I am missing something: what am I misunderstanding that makes this usage exempt from the ePrivacy Directive's consent requirements?

Birjolaxew
1 Answer


First of all, in a GDPR context, the process described is not strong anonymization. While it may be hard for an outsider to go from a stored record back to any PII, it is much easier to "single out" an individual. This means that, given a known individual, one can determine whether that person is among those listed in the records, or can determine this to a significant degree of probability. For this only the algorithm and the rotating salts are needed; one need not break the hash. Note also that the GDPR specifies that if a person can be singled out with the assistance of the site operator, the data is not considered anonymized. Thus this data needs a lawful basis under the GDPR, and the various other GDPR requirements all apply.
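
To make the singling-out point concrete, here is a minimal sketch (the function names and hash construction are assumptions, mirroring the scheme described in the question): whoever holds the current salt can simply recompute a known person's identifier and test for membership.

```python
import hashlib

def candidate_id(ip: str, user_agent: str, site: str, salt: str) -> str:
    # Recompute the fingerprint exactly as the analytics script would.
    raw = f"{salt}|{site}|{ip}|{user_agent}".encode("utf-8")
    return hashlib.sha256(raw).hexdigest()

def appears_in_records(ip: str, user_agent: str, site: str,
                       salt: str, stored_ids: set) -> bool:
    # Singling out is just a membership test; the hash is never reversed.
    return candidate_id(ip, user_agent, site, salt) in stored_ids

# Given Alice's IP and User-Agent (both easy for the operator to learn),
# this reveals whether she visited during the current salt period.
```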

However, even if the data were totally anonymized, and, say, just added to a count of users with this or that User-Agent, the process of reading local data (including but not limited to cookies) itself requires informed consent, and so a cookie banner or some other interaction presenting similar information is needed under the ePrivacy Directive (EPD). The EPD, being a directive and not a regulation, must be implemented by national laws, and the exact provisions in those laws may differ somewhat from country to country. But I believe that all of them require consent before any local data is read.

David Siegel