On Friday I became aware of otter.ai when it was used in one of my online interviews. I'm not seeking legal advice about my specific situation, but it has certainly raised some big questions for me about the legal implications of using AI tools in this way in general.

There are a number of problems I see here:

  1. I never consented to this. I saw the bot, but I didn't know what it was and had no time to research it because, well, it was an interview.
  2. To later access the transcript it generated (which I didn't know it was generating at the time), I had to give the app access to my personal calendar.
  3. It misquotes me at numerous points during the interview, including getting my previous employer's name wrong, so this is not just subjective on my part.
  4. It contains at least one screenshot of me, now clearly stored on their servers. I stopped reviewing the transcript at that point, so I don't know just how much of my conversation it retained; I just wanted the app out of my calendar.

I find myself very upset about this, particularly as I was discussing detailed parts of my past work for other companies, which are now both misquoted and presumably stored on a third-party server.

I don't believe for one second that this was malicious, but when I look at this now: does this contravene the GDPR, and is it a potential legal issue that companies could innocently stumble into?

roganjosh

3 Answers

High-level overview: there may be GDPR violations here. The relevant issues are:

  • Was your consent informed?
  • Was access to your calendar needed?
  • Are you allowed to have the incorrect information rectified?
  • Was the purpose of the information that they are storing clearly explained (including the screenshot(s) that you mention)?

It is important to note that, post-Brexit, the EU GDPR has been retained in UK law (as the "UK GDPR"), slightly modified and supplemented by the Data Protection Act 2018. When I refer to the GDPR below, I am referring to the UK GDPR.

The company itself can be held responsible for GDPR breaches if they have not ensured that the AI tool is GDPR compliant. They may also be in breach if they have not informed those being interviewed and acquired their consent. On the subject of sensitive data, which you allude to providing, the company may be required to conduct a Data Protection Impact Assessment (DPIA). Below I have addressed your specific concerns:

Consent: Under GDPR, consent to the processing of personal data must be freely given, informed, specific, and unambiguous. While it is true that you provided consent to the processing of the disclosed information in your interview, it could be argued that this consent was not informed as you were unaware of their use of Otter.ai (see Article 7 and Recital 32 of the GDPR for more information).

In 2020, the Norwegian Consumer Council and NOYB filed complaints against Grindr for sharing users' personal data with third parties without valid consent. The Norwegian Data Protection Authority fined Grindr €6.5 million.

Access to your calendar: Here the issue is no longer one of consent but of necessity. The GDPR mandates that organisations collect only the data necessary for their purposes. If they do not have a good reason for needing calendar access, then there is an argument to be made that a breach has occurred here (see Article 5(1)(c) of the GDPR for an explanation of the Data Minimisation Principle).

Explicitly, Article 5(1)(c) defines the Data Minimisation Principle in the following way: "Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (data minimisation)".

Accuracy: Any record of an interview carries a possibility of error, whether AI-generated or the result of human mistake. The GDPR does include a right to rectification, which means that you should be allowed to request corrections to inaccurate data (under Article 16 of the GDPR).

The Data Protection Act 2018 states that personal information must be "incorrect or misleading as to any matter of fact" for this aspect of the UK GDPR to apply. The factual requirement is particularly important here: a transcript that misstates what you said, such as your previous employer's name, is a factual inaccuracy rather than a matter of opinion.

Data Storage: GDPR requires that you are informed of a) what data is stored, b) for how long it will be stored, and c) the purpose behind storing the data (see Article 13 and Article 14 of the GDPR for data obtained directly from you and indirectly by Otter.ai respectively). Additionally, storage on third-party servers requires adherence to data transfer rules. The exact nature of these rules depends on the country in which those servers are located. For example, safeguards such as Standard Contractual Clauses (SCCs) may apply if the servers are in the US.

There have been numerous high-profile cases of GDPR violations relating to issues of transparency. In 2019, France's data protection authority, CNIL, fined Google €50 million for insufficient transparency and lack of valid consent in personal data processing for ad personalisation. In 2020, H&M faced a €35.3 million fine in Germany for unlawfully surveilling employees. In 2020, the UK's Information Commissioner's Office (ICO) found that Experian had violated GDPR by processing personal data without sufficient transparency. Experian collected data from various sources and used it for marketing purposes without adequately informing individuals.

FD_bfa

The detail in the description of the circumstances is insufficient to be definitive about whether this organisation behaved lawfully or unlawfully.

Certainly there are "personal data" within the meaning of GDPR; it seems there is "special category" data too (the image of the interviewee's face); there is "transfer" to a third party (otter.ai), and (I think) to another country (the USA, where otter.ai is based). Therefore the UK GDPR and Data Protection Act 2018 apply, in which there are many rules.

What is not evident in the description is whether the organisation was forthcoming and transparent about its collection and processing of personal data, the purposes for which this is done, and the "lawful basis" (i.e. legal reason) for each purpose. If it wasn't - if the interviewee could not be said to have been informed of any of this - then on the face of it the organisation behaved unlawfully.

The interviewee - the "data subject" - has a "right to be informed" (GDPR Article 13) about all of that and more. The bare minimum is to provide a privacy policy or privacy notice. In some circumstances, particularly the more sensitive the data or activity, the organisation ought to be more proactive about bringing this information to the attention of the data subject at the relevant time.

This seems to be the first item to address.

Quoting from GDPR Article 12:

The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. The information shall be provided in writing, or by other means, including, where appropriate, by electronic means. When requested by the data subject, the information may be provided orally, provided that the identity of the data subject is proven by other means. ...

Quoting from Article 13(1):

  1. Where personal data relating to a data subject are collected from the data subject, the controller shall, at the time when personal data are obtained, provide the data subject with all of the following information: ...

(c) the purposes of the processing for which the personal data are intended

Two key points about purposes (Article 5 Principles relating to processing of personal data):

The personal data must be "collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes" and be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’)". That is to say, the organisation must tell you specifically what the purpose is, and the collection and processing must be proportionate to that purpose: the system is disproportionate if the purpose can reasonably be achieved with less.

A real UK case involved a leisure centre operator that the Information Commissioner's Office deemed to have gone overboard in using facial recognition technology (FRT) and fingerprint scanning to monitor employee attendance, because the company wouldn't offer an alternative system to employees who were not comfortable with this and couldn't demonstrate that it could not achieve the same purpose by using ordinary ID cards.

...as well as the legal basis for the processing;

There are six lawful bases (Article 6); "consent" is but one of these, although the others do not seem relevant or appropriate in the circumstances described.

If the lawful basis is "consent", it must be a "freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her" (Article 4(11)). The organisation must be able to show that the data subject consented to this processing. (Article 7 conditions for consent.)

(d) where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party;

Another answer mentions the lawful basis of "legitimate interest". This paragraph (d) means that if the organisation relies on legitimate interest, it must inform the data subject what specifically that interest is; it can't simply say "we have a legitimate interest". Furthermore, the implication is that this interest must outweigh the interests of the data subject, which is, or ought to be, a high bar to reach.

(e) the recipients or categories of recipients of the personal data, if any;

(f) where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of relevant adequacy regulations under section 17A of the 2018 Act, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.

Transfers have a whole Chapter dedicated to them, Chapter V. Again the organisation must tell the data subject about all of this ahead of time. "We use this software, it is made by company X, which stores the data in country Y" etc.

The special category data - the facial image - deserves a mention of its own because it has its own conditions for processing (Article 9). Again the only relevant condition seems to be consent, unless the image was "manifestly made public by the data subject" (e.g. it is a LinkedIn profile photo).

Lag

The precise details of how this tool works may be a bit of a red herring. Ultimately, a business will inevitably process a lot of very personal data over the course of an application, and this is just one of the tools they use. Some businesses might still do everything with offline MS Office files, but any cloud-oriented tool stack or modern Applicant Tracking System (ATS) will involve multiple tools and data hosted on third-party servers, many of them in the US, and will face a lot of legal issues. Otter does seem to have a rather poor culture when it comes to privacy, but don't assume the stuff you don't see is any better (ATS in particular seem just as error-prone when parsing and filtering resumes)!

From a data protection point of view, the business you interviewed with may have collected consent when you first submitted your application, or could plausibly invoke another basis for processing your data, including Article 6(1)(b) ("taking steps at the request of the data subject prior to entering into a contract") or Article 6(1)(f) (legitimate interest). As the hiring process necessarily involves processing a lot of personal data, they definitely need to spell out that basis in writing. This would be equally true if they didn't use otter.ai and merely kept resumes on a cloud drive, but it's not obvious that they should ask for consent separately for each tool or refrain from transcribing interviews as part of that process.

Of course, their obligations do not stop there: they need to keep track of all the data they collected about you, take steps to minimize and secure it, allow you to correct mistakes, respond to data access requests, etc. It has now become common to receive notifications from businesses asking for permission to retain application data, usually about one year after the original application. If you decline to consent to any further retention, the transcript then ought to be deleted with everything else, except perhaps some bare-bones information they think they can retain longer.

Regarding Otter specifically, they recently added some "assistant" features (like an automated summary) and started talking even more about "AI" in their marketing material, but at its core it's a transcription tool that has been around for years. Transcripts are not completely new, and it seems this would be no more (and no less) problematic than a "dumb" recording or your interviewer's notes. Even resumes (which, in some countries, tend to include pictures) or annotations from recruiters are far from innocuous. In other words, this is not a legal grey area: any business processing personal data from job applicants has a number of obligations under the GDPR and hopefully reviews its processes and vendors with this in mind.

Some of these obligations may be harder to satisfy while using otter.ai (like many businesses, their website name-checks the GDPR but provides very little specific information, so I would take that with a grain of salt), and it's unclear whether this potential employer is really prepared for them, but you can't really know without checking their privacy policy or going through the deletion process.

Relaxed