
AI risks for employers and employees

AI-powered transcription tools are increasingly used in the workplace. These products can join video conferences on platforms such as Zoom, Microsoft Teams, or Google Meet, record and transcribe conversations in real time, and synchronise with calendars and other applications. They are marketed as productivity enhancers; however, their deployment raises significant data protection and AI governance risks for both employees and employers.

This article takes Otter.ai as a case study, drawing on EU data protection law and the AI Act. It also refers to the US class action lawsuit against Otter.ai to illustrate that these risks are not theoretical but are already generating litigation that can affect many organisations.

According to Otter.ai’s documentation, the service can join online meetings as a participant and provide live transcription. It automatically synchronises with Microsoft Outlook or Google calendars and can start recording without any action by the user. Crucially, Otter.ai places responsibility on the account holder to obtain permission from other participants.

This means that one individual may trigger the recording or transcription of a meeting without the knowledge or consent of others. Importantly, all recorded data is transferred to, stored, and processed on servers in the United States.

Several provisions of the General Data Protection Regulation (GDPR) are directly engaged:

Legal basis: Otter.ai’s operating model relies on one participant securing permission for all others. Under Articles 6 and 7 GDPR, this is not valid consent. Consent must be informed, specific, and freely given, which cannot be achieved by delegation to a single meeting participant. EDPB guidance further stresses that in the employment context, the imbalance of power generally renders employee consent invalid.

Processing special categories of data: meetings often involve trade union matters, HR issues, or health information. Processing such data is prohibited under Article 9 GDPR unless a narrow exemption applies.

Transparency: Articles 13 and 14 GDPR require data subjects to be informed. A ‘silent’ transcription bot makes this impossible in practice.

International transfers: all data is transmitted to the US. Following Schrems II (C-311/18), such transfers are permissible only under the EU–US Data Privacy Framework or with supplementary safeguards. Given the sensitivity of workplace discussions, reliance on standard contractual clauses alone may not be sufficient.

Security: automatic synchronisation with calendars and meeting software gives Otter.ai broad access to organisational systems, often without the IT department being aware that such tools have been installed by individual users. Article 32 GDPR requires appropriate technical and organisational measures, which cannot be demonstrated where third-party AI tools access internal infrastructure without control.

Breach notification: if meetings were recorded or transcribed without participants’ knowledge, this may constitute a personal data breach under Articles 33 and 34 GDPR, triggering obligations to notify the supervisory authority and, in some cases, the data subjects.

Litigation risks are not confined to Europe. In August 2025, a class action complaint was filed in the US District Court for the Northern District of California (Brewer v. Otter.ai, Inc., Case No. 5:25-cv-06911). The plaintiff alleges that Otter.ai records and transcribes conversations of non-users without their knowledge or consent and uses this data to train its machine learning models.

The complaint states: “Otter does not obtain prior consent, express or otherwise, of persons who attend meetings where the Otter Notetaker is enabled, prior to Otter recording, accessing, reading, and learning the contents of conversations.” Brewer further alleges that as a non-Otter user, he had no reason to suspect that his conversational data would be retained and processed by the company.

Computerworld framed the lawsuit as part of a ‘wider reckoning’ for enterprise AI note-taking apps. The legal claims include violations of the Electronic Communications Privacy Act, the California Invasion of Privacy Act, and the Computer Fraud and Abuse Act, as well as common law privacy torts. Although these statutes differ from the GDPR, the factual allegations mirror the same concerns: lack of a valid legal basis, improper reliance on third-party consent, and opaque use of data for AI training.

For EU workplaces, this case illustrates the litigation exposure that arises when consumer-grade AI tools are deployed without robust governance. Under Article 82 GDPR, any data subject who suffers material or non-material damage has the right to compensation. Silent transcription of workplace meetings could easily generate such claims.

The AI Act classifies AI systems used for worker management and monitoring as high-risk (Article 6 and Annex III). Otter.ai advertises ‘sentiment analytics’ and other productivity features; deployed in a workplace setting, such features would presumably fall into the high-risk category.

Under Articles 9–15 AI Act, such systems will be subject to strict risk management, transparency, and human oversight obligations. Organisations that deploy them will carry compliance responsibilities even when the provider is established outside the EU.


Beyond the legal analysis, several practical risks are apparent:

Surveillance: automatic transcription creates a record of every utterance. For employees, this is indistinguishable from constant monitoring, a concern that research on workplace surveillance has already highlighted (European Parliament Think Tank 2020).

Accuracy and bias: errors in AI transcription can distort meaning, particularly for non-native speakers or those with speech impairments. Sponholz et al. (2025) and Eftekhari et al. (2024) demonstrate how mis-transcription introduces bias in research, which is equally damaging in workplace decision-making.

Security and misuse: transcripts and recordings are stored in multiple locations, sometimes accessible to third parties. Once produced, such records may be repurposed or misused, including in litigation.

Accountability: managing, editing, and validating transcripts requires additional resources. It also raises questions about responsibility for the accuracy of the record and the consequences of errors.

For EU organisations, the following governance approach is advisable:

Rely on built-in enterprise tools only where a Data Protection Impact Assessment (DPIA) supports their use.

Adopt an internal policy defining which recording and transcription services may be used and when they are necessary. Recording should take place only with prior notice and explicit consent from all participants. Extend the policy to external meetings and seminars: participants must be informed and given the right to opt out.

Block consumer-grade transcription tools such as Otter.ai from connecting to internal systems.

Restrict access, limit retention periods, and ensure secure and effective deletion.

Clearly prohibit the use of external transcription services without DPO and IT approval.

Consult worker representatives and trade unions before introducing such technology, consistent with data protection by design under Article 25 GDPR and the AI Act’s emphasis on human oversight and worker information.

Otter.ai demonstrates how easily consumer-grade AI tools can enter workplaces. Its features promise efficiency, but in practice they carry risks of non-compliance with the GDPR, high-risk classification under the AI Act, and significant organisational exposure. The Brewer v. Otter.ai litigation shows that these risks are not speculative but already materialising in court.

As the European Data Protection Supervisor noted in its Orientations for Generative AI (2024), public and private entities must “place compliance and fundamental rights at the centre of digital innovation.” Transcription and notetaking tools are no exception.