Part 1: What should be considered under data privacy law with voice assistants?

Adrian Bieri and Nina Habicht • November 29, 2020

Introduction

Voice assistants such as Google Assistant, Alexa or Siri - also called "virtual assistants" - have been criticised in the media for their capacity to "listen" to what users say. While tech providers like Google, Apple and Amazon provide vast amounts of information in privacy policies that are difficult to read and understand, companies still lack clarity on how to proceed with a chatbot or voice assistant project.

In this first blog article, Voicetechhub and Dr. Adrian Bieri, Partner at the Swiss law firm Bratschi AG, give concrete answers to the question of what should be considered, from a technical point of view and under data privacy law, when implementing digital assistants. The article also provides a deep-dive into data protection law (in particular the GDPR) and its impact on digital assistants based on Natural Language Processing.

When to involve legal in voice assistant projects?

Before discussing with your legal team, you must have a clear idea of what problem your digital assistant should solve. In our experience, companies often do not know what they want to achieve with a chatbot or voice assistant.

A short checklist for voice product managers

During the early stage of your project, you should answer the following basic questions before starting discussions with the managers responsible for legal and data protection.


  • What is the goal of the chatbot or voice assistant?
    Various methods help here, such as mapping the customer journey, working through scenarios and thinking about the pains and needs of your customers. What does the current process for your specific use case look like? Start from this process and identify where a bot best addresses the problems you encounter. After that, you can go into a deep-dive with conversational design and user story definitions. Finally, you should have a list of goals you want to achieve with the bot.
  • Which systems are needed?
    During the requirements phase, clarify which systems are needed. Note down every knowledge base, database or system that might serve as an interface to your chatbot, e.g. a CRM system.
  • What kind of data is saved in which system?
    After sketching out the system architecture, write down where you gather which kind of information about your users. Also note whether any personal data is stored anywhere. And: do you authenticate your users (i.e. ask for login data)? Verify with your project team how long the data needs to be stored.
  • On which platforms do you deploy the chatbot?
    Is it a Facebook, Slack or MS Teams bot, or does the assistant run on smart speakers or your website? Depending on the platform, check the platform-specific guidelines with your developers.
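The inventory steps above can be sketched as a simple data structure that the project and legal teams review together. Everything in this Python sketch (system names, data categories, retention periods, field names) is an illustrative assumption, not a recommendation:

```python
# Hypothetical data inventory for a chatbot project. System names, data
# categories and retention periods below are illustrative examples only.
from dataclasses import dataclass

@dataclass
class DataSource:
    system: str                  # e.g. CRM, knowledge base, chat transcript store
    data_categories: list[str]   # what kind of information is held there
    contains_personal_data: bool # triggers data protection obligations
    retention_days: int          # agree this with your project and legal teams

inventory = [
    DataSource("CRM", ["name", "email", "order history"], True, 365),
    DataSource("chat_logs", ["free-text messages", "timestamps"], True, 90),
    DataSource("knowledge_base", ["product FAQ articles"], False, 0),
]

# Systems holding personal data must be covered in the legal review
# and in the privacy policy shown to users.
personal_systems = [s.system for s in inventory if s.contains_personal_data]
print(personal_systems)  # ['CRM', 'chat_logs']
```

A table like this makes the later legal discussion concrete: for each entry, counsel can check the legal basis, the retention period and whether the data leaves the country.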

What has to be considered in terms of data privacy law (GDPR/DSGVO) with chatbot and voice assistant projects?

General data protection law principles 

Data protection laws such as the European Union's General Data Protection Regulation ("GDPR") and the Swiss Federal Act on Data Protection ("FADP") aim to protect personal data. Personal data means any information relating to an identified or identifiable natural person. An identifiable natural person in the sense of the GDPR and the FADP is one who can be identified either directly, by reference to an identifier such as a name, an identification number, location data or an online identifier (such as an IP address or an e-mail address), or indirectly, by combining various pieces of information. Data that does not relate to a specific individual, on the other hand, usually falls outside the scope of data protection legislation.

Chatbots and other applications based on artificial intelligence ("AI") process large amounts of personal data, in particular for training purposes. Therefore, chatbots and similar AI-applications (e.g. digital assistants) regularly fall within the scope of data protection law. And if chatbots and digital assistants process biometric data such as facial images or a person's voice (e.g. for voice and face recognition purposes), the stricter data protection regulations for processing sensitive personal data (i.e. special categories of personal data) apply. It should be noted that once personal data is anonymised – which requires irreversibly removing the possibility of identifying an individual through the data in question – it no longer constitutes personal data and is therefore exempt from data protection law obligations. This option should be considered where the purpose of the processing does not require a link to an identifiable individual. Besides anonymisation, pseudonymisation can further reduce the risks of processing, although pseudonymised data remains personal data as long as the pseudonym can still be linked back to an individual.
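The distinction between the two techniques can be sketched in code. The following Python snippet is a minimal illustration, not a compliance recipe: the keyed hash, the field names and the way the secret is stored are assumptions made for the example. Pseudonymised output remains personal data as long as the key exists; anonymisation irreversibly drops the identifying fields.

```python
# Sketch: pseudonymisation via a keyed hash (HMAC) vs. anonymisation by
# dropping identifying fields. Field names and the key are illustrative.
import hashlib
import hmac

# In practice the key must be stored separately from the data it protects.
SECRET_KEY = b"illustrative-placeholder-key"

def pseudonymise(user_id: str) -> str:
    """Replace a direct identifier with a stable pseudonym (reversible in
    the sense that whoever holds the key can re-link records)."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def anonymise(record: dict) -> dict:
    """Drop identifying fields entirely (irreversible for this record)."""
    identifying = {"user_id", "email", "voice_sample"}
    return {k: v for k, v in record.items() if k not in identifying}

record = {"user_id": "u-123", "email": "a@example.com", "intent": "order_status"}
print(pseudonymise("u-123") == pseudonymise("u-123"))  # True: stable pseudonym
print(anonymise(record))  # {'intent': 'order_status'}
```

The stable pseudonym allows analytics across sessions without exposing the raw identifier, while the anonymised record can, in principle, leave the scope of data protection law because nothing in it points back to a person.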

Once it is clear that data protection law applies, the question arises as to which data protection obligations must be observed when using chatbots and other AI-applications. Data protection law imposes various obligations with respect to the processing of personal data. The central data protection obligations in connection with chatbots certainly include compliance with the information obligations under Art. 13 GDPR and ensuring that there is a legal basis for the processing of personal data according to Art. 6 GDPR. 

Other relevant obligations in connection with the processing of personal data by chatbots and other similar AI-applications include:
  • (i) the implementation of appropriate technical and organisational security measures (Art. 32 GDPR) to protect personal data against accidental or unlawful destruction or loss, alteration, unauthorised disclosure or access as well as 
  • (ii) conducting a data protection impact assessment according to Art. 35 GDPR to evaluate and, where necessary, mitigate the risks associated with the application in question. 
Further specific obligations may apply if chatbots and other AI-applications (in particular in combination with digital tracking tools) are used to create extensive user profiles: such processing activities may constitute profiling (Art. 4(4) GDPR) and can trigger the restrictions on automated individual decision-making under Art. 22 GDPR. Finally, the GDPR and also the FADP impose various documentation obligations and other duties, which must be respected when processing personal data via a chatbot or other similar AI-applications.

Information obligation and having a legal basis for processing in particular 

As previously mentioned, providing transparent information about the data processing that takes place via the chatbot, and ensuring that there is a legal basis for the processing, are paramount considerations.

According to Art. 13 GDPR, the data controller (e.g. the company who wants to provide a chatbot or a voice assistant to its customers, see also section 4.3) shall, at the time when personal data are obtained, provide the chatbot user with various information about the relevant data processing activity. This information includes, among others, the purposes of the processing for which the personal data are intended, the legal basis for the processing as well as the recipients or categories of recipients of the personal data, if any. 

In order to fulfil the information obligations according to Art. 13 GDPR, it is necessary to inform the user at the beginning of the chat, or before using the digital assistant, about the purpose and scope of processing. This must be done, in accordance with Art. 12 GDPR, in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

In the case of a chatbot, it is advisable to indicate within the chat itself, in a basic form, how personal data are processed and to provide a link to the privacy policy which contains further information. This information must also be provided before personal data is processed, ideally before the digital assistant is installed, and must be kept available via the app or website. Informing the user about the purposes of processing is feasible as long as the chatbot’s functionalities are limited, thus, making it possible to predetermine which topics will be covered by the chatbot. The situation is, however, different and more complex with regard to the variety of possible voice commands in the field of digital assistants.
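As a minimal sketch of this obligation, a chatbot could deliver a short notice with a link to the full privacy policy before the first user message is processed. The wording, the URL and the `send` callback below are hypothetical placeholders, not legal advice:

```python
# Sketch: deliver the privacy notice before any personal data is processed
# (Art. 12/13 GDPR). Wording and URL are illustrative placeholders.

PRIVACY_NOTICE = (
    "Hi! I'm a chatbot. Your messages are processed to answer your "
    "product questions. Full details: https://example.com/privacy"
)

def start_conversation(send):
    # The notice must reach the user before the first message is handled.
    send(PRIVACY_NOTICE)

# Usage: 'send' would normally push a message to the chat channel;
# here a list stands in for the channel.
messages = []
start_conversation(messages.append)
print(messages[0])
```

Keeping the in-chat notice short and linking to the policy mirrors the layered approach described above: basic information in the chat itself, details one click away.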

Art. 6 GDPR provides that processing of personal data is lawful only if, and to the extent that, a legal basis as provided under Art. 6 GDPR is applicable for a given data processing activity. These legal bases include, among others, the user's consent to the processing (Art. 6(1)(a) GDPR), or, as an alternative, the necessity of processing the data to enter into or perform a contract or contractual negotiations (Art. 6(1)(b) GDPR; contractual necessity). Moreover, Art. 6(1)(f) GDPR establishes that a processing activity necessary for the legitimate interests pursued by a data controller (or by a third party) forms a legal basis for processing, insofar as the controller's interests are not overridden by the interests, fundamental rights or freedoms of the affected data subjects. We recommend carrying out as many data processing operations as possible based on Art. 6(1)(b) GDPR, because this legal basis provides neither for the possibility of withdrawal – as in the case of consent by the user – nor for a possibility of objection.

Since many chatbots are currently still designed for a specific function (e.g. customer support) and the corresponding user input usually contains product- or service-related questions or orders from the user, it is often possible to base the data processing operation on contractual necessity according to Art. 6(1)(b) GDPR. Also, it seems feasible to argue that the purpose of data processing is aligned with the contractually-agreed provision of a functional digital assistant. However, should court case law or regulators opt against this approach, the legal basis of contractual necessity (Art. 6(1)(b) GDPR) could in many cases fail to serve as a legal basis for data processing by digital assistants. In view of the fact that the legal basis of the legitimate interest under Art. 6(1)(f) GDPR involves, depending on the specific individual case, significant legal uncertainties, the providers of digital assistants would in many instances be advised to obtain consent from the user, which conforms with the requirements under Art. 6 (1)(a) GDPR for legally valid consent. 

Moreover, if a user discloses sensitive data to a chatbot or voice assistant, Art. 9 GDPR requires the user's explicit informed consent. This requires, among other things, that the controller informs the user, before consent is given, which specific data processing activities the consent will cover. A mere voice command to activate a digital assistant, for example, is most likely not sufficient to constitute legally valid consent under Art. 6(1)(a) and Art. 9 GDPR, particularly because the controller has not provided sufficient prior information about the relevant data processing activities. In practice, however, such activation is common; chatbot providers typically address the issue by anonymising the data.
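A hedged sketch of how such a consent gate might look in code: consent is recorded together with its scope and timestamp before any processing that relies on Art. 6(1)(a) or Art. 9 GDPR, and processing is refused for purposes the user was not informed about. All names and fields are illustrative assumptions, not a reference implementation:

```python
# Sketch of a consent gate: record informed consent (scope + timestamp)
# before processing, and check it for every purpose. Names are illustrative.
from datetime import datetime, timezone

consents: dict[str, dict] = {}  # user_id -> consent record

def record_consent(user_id: str, purposes: list[str]) -> None:
    consents[user_id] = {
        "purposes": purposes,                  # what the user was informed about
        "given_at": datetime.now(timezone.utc),
        "withdrawn": False,                    # consent must stay withdrawable
    }

def withdraw_consent(user_id: str) -> None:
    if user_id in consents:
        consents[user_id]["withdrawn"] = True

def may_process(user_id: str, purpose: str) -> bool:
    c = consents.get(user_id)
    return bool(c) and not c["withdrawn"] and purpose in c["purposes"]

record_consent("u-1", ["voice_transcription", "support_requests"])
print(may_process("u-1", "voice_transcription"))  # True
print(may_process("u-1", "marketing"))            # False: not covered
```

The point of the design is that the scope check happens at processing time, so a later withdrawal or a purpose the user never consented to both block the operation.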

Which role do you have: Controller or Processor?

To determine who is responsible for fulfilling the various obligations that data protection law imposes on chatbots or voice assistants, one first has to clarify which role each of the involved parties plays from a data protection law perspective (i.e. the company that offers the chatbot or voice assistant to its customers, and the various service providers the company engages to provide it).

The primary distinction is between the controller and the processor:
  • The controller is the person who determines the purposes and means of a data processing activity (Art. 4(7) GDPR; Art. 5(j) FADP). 
  • The processor processes personal data on behalf of the controller and not for his own purposes. In principle, the controller is responsible for the compliance with most of the obligations that the GDPR or the FADP set forth. 
  • In the particular scenario where two or more controllers jointly determine the purposes and means of the processing of personal data, they are joint controllers. Joint controllers must, by means of an "arrangement" between them, apportion data protection compliance responsibilities (e.g., it should be agreed which controller shall be responsible for providing clear information to data subjects).  
With respect to digital assistants, the company that offers them to the users is considered the controller and is therefore responsible for compliance with data protection law. It cannot be excluded, however, that the company which uses the digital assistant and the company which develops and operates the digital assistant are to be considered joint controllers.

Legal checklist for voice assistants and chatbots

To summarise, we developed the following checklist, which helps companies approach voice assistant and chatbot implementations from a data protection law point of view:

  1. Clarify (i) which kind of data (e.g. non-personal data, personal data, sensitive personal data) is processed by chatbots and digital assistants and (ii) how the flow of such data is structured (e.g. by whom and where is such data processed).
  2. Make sure you have a proper legal basis for the processing, or consider implementing anonymisation or pseudonymisation procedures in order to exclude (at least in part) the applicability of data protection law. 
  3. Provide the users in advance with clear information about the associated processing of personal data (e.g. privacy policy). 
  4. Perform a data protection impact assessment ("DPIA") before using chatbots and digital assistants, and consider the additional legal requirements that apply in the event of profiling. 
  5. Verify which role from a data protection law point of view you have in a specific digital assistant project and make sure to conclude the appropriate agreements (e.g. data processing agreement, joint-controller agreements, controller-controller agreement) with all the parties that process personal data in the course of a chatbot or voice assistant project. 
  6. Comply with the additional legal requirements if personal data processed by digital assistants is transferred to a destination country which does not provide an adequate level of data protection.

Dr. Adrian Bieri

Partner at Bratschi Ltd., Head of PG Intellectual Property, Technology and Privacy

Adrian has in-depth experience in the fields of legal technology, intellectual property and privacy. He works as a lawyer and partner at Bratschi Ltd., a full-service law firm with six offices throughout Switzerland. He is highly engaged in solving new legal issues related to emerging technologies. Adrian holds a degree in law from the University of Zurich.
