Part 1: What should be considered under data privacy law with voice assistants?
Introduction
When should you involve legal in voice assistant projects?
A short checklist for voice product managers
In the early stage of your project, you should answer a few basic questions before starting discussions with the managers responsible for legal and data protection.
- What is the goal of the chatbot or voice assistant?
  - There are various methods to find this out, such as mapping the customer journey, sketching scenarios, and thinking about the pains and needs of your customers.
  - What does the current process for your specific use case look like? Think through this process and identify where a bot can best address the problems you currently encounter. After that, you can go into a deep dive with conversational design and user story definitions.
  - Finally, you should end up with a list of goals you want to achieve with the bot.
- Which systems are needed?
  - During the requirements phase, clarify which systems are needed. Note down every knowledge base, database or system that might interface with your chatbot, e.g. a CRM system.
- What kind of data is saved in which system?
  - Once you have sketched the system architecture, write down where you collect which kind of information about your users (see the data-inventory sketch after this checklist).
  - Also note whether any personal data is stored somewhere. And: do you authenticate your users (i.e. ask for login data)?
  - Verify with your project team how long the data needs to be stored.
- On which platforms do you deploy the chatbot? Is it a Facebook, Slack or MS Teams bot, or does the assistant run on smart speakers or your website? Depending on the platform, check the platform-specific guidelines with your developers.
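
One lightweight way to capture the answers to the questions about systems, data and retention is a small, machine-readable data inventory that legal can review early. The following Python sketch is only an illustration; the system names, data categories and retention periods are hypothetical placeholders, not recommendations.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """One entry in the chatbot's data inventory."""
    system: str               # e.g. chat widget, CRM, analytics store
    data_categories: list[str]  # what is stored there
    personal_data: bool       # does it contain personal data?
    authenticated: bool       # is the user logged in when this data is collected?
    retention_days: int       # how long the project team needs the data

# Hypothetical inventory for illustration only -- adapt to your own architecture.
inventory = [
    DataFlow("Website chat widget", ["chat transcripts"], True, False, 30),
    DataFlow("CRM system", ["name", "email", "ticket history"], True, True, 365),
    DataFlow("Analytics store", ["anonymised usage metrics"], False, False, 730),
]

# Flag every flow that stores personal data, so legal can review it early.
for flow in inventory:
    if flow.personal_data:
        print(f"Review with legal: {flow.system} keeps {flow.data_categories} "
              f"for {flow.retention_days} days")
```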
What has to be considered in terms of data privacy law (GDPR) in chatbot and voice assistant projects?
General data protection law principles
Two general obligations deserve particular attention:
- (i) the implementation of appropriate technical and organisational security measures (Art. 32 GDPR) to protect personal data against accidental or unlawful destruction or loss, alteration, unauthorised disclosure or access, as well as
- (ii) conducting a data protection impact assessment according to Art. 35 GDPR to evaluate and, where necessary, mitigate the risks associated with the application in question.
Information obligation and having a legal basis for processing in particular
Which role do you have: Controller or Processor?
- The controller is the person who determines the purpose and means of a data processing activity (Art. 5(j) FADP).
- The processor processes personal data on behalf of the controller and not for its own purposes. In principle, the controller is responsible for compliance with most of the obligations that the GDPR or the FADP set forth.
- In the particular scenario where two or more controllers jointly determine the purposes and means of the processing of personal data, they are joint controllers. Joint controllers must, by means of an "arrangement" between them, apportion data protection compliance responsibilities (e.g., it should be agreed which controller shall be responsible for providing clear information to data subjects).
Legal checklist for voice assistants and chatbots
- Clarify (i) which kind of data (e.g. non-personal data, personal data, sensitive personal data) is processed by chatbots and digital assistants and (ii) how the flow of such data is structured (e.g. by whom and where is such data processed).
- Make sure you have a proper legal basis for the processing, or consider implementing anonymisation or pseudonymisation procedures in order to exclude (at least in part) the applicability of data protection law (a minimal pseudonymisation sketch follows this checklist).
- Provide the users in advance with clear information about the associated processing of personal data (e.g. privacy policy).
- Before using chatbots and digital assistants, perform a data protection impact assessment ("DPIA") and consider the additional legal requirements that apply in the event of profiling.
- Verify which role you have from a data protection law point of view in a specific digital assistant project, and make sure to conclude the appropriate agreements (e.g. data processing agreement, joint-controller agreement, controller-controller agreement) with all parties that process personal data in the course of a chatbot or voice assistant project.
- Comply with the additional legal requirements if personal data processed by digital assistants is transferred to a destination country which does not provide an adequate level of data protection.
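
To illustrate the pseudonymisation point from the checklist above: direct identifiers can, for example, be replaced by keyed hashes before transcripts are stored, with the key kept separately from the data. Whether this is sufficient to reduce or exclude the applicability of data protection law in your case is a legal question; the sketch below uses hypothetical field names and is not a complete solution.

```python
import hmac
import hashlib

# The key must be stored separately from the pseudonymised data
# (e.g. in a secrets manager), otherwise re-identification stays trivial.
PSEUDONYMISATION_KEY = b"replace-with-a-secret-key"

def pseudonymise_user_id(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonym)."""
    return hmac.new(PSEUDONYMISATION_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def scrub_message(message: dict) -> dict:
    """Return a copy of a chat message with the user id pseudonymised.
    'user_id' and 'text' are hypothetical field names used for illustration."""
    return {**message, "user_id": pseudonymise_user_id(message["user_id"])}

# Example: the stored record no longer contains the raw identifier.
print(scrub_message({"user_id": "alice@example.com", "text": "What is my balance?"}))
```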

Dr. Adrian Bieri
Partner at Bratschi Ltd., Head of PG Intellectual Property, Technology and Privacy


