
Chatbots compliant with GDPR

Krzysztof Grabowski
7/2/2019
Virtual assistants are an increasingly common solution in the area of customer service. According to the Global Market Insights report, the global chatbot market is growing by more than 30% annually and will reach a value of USD 1.34 billion by 2024[1]. Companies using this form of communication with clients must ensure that personal data are processed in accordance with the GDPR.

Chatbots are most often used by banks, insurance companies, retail chains, the e-commerce sector, airlines, hotels, health service providers and restaurant chains. According to Gartner[2], by 2020 as much as 85% of customer contacts with brands will involve artificial intelligence. Companies that engage robots in customer service significantly shorten process times, so they can provide faster and cheaper support.

Good morning, this is a chatbot

The easiest way to come across a customer support automation system is via a hotline or a company's website, where virtual assistants answer questions about services and products or process complaints. The chatbot owner collects information about each user who has interacted with it.

Under the GDPR, companies must identify the personal data collected during a conversation with the robot. These may include identification data such as name, surname and contact details; financial information such as payment details; or IT data such as location, IP address and cookies.
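
What such a data inventory might look like in practice can be sketched in code. The following Python fragment is purely illustrative (all names and fields are hypothetical): it maps each field a complaint-handling chatbot collects to one of the data categories mentioned above, together with its purpose.

```python
from dataclasses import dataclass
from enum import Enum

class DataCategory(Enum):
    IDENTIFICATION = "identification"  # name, surname, contact details
    FINANCIAL = "financial"            # payment details
    TECHNICAL = "technical"            # location, IP address, cookies

@dataclass(frozen=True)
class CollectedField:
    name: str
    category: DataCategory
    purpose: str  # why the chatbot needs this field

# Hypothetical inventory for a complaint-handling chatbot
DATA_INVENTORY = [
    CollectedField("full_name", DataCategory.IDENTIFICATION, "complaint handling"),
    CollectedField("payment_reference", DataCategory.FINANCIAL, "refund processing"),
    CollectedField("ip_address", DataCategory.TECHNICAL, "session security"),
]
```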

In practice, the scope of collected personal data may be much wider. It will depend on the purpose and specificity of the process within which the application is used.

How to adapt the process to the GDPR?

Both the entities that create and those that use technologies based on artificial intelligence are obliged to ensure their full compliance with the GDPR.

The controller of the application should take the chatbot user's interests into account so that the intrusion into privacy is not excessive in relation to the purpose pursued, in line with the prohibition on collecting data "in stock", for undefined future needs. This should be considered when designing the application and when justifying the scope of the data collected and the length of time it is processed.
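
One way to make this concrete at design time is to tie every stored field to a declared purpose and a fixed retention period, so that nothing is collected "in stock". A minimal sketch, assuming a hypothetical configuration format:

```python
from datetime import timedelta

# Hypothetical privacy-by-design configuration: each field has a declared
# purpose and a retention period fixed before the chatbot goes live.
RETENTION_POLICY = {
    "full_name":  {"purpose": "complaint handling", "keep_for": timedelta(days=365)},
    "email":      {"purpose": "replying to the user", "keep_for": timedelta(days=365)},
    "ip_address": {"purpose": "session security", "keep_for": timedelta(days=30)},
}

def may_store(field_name: str) -> bool:
    """Refuse to store any field that has no declared purpose and retention."""
    return field_name in RETENTION_POLICY
```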

The user should be informed about the purpose of data collection at the beginning of the process. It is necessary to precisely define the scope of the data and the appropriate legal basis for its processing. The algorithms used must not only ensure that data are processed correctly, but also exclude automated decisions that could be discriminatory or otherwise unfavourable to certain groups of users.
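
In a chatbot, this information duty can be discharged in the very first message of the conversation. A simple sketch (the wording and the list of legal bases are illustrative, not a template approved by any authority):

```python
LEGAL_BASES = {"consent", "contract", "legal obligation", "legitimate interest"}

def opening_notice(controller: str, purpose: str, legal_basis: str) -> str:
    """Build the message shown before any personal data is requested."""
    if legal_basis not in LEGAL_BASES:
        raise ValueError(f"unknown legal basis: {legal_basis}")
    return (
        f"Hello! I am a chatbot operated by {controller}. "
        f"I will process your personal data for the purpose of {purpose}, "
        f"based on {legal_basis}. You can ask me at any time "
        "what data I hold about you and how to exercise your rights."
    )
```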

Transparent communication

The chatbot user should be given full, clear and comprehensible information about the purpose, scope and context of the processing of his or her data, and about how it can be controlled. Controllers using artificial intelligence models in the customer service process should disclose as much information as possible about the operation of the algorithm to the person whose data they are processing. Each user should be aware of what the result of the data processing may be, why a decision has been made and what they can do to change it. This information should ensure that the individual can effectively exercise his or her rights[3].
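
As a sketch of what such transparency could look like in code, the fragment below (hypothetical names) packages the three elements just mentioned: the result, the main factors behind it, and the route open to the user who wants to change it.

```python
from dataclasses import dataclass

@dataclass
class DecisionExplanation:
    outcome: str             # the result of the data processing
    main_factors: list[str]  # why the decision has been made
    how_to_contest: str      # what the user can do to change it

def explain(d: DecisionExplanation) -> str:
    """Render the explanation shown to the chatbot user."""
    return (
        f"Result: {d.outcome}. "
        f"Main factors: {', '.join(d.main_factors)}. "
        f"To contest this decision: {d.how_to_contest}"
    )
```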

The controller must obtain actively expressed consent from the user to process his or her personal data.

Where chatbots provide customer service, checkboxes are a good solution: they confirm not only that consent has been obtained but also its exact scope. It is also worth creating terms of use for the application and presenting them to the user at the start of the conversation.
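
A consent record behind such a checkbox could look like the sketch below (hypothetical fields); the key points are that the box must be actively ticked and that the exact scope and time of consent are stored:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    scope: str           # e.g. "complaint handling", "marketing messages"
    granted_at: datetime

def record_consent(user_id: str, scope: str, box_ticked: bool) -> ConsentRecord:
    """Store consent only when the user has actively ticked the checkbox."""
    if not box_ticked:
        raise ValueError("consent requires an affirmative action; no pre-ticked boxes")
    return ConsentRecord(user_id, scope, datetime.now(timezone.utc))
```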

A person consenting to the processing of personal data should be able to withdraw that consent as easily as it was given. The user should be provided with simple access to the data collected through the chatbot, including the possibility to view it, download a copy (in electronic form), delete it, restrict its processing and transfer it.
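
A minimal dispatcher for these rights, over a toy in-memory store (hypothetical; a real system would need authentication, logging and persistence), might look like this:

```python
import json

def handle_rights_request(store: dict, user_id: str, request: str):
    """Toy dispatcher for the basic data-subject rights."""
    record = store.setdefault(user_id, {})
    if request == "access":
        return record                            # view the collected data
    if request == "copy":
        return json.dumps(record)                # downloadable electronic copy
    if request == "erase":
        store.pop(user_id, None)                 # delete the data
    elif request == "restrict":
        record["processing_restricted"] = True   # limit further processing
    elif request == "withdraw_consent":
        record["consent"] = None                 # as easy as giving consent
    else:
        raise ValueError(f"unknown request type: {request}")
```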

Adequate data protection

The dynamic development of the artificial intelligence used by chatbots may pose risks to data security and user privacy. It is essential that the rules for automated decision-making, including profiling, and the procedures for collecting information are properly implemented. It should be borne in mind that artificial intelligence cannot be the sole decision-maker when it comes to legal or similarly significant decisions affecting users. Persons whose data are processed must have the means to challenge decisions harmful to them. And if the human factor is to blame for a violation of user rights, the controller must be able to prove this.
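
The human-in-the-loop requirement can be enforced structurally: any decision with legal or similarly significant effect is queued for a person instead of being returned directly, and every user gets a challenge path. A sketch, with hypothetical names:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    user_id: str
    outcome: str
    has_legal_effect: bool

review_queue: list[Decision] = []

def finalize(decision: Decision) -> str:
    """AI must not be the sole decision-maker for legally significant outcomes."""
    if decision.has_legal_effect:
        review_queue.append(decision)
        return "pending human review"
    return decision.outcome

def challenge(decision: Decision) -> str:
    """Any automated decision can be contested and re-examined by a person."""
    review_queue.append(decision)
    return "challenge registered; a human will review this decision"
```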

Each organisation should carry out a risk assessment, implement a monitoring process and establish effective procedures for responding to specific types of incidents. Breaches that pose a risk to individuals should be reported to the supervisory authority within 72 hours and, where there is a high risk to the rights and freedoms of data subjects, the affected individuals should also be notified without undue delay.
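
The 72-hour clock is easy to get wrong under pressure, so it is worth computing it mechanically from the moment of detection. A small sketch (the risk classification itself is, of course, a human judgement):

```python
from datetime import datetime, timedelta, timezone

AUTHORITY_DEADLINE = timedelta(hours=72)

def authority_notification_deadline(detected_at: datetime) -> datetime:
    """Latest moment to notify the supervisory authority after detection."""
    return detected_at + AUTHORITY_DEADLINE

def must_notify_data_subjects(risk_to_individuals: str) -> bool:
    """High-risk breaches also require notifying affected users without undue delay."""
    return risk_to_individuals == "high"

# Example: a breach detected now must be reported within 72 hours
print(authority_notification_deadline(datetime.now(timezone.utc)))
```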

It is certainly worthwhile to carry out regular audits of the chatbot area. This will make it easier to determine precisely the scope of data needed to achieve the intended purposes, and it will raise the level of security of the whole process.
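
A simple automated check can support such an audit: compare the fields actually present in storage with the declared inventory, and flag anything undeclared for deletion or review. A sketch, with hypothetical structures:

```python
def audit_stored_fields(stored_records: list[dict], declared_fields: set[str]) -> set[str]:
    """Return fields found in storage that were never declared in the data inventory."""
    found: set[str] = set()
    for record in stored_records:
        found.update(record.keys())
    return found - declared_fields

# Example: "phone_number" was stored but never declared, so it is flagged
undeclared = audit_stored_fields(
    [{"full_name": "…", "phone_number": "…"}],
    {"full_name", "email", "ip_address"},
)
```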


Author:

Krzysztof Grabowski
Data Protection Officer
[email protected]

 


[1] Global Market Insights, Chatbot Market, https://www.gminsights.com/pressrelease/chatbot-market

[2] Gartner, Top 10 Strategic Technology Trends for 2018, https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2018/

[3] Article 29 Data Protection Working Party, Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, adopted on 6 February 2018; The Norwegian Data Protection Authority, Artificial Intelligence and Privacy, report, January 2018, p. 22.
