20 January 2025

Voicebots and chatbots as a tool for interacting with employees and the requirements of personal data regulations

Chatbots and voicebots are increasingly being used for interactions between employers and employees. They help employers meet their obligations to employees, for example by serving as a tool for providing information about the employer's procedures or about how to carry out certain tasks. They also help in managing employment, serving as a communication tool for basic HR matters.

An important feature of how these applications operate is that, when interacting with employees, they collect and process information which the employees provide. This information may include, depending on how a given bot operates and its purpose:

  • information entered/spoken by the employee (such as questions or messages)
  • data on interaction time
  • technical data, such as information about the equipment used by the employee.

The information referred to above constitutes the personal data of employees that will be processed by the employer acting, in principle, as the controller of the data.

Basis for processing personal data

In order to be lawful, any processing of personal data must be carried out on one of the grounds for processing that is specified:

  • in Article 6 par. 1 of the GDPR – for “ordinary” data,
  • in Article 9 par. 2 of the GDPR – for special categories of data (e.g. regarding health).

It is no different with the employees’ personal data which the employer processes while using voicebots or chatbots. No problems should arise in identifying a basis for the processing of “ordinary” data – most often this will be the employer’s “legitimate interest” (Article 6 par. 1(f) of the GDPR). If the processing is to be based on this ground, the employer should carry out and document (particularly with a view to any inspection) a “balancing test” setting out the justification for the employer’s view that it may carry out the processing on this ground.

For special categories of data, such as data concerning health, it may be more difficult to identify a basis for the processing. Nevertheless, as long as the processing is carried out in compliance with the relevant requirements of employment law, for example where it is necessary for the employer or the employee to exercise rights or perform obligations under employment law, it should in principle be feasible to identify a relevant basis.

Significantly, for the use of bots to be lawful and to have a proper basis, the purpose of their use must fall within the range of purposes for which the employer is permitted to process data relating to employees. In other words, the purposes of the bots’ operation must be defined in such a way that they comply with the provisions of the Labour Code.

Minimisation of data

One of the basic principles of the GDPR is data minimisation, under which personal data undergoing processing must be adequate, relevant and limited to what is necessary for the purposes for which they are processed.

The principle of minimisation should be taken into account when putting into effect solutions such as voicebots and chatbots. Bots should be designed and should operate in such a way that the scope and manner in which they process employees’ personal data is limited to what is necessary for the purposes of the bots’ use. At the same time, the purposes of the bots must fit within the purposes of processing employee data that the Labour Code permits for the employer.

The principle of minimisation may be met, in particular, by ensuring that data entered by an employee which is not relevant to the purposes of the bot is either not stored by the bot at all or is deleted immediately.
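By way of illustration only, the “do not store what is not needed” approach described above could be sketched as a pre-storage filter that masks incidental personal data before a message is logged. The patterns below (e-mail addresses and phone numbers) and the helper names are assumptions chosen for the example, not a description of how any particular product works.

```python
import re

# Hypothetical patterns for data an employee might volunteer incidentally
# and which is irrelevant to the bot's purpose. Real deployments would
# tailor these to the bot's actual purpose and data-minimisation analysis.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d \-]{7,}\d")

def redact(message: str) -> str:
    """Return the message with incidental personal data masked."""
    message = EMAIL.sub("[redacted e-mail]", message)
    message = PHONE.sub("[redacted phone]", message)
    return message

def store(log: list[str], message: str) -> None:
    """Persist only the redacted form of the employee's input."""
    log.append(redact(message))
```

In this sketch the irrelevant data never reaches storage at all, which is generally easier to defend under the minimisation principle than storing first and deleting later.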

Obligation to inform

As the controller of the employees’ data, the employer must provide the employees with information on the processing of their personal data, meeting the requirements of Articles 13 and 14 of GDPR. The fact that personal data will be processed by voicebots and chatbots should be adequately addressed in the information clauses (privacy policies, etc.) on the processing of personal data, which the employer provides to employees.

It is also advisable to provide a “first layer” of the information clause (with details of the data controller, the purposes of the processing and the employees’ rights under the GDPR) in a form that is accessible to employees when interacting with the bot in question (for example, visible next to the interface for exchanging text messages with the chatbot).

Security of processing

In view of its obligations under the GDPR to ensure an adequate level of security for the processing of personal data, the employer should put in place appropriate organisational and technical data protection measures for the voicebots and chatbots. Pursuant to Article 32 par. 1 of the GDPR, the employer should determine those measures taking into account the state of the art, the costs of implementation, the nature, scope, context and purposes of the processing, and the risks, of varying likelihood and severity, to employees’ rights and freedoms, so as to ensure a level of security appropriate to the identified risks.

It is advisable for the employer to carry out and document a data protection risk analysis in this respect, to demonstrate that the data protection measures meet the requirements of Article 32 par. 1 of the GDPR.

Participation of external entities

In practice, voicebots and chatbots are commonly provided by external suppliers. Those suppliers most often have access, to a greater or lesser extent, to the data processed by their applications, such as for providing maintenance.

If the suppliers are to have access to personal data processed by the applications they provide, the employer must (as the data controller):

  • verify (for example, through a questionnaire) that the supplier provides sufficient guarantees to implement appropriate technical and organisational measures so that the processing meets the requirements of the GDPR and protects the rights of data subjects,
  • enter into a data processing agreement with the supplier (as a processor) that meets the requirements of Article 28 of GDPR.

The employer should also ensure that the third-party supplier’s application complies with data protection legislation, namely that it does not request personal data from employees that the employer should not be processing (the involvement of a third-party supplier does not, by itself, exempt the employer, the data controller, from liability under GDPR).

DPIA

In accordance with the GDPR, if a given type of processing, in particular one using new technologies, is likely – due to its nature, scope, context and purposes – to result in a high risk to the rights or freedoms of individuals, the controller should, before starting the processing, carry out a DPIA (data protection impact assessment) of the intended processing operations.

It is advisable that before launching any services involving voicebots or chatbots, the employer should perform and document an analysis of whether the planned data processing using such solutions requires a DPIA. If so, the employer should, in addition, perform and document a DPIA and implement the recommendations following from the DPIA.

Automated decision-making

When implementing applications such as voicebots or chatbots, additional consideration should be given to the automated effects these solutions may have on employees. The use of voicebots and chatbots should not, in principle, lead to decisions concerning employees being taken based solely on automated processing by such applications, including profiling, where those decisions produce legal effects or similarly significantly affect the employee (such as a disciplinary penalty being issued automatically on the bot’s recommendation).

Employee monitoring

The employer may want to use the data collected by voicebots and chatbots to carry out other forms of employee monitoring within the meaning of Article 22³ § 4 of the Labour Code (this applies both to monitoring the content of messages sent by employees and to monitoring metadata). If the employer wishes to use the data acquired by voicebots and chatbots for this purpose, it must regulate this adequately in its internal documents on employee monitoring.

***

As may be noted, the use of relatively unassuming tools such as voicebots and chatbots may involve a range of legal implications. The implementation of such applications should therefore be preceded by an analysis to identify possible legal requirements that should be met before those tools are implemented, particularly if they are to process large amounts of data or any “sensitive” data.