Don’t Become a GDPR Headline: Practical Data Protection Strategies for HR
Published on: 20/03/2024

Emer Murphy and Andrew Desmond from A&L Goodbody LLP and Julie Holmes from Legal Island, discuss practical advice to ensure GDPR compliance in your workplace. From data protection intricacies to implementing effective safeguards, our expert-led webinar will empower you with the knowledge needed to protect sensitive data.

In this webinar, Emer, Andrew and Julie will discuss:

  • How advances in technology can create positive and negative issues for businesses
  • Key lessons and recommended actions for employers to prevent confidential information being compromised
  • How employers can implement robust policies and training for all staff

This webinar is ideal for HR professionals, legal specialists, and business leaders. 

Don't miss out on this invaluable opportunity to ensure your business stays compliant and resilient in the face of data challenges.

You can download the slides here.

Recording:

 

Transcript:

Julie:   Welcome to our webinar, "Don't Become a GDPR Headline: Practical Data Protection Strategies for HR". My name is Julie Holmes and I work in the Knowledge Team at Legal-Island. And today's guests, we have Emer Murphy and Andrew Desmond, who are both senior associates from A&L Goodbody. Good morning, Emer and Andrew. Thanks very much for being here. And I just want to thank all of you who have decided to spend some time with us this morning.

 Just want to offer my thanks as well to MCS Group, who sponsor Legal-Island's webinars and podcasts. MCS help people find careers that match their skill sets perfectly. They also support employers to build high-performing businesses by connecting them with the most talented candidates in the market. So if you want to learn a little bit more about MCS and how they can help you, then head to www.mcsgroup.jobs.

So now I'm going to provide a little bit of background information on our experts.

We have Emer, who works with employers from a range of sectors on the full suite of employment law matters. This includes policy formation, investigations, grievance and disciplinary procedures, industrial relations, dismissals and redundancies, employment equality, and equal status claims.

Now, Andrew specialises in privacy, data protection, intellectual property, and commercial contracts.

So I think after hearing that, you'll appreciate why I'm so pleased that they're both here with us today.

GDPR isn't new, but there are so many ways that it can go wrong. Emer and Andrew will talk to us about some of the common pitfalls that we all have as employers, as well as developments . . . and yes, I'm talking about artificial intelligence . . . that you also need to be aware of.

That's a lot for them to cover in this short space of time, but they prepared some great slides, which will be shared with you after the session. And as you know, you also get a recording as well.

They're also going to be able to answer your questions as well. So don't miss this opportunity to ask your questions by adding them to the questions box. And we'll try and pick up as many of those at the end of the session as we can.

Just to get into the headspace of GDPR in your workplace, we have a couple of poll questions. So Maria, our tech expert, is going to share a question with you now.

So the first question is "Does your organisation currently have an up-to-date GDPR policy?" Nice and simple to start off with. Just yes or no. So we'll give you a second to make your choice. And remember that key term, "up-to-date". Maria is just going to share the results there for us, and it's good to see that the majority of you do. So, again, Andrew and Emer will be reassured by that.

So the second question is "Do you currently provide all staff with GDPR awareness training?" Again, yes or no, so take a moment to choose your answer. Thanks very much for participating, everybody. Maria is going to show us the results for that as well.

And the last question is "Would you like more information on Legal-Island's all-staff training course?" Again, just a simple yes or no.

The reason why we're asking you that is because some of you may be interested in Legal-Island's GDPR eLearning training course. The course is tailored to provide all of your employees with comprehensive training and you with an evidence trail just in case the Data Protection Commission comes to your door should a data breach occur.

So if you want any more information specifically about the package or if you wish to get access to a free demo on behalf of your organisation, you can email Glen Bell directly. His email address is glen@legal-island.com. And we'll maybe revisit that at the end or we'll add something to the chat for you.

So, as for me, I'm now going to hand over to Andrew and Emer. If you have any questions, please pop them into the question box and I'll be back at the end to help with Q&A. But in the meantime, enjoy. You're going to get lots of information today.

All right. So thanks very much, Emer and Andrew.

Andrew:  Hi, everybody. It's great to be here. So just to kind of reintroduce ourselves, I'll be kind of discussing this from a more data protection and privacy perspective and Emer will be looking at this more from the employment law perspective. And between the two of us, we'll cover where the two areas intersect and overlap.

So if you could just take a look at the next slide to begin with, I think it'll be useful to start with an overview of the principles of data protection. So these are the fundamentals of the GDPR. The GDPR takes a principles-based approach and sets out the very essence of the compliance obligations in Article 5. That includes the seven principles that are on the screen.

So the first is that personal data must be processed lawfully and also fairly and transparently. Lawfully here means that there must be an appropriate legal basis in place for processing under Article 6. Emer is going to drill into that in a little more detail next.

But for now, another aspect of this is transparency. It's from this obligation that the idea of a data protection notice flows. So this transparency information must be presented to the individuals and data subjects about how and why their personal data is processed. Personal data must also be processed for specific purposes, and it shouldn't otherwise be processed in excess of what is necessary for that particular purpose.

From that flows the next principle, which is data minimisation. So you identify the purpose for which you're processing data, you identify the data that you need to fulfil that purpose, and you don't go beyond that in terms of processing any further data.

The personal data you process and collect must also be accurate. So if you're gathering personal data on employees or on anybody else, you have to take steps to ensure that it's accurate and up to date. You have to respond to requests for rectification or erasure if the personal data isn't accurate, and also take steps of your own to delete any data that isn't accurate.

So another aspect of this is storage limitation. This is the idea that the personal data you collect you should only keep for as long as is necessary for the purpose for which you're processing it.

And from that flows the idea of a data retention policy. So that's the idea that you determine how long you need any different type of personal data, and you erase it from your systems when you no longer need it, or else you could anonymise it, in which case it's no longer personal data.

Another of these is the principle of integrity and confidentiality. So there is a requirement to put in place appropriate technical and organisational security measures to ensure the security of the personal data processed, to keep it safe from malicious actors but also from accidental disclosure and similar incidents that could compromise the data.

And then the last principle is a kind of overarching principle, and that's the principle of accountability. So this provides that you must be able to stand over having implemented measures to comply with all those foregoing principles via documentation and you must be able to provide this documentation on request to the Data Protection Commission or to data subjects.

Emer: Thanks, Andrew. So following on from what Andrew's gone through and the broad requirements of GDPR, employers need to be cognisant of the legal basis for their collection of employee personal data. So what exactly can they gather and retain?

Employers often assume they can rely on consent for processing of employee data, but while consent is one of the most familiar bases for gathering data, it's not the most appropriate legal basis in an employment context.

Why is that? The primary reason being that there's a question mark over whether an employee can give free and informed consent in certain situations given the power imbalance between an employer and an employee. Consent is also revocable at any time at the option of the employee, and an employee can't suffer a detriment for withdrawing consent.

So the most common legal bases that employers rely on are: performance of the employment contract, so whatever data is necessary to carry out the contract (probably one of the best known examples is payroll data, where you get the employee's bank details so you can process the payments); discharging legal obligations under EU or national law, such as maintaining employee records; and that the processing is necessary to achieve a legitimate interest of the organisation.

So if you're relying on this, employers have to identify the interest, demonstrate that the processing is necessary to achieve that interest, and have a balancing exercise of that interest against the rights, interests, and freedoms of the employee. Andrew will touch on that process a bit further on in the presentation.

And then the data should only be obtained for one or more specified, explicit, and legitimate purposes, and it can't be further processed in a manner that's incompatible with those purposes.

What are the other considerations? And it's in the next slide here. The first principle of data protection is that all the processing, as Andrew has already said, must be lawful, fair, and transparent.

And employers, in particular, you have to ensure that you're transparent about the processing that you're undertaking of your employees' data. So you must inform employees about what personal data will be collected, if it's going to be collected by a third party, how that data will be processed, and why.

This is usually contained in a data privacy notice, which should be easily accessible by the employees and easy to understand. So it's good to see in the poll that a huge majority of you have that in place. It is something you should keep under review on a regular basis.

You shouldn't use the data for ancillary purposes. It can only be used for the stated purpose in your policy, and that purpose should be determined at the time of processing. So it's not enough to go back at a later date and change it.

You should also collect the minimum amount of data. So implement measures to avoid processing employees' data where the purpose can be achieved by other means, and process only where and what is necessary. Andrew has already gone through this, and we will have some examples of recent complaints and cases that touch on that.

And then have retention periods and security measures in place and keep them well documented. Consider how long information on employees needs to be held in order to comply with your legal obligations. It's not just employment legislation. Taxation legislation can feed into that as well.

Security measures that you should have in place are, let's say, limitations on access to certain folders in terms of HR, special category data, medical/health reports, that kind of thing. There is limited access by a small number of people, password protected. Have measures in place for employees who are working remotely from home, policies in place about locking away documents, that kind of thing.

So what should employers steer away from? The one to watch out for is where there's no explicit or legitimate purpose. So what's your legal basis? Why are you gathering this information? Have you balanced it against the employees' rights? Where there's no real purpose or it's disproportionate, that's going to go against you and you're going to fall afoul of the GDPR.

And the one to really watch out for is special category data. So what is special category data? It's personal data that's revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data and biometric data processed for the purpose of uniquely identifying a natural person, and there's a recent decision on that that Andrew will touch on, data concerning health, and data concerning a natural person's sex life or sexual orientation.

So processing of special category data is only permissible where one of the six legal bases for processing personal data under Article 6 of the GDPR applies and one or more of the additional legal bases for processing special category data under Article 9 applies.

So common reasons for processing special category data that come up for employers are where there's explicit consent for a specific purpose. Now, while it is technically permissible to process employees' special category data on foot of the explicit consent from an employee, as I've already gone through, that consent is generally not regarded as the most appropriate legal basis in an employment context because it's not given freely and it can also be withdrawn.

Also, it can be gathered where it's to satisfy legal obligations, let's say health and safety.

In defence of a legal claim. So if there's a claim against the employer, they can gather data in order to defend it.

And then occupational health and safety. So a common one that comes up is where you refer an employee to occupational health, and quite often employees will query that in GDPR context.

Employers have a legal obligation under the Safety, Health and Welfare at Work Act 2005 to ensure the health and safety of their employees. And Section 23 of that Act provides that an employer can require an employee to undergo a medical assessment of their fitness to work.

So there are scenarios where you can require employees to go and be medically examined, but you just need to be very, very careful about the use of that data.

So Andrew is going to just touch on next a bit about rights that employees have.

Andrew:  Yes, sure. So another aspect of the GDPR, one of the big fundamental changes it made, is rebalancing the power between data controllers and data subjects, the individuals whose data is being processed. And the way it did that is by introducing new rights for data subjects.

So on the screen here, we see some of the rights that you would see most often in practice. But particularly in the employment context, the right that really jumps out, the one coming up most often, is the right of access.

As with any of the rights under the GDPR, the right of access isn't absolute. It's subject to certain exceptions under both the GDPR and the Data Protection Act.

So, under the GDPR, one of the exceptions is that personal data shouldn't be disclosed to the extent that it would adversely affect the rights and freedoms of third parties. This also includes the impact on the controller, the employing organisation.

There's language in the recitals protecting trade secrets and intellectual property and confidential information. So to the extent the disclosure of personal data would negatively impact the business in that respect, you can look at making redactions.

You can also look at making redactions under the Data Protection Act in light of the exemptions for things like legal advice privilege, and litigation privilege, and expressions of opinion given in confidence.

But this is, again, just to give you a quick overview of some of the key introductions and changes of the GDPR.

So just turning to the next slide then, against that background of the general overview of the GDPR and how it applies, we thought it useful to take a deep dive on introducing new technologies into the workplace and how you can comply with the above principles and processes in that context. This is one of the key risk areas for any organisation from a data protection and employment law perspective.

So, at the outset, any business that's considering introducing a new workplace technology that will interact with employees should consider what we can broadly call privacy review processes. There are two main types. The first of these is expressly provided for in the GDPR, and both stem from obligations under the GDPR.

The first of these is a data protection impact assessment. Now, this is set out in Article 35, which provides that any controller who's considering implementing a high-risk form of processing must carry out a data protection impact assessment to weigh up the risks presented by the processing and determine whether it's fair to proceed.

So performing a data protection impact assessment is actually mandatory where high risks are presented, and the GDPR specifically lists introducing novel technologies as an example of processing that can involve a high risk.

And performing the Data Protection Impact Assessment, or DPIA, involves setting out the proposed processing and its purposes and assessing whether it is necessary and proportionate to proceed with that processing.

It involves identifying and assessing the risks to the rights and freedoms of the individuals concerned and identifying appropriate measures to address those risks and to demonstrate compliance with the GDPR.

All of this should be set out in a comprehensive document so that if the business does decide to proceed with the process, it can stand over the Data Protection Impact Assessment and say, "Here, we've weighed up the pros and cons and determined that the risks presented by the analytics here are not on an unacceptable level and it is fair to proceed if we have appropriate measures in place".

The other type of privacy review process that we see a lot in practice and is coming more to the fore lately is a legitimate interest assessment. So this stems from the point Emer was making that particularly in the workplace and employment context, legitimate interest is often the most appropriate legal basis on which to couch or base processing of employee data, especially because consent isn't really deemed appropriate given the inequality of power between employer and employee.

So for a lot of business-as-usual processing, you can just take it for granted that these are legitimate interests that you're pursuing. But if something is more new or novel or a more edge case, again, you want to look at actually documenting how you determine that legitimate interest is appropriate as a legal basis to rely on.

It overlaps a lot with the Data Protection Impact Assessment process, but it's kind of to serve a different end, which is determining that you do have a valid legal basis under Article 6 to proceed with the processing.

What we advise is that if it's a business-critical type of processing . . . so, particularly, a consumer-facing social media app introducing a new feature on the basis of legitimate interests, or, in the workplace, introducing a new monitoring technology or something like that . . . these would be the more edge cases.

You want to be able to stand over the legitimate interest assessment and have documented your determination that the balance between the interests you're pursuing on the one hand and the impacts on the individuals on the other is appropriate, and that the impact on individuals is not so unacceptable that it's unfair to proceed with the processing.

Performing this legitimate interest assessment involves determining the purpose that you're pursuing, so why you want to process the data, what the objective is, how it will benefit your organisation or third parties, and what will be the impact if you can go ahead and if you cannot go ahead.

Then you consider it in terms of necessity. So how will the processing help you achieve your purpose? Is it really necessary and proportionate to achieve the purpose, and can you achieve the purpose without pursuing such processing?

And then you can perform a balancing exercise. So you determine the nature of the data to be processed and whether it's private or sensitive, and what sort of impact processing this will have on the individuals.

You also determine the reasonable expectations of the individuals. So would they expect their personal data to be processed in this context? And if on balance it seems fair to proceed with the processing, you have a document supporting your determination that legitimate interest is a suitable basis for processing the personal data, and you can stand over having performed a privacy review for the purposes of that processing.

Just to turn to the next slide, another point worth addressing, because it's quite zeitgeist-y at the moment, is that a lot of the developments in workplace technologies right now are coming from the AI space, things like the implementation of chatbot tools.

So not to further complicate the picture, but the GDPR is obviously the piece of legislation that governs privacy and data collection in this space, but it's not the only technology legislation to be aware of.

The Artificial Intelligence Act has just been finalised through the EU's legislative process, and it's going to come into force shortly, but its provisions will come into effect as law on staggered timelines. And there are some provisions in there that are specifically applicable to the use of these technologies in the workplace.

If you are introducing an artificial intelligence tool in the workplace, the foregoing slide about data protection impact assessment and legitimate interest assessment would definitely be your first port of call to assess the privacy impacts of the new technology.

But you'd also want to consider whether there are hard stops in the Artificial Intelligence Act about what you can do.

Within six months of it coming into force . . . so in roughly six months' time, because we expect the Act to be introduced in the next couple of weeks . . . the Artificial Intelligence Act will prohibit AI systems that detect and monitor emotions in the workplace.

So maybe things that monitor conversations people are having to indicate mental health concerns or anything like that, those will be outright prohibited in six months' time.

Then in two years' time, a new categorisation of certain systems as high risk will come in, and this will include AI systems used for employment and worker-management purposes.

This includes things like evaluating performance in the workplace, so performing performance reviews, and it also includes things like assessing candidates for jobs. So as part of the recruitment process, using AI tools in that context.

These systems will be considered high risk, and high-risk systems are subject to quite detailed obligations under the Artificial Intelligence Act. That includes things like ensuring that the individuals who actually use these systems are appropriately trained and that they exercise human oversight of the systems, so that they don't surrender the decision-making process completely to the system. There's still an element of human review involved.

They also have to ensure that the data fed into these systems is fair, representative, and relevant to the job the system is performing.

And lastly, the organisation will have to keep logs of the system so that people can look back retroactively and determine when and why it made particular decisions, what led to those decisions, and what data was fed into it.

Emer:  Thanks, Andrew. It's a really interesting space in terms of AI and the changes that are coming down the tracks, and it is something that more and more employers are looking at.

I suppose, in this context, what is surveillance? In essence, it's the close monitoring of employees, watching their behaviour and their activities, and it's going to entail collecting personal data. The easy examples are CCTV footage, swipe cards, and system monitoring, a lot of things that employers will already have had in place for a long time.

Employers have to think really carefully about the purpose of gathering that surveillance data. Why do we have CCTV? How are we going to use the data we collect from it? What's the employment context for the use? And do we really need this, or can it be done another way?

The DPC has detailed guidance on its website. Unfortunately, not on AI yet, but on the use of CCTV, and that includes a specific section of guidance for employers on CCTV in the workplace.

They've got extensive guidance on the use of vehicle tracking by employers, which is really useful, and guidance on remote working, which is helpful in terms of setting out what's expected of employees in the remote workspace. That's really topical at the moment, as we've had the new code of practice introduced on that. And then there are other surveillance technologies available to employers.

Employers may have a legitimate reason for wanting to monitor employees, because it's good for business to know what they're getting up to, but be aware that apart from very exceptional and special circumstances, you could fall afoul of the GDPR.

In addition, you also have to consider that employees have legitimate expectations of privacy, and monitoring can intrude upon those disproportionately. So do not stray from the purposes that employees are aware you're collecting the data for.

And I suppose a case that you should all be aware of, dating back to 2022, is Doolin v The Data Protection Commissioner. So, very briefly, it was one where the Gardaí were called because an employer, Our Lady's Hospice & Care Services, found graffiti carved into a table in the staff canteen. The Gardaí advised the employer to review CCTV footage to determine who had done it.

They did review it. They didn't determine who had actually committed the graffiti, but on reviewing the footage, it indicated that Mr Doolin had been taking . . . unauthorised breaks in the staff tearoom, basically. There was no suggestion he was involved in the graffiti incident at all.

So he was investigated and he was disciplined for it. And he actually lodged a complaint with the Data Protection Commissioner, which made it all the way up to the Court of Appeal.

And the Court of Appeal backed up what the High Court had said, which was that the CCTV footage was used for a different purpose than that which was originally collected.

So the Court said that this footage was collected for the express and exclusive purpose of security, and while it was used permissibly for that purpose, it was also used for a distinct and separate purpose, i.e. disciplinary proceedings into unauthorised breaks by Mr Doolin.

So what are the key takeaways for making sure you don't stray from the purpose? What I think we should be looking at is making sure you comply with the DPC guidance on the use of CCTV, or any other relevant guidance, let's say, on vehicle tracking, etc. And keep aware of what the DPC is doing in that space.

Ensure that you've implemented clear policies and procedures for processing personal data relating to employees. And consider the purpose you're collecting that personal data for and ensure this purpose is set out in your Data Protection Notice or your policy, and it's communicated to the relevant employees.

Keep those policies under regular review. So I know you all would say that you've got policies in place, but review them on a regular basis.

And then following on, on the next slide, in terms of monitoring and surveillance of employees, there are lots of different areas where employers monitor them.

You've got computer networks where you're monitoring usage, internet and acceptable usage policies. Email is a huge one, particularly when you come to DSARs. Obviously, there's CCTV, which I've mentioned, location data, and swipe systems for coming in and out of buildings. And there are scenarios where employers are using more and more detailed tracking facilities.

The thing to remember is that all the GDPR principles will continue to apply. So great as this information may be for business, it has to be balanced against the employees' rights.

You've also got to consider the European Convention on Human Rights and privacy entitlements, and your legitimate interest test, as Andrew has already set out.

How can this be achieved, and are there ways around it? Covert software mechanisms that record and obtain data without an individual's knowledge are generally going to be unlawful. Consider what you're trying to achieve, what your method of achieving it is, and whether it's proportionate and not excessive.

So Bărbulescu v Romania and Ribalda, which are both European Court of Human Rights decisions, are instructive on this. In Bărbulescu, a workplace Yahoo Messenger account had been set up and the employee started to use it for personal chats.

It went up to the European Court of Human Rights as to whether his privacy had been infringed when the employer actually started to investigate and review those chats.

Article 8 was engaged, so in looking at those Yahoo Messenger chats, the employer needed to act proportionately. The Chamber initially found no violation of Mr Bărbulescu's Article 8 rights, although the Grand Chamber later took a different view. It wasn't unreasonable for the employer to want to verify that employees were completing their professional tasks during working hours.

But importantly, the European Court of Human Rights found that the employer had only accessed his account in the belief based on what he had told the employer, that it contained client-related communications only and there wasn't anything personal in it.

So there is going to be an element of what exactly is the scenario in this particular instance and why are we accessing this data.

In the Ribalda case, it was where there was covert CCTV footage being obtained. So there was CCTV that the employees were aware of and there was other footage that they weren't aware of.

And actually, the Grand Chamber reviewed the concept of private life and said that one's reasonable expectation not to be recorded in their private or social life can extend into certain areas of the employment space, such as toilets, private rooms, and it still exists in open and public spaces. So, in this case, it was the entrance of the supermarket.

They reaffirmed the principles in Bărbulescu, but in that instance, they didn't find a violation of the employees' privacy rights through the covert surveillance, because it was found that disclosing prior information to the employees regarding the monitoring would have prevented the thefts from being revealed. And the court added that no less intrusive measures could have been applied in that case to safeguard the employer's rights.

So Bărbulescu sets out a test on the monitoring and surveillance of employees. That's in the next slide, where we discuss it. So transparency is really important, as we've already flagged a few times.

Receipt of prior notice. To take effect, the warning from an employer has to be given before the monitoring begins, Ribalda being a bit of an exception in that particular instance, particularly where it involves accessing the contents of the employees' communications. The method of notice is not prescribed, but it has to be clear regarding the nature of the monitoring and must be given in advance. So that's where we're saying have good policies in place.

Consider the extent of the monitoring and the degree of intrusion. Think of the breadth and depth of the monitoring. For example, the distinction between monitoring a flow of communications versus actually reading each individual communication. Have restrictions in place about who's going to monitor them. Limit the numbers.

Is there a legitimate interest or reason for the monitoring that justifies it as a proportionate response to the risk? Could it be achieved by less intrusive methods?

Don't access the information or . . . In Ribalda, what the court had looked at was that they'd actually limited how many people could see the footage and how that particular footage would be used.

Balance it with the privacy interests of the employee. Think about the seriousness of the consequences as a result of the monitoring. Is it something that could lead to dismissal? There's going to be a higher standard of privacy afforded to the employee, and the consequences need to be really, really clearly explained to them.

And have adequate safeguards in place, especially where monitoring is intrusive, that you limit it as much as possible. So, for example, content of emails. Do we really need to look at the content of emails, or is it going to be more about volume or what exactly the employee has been getting up to?

Kind of following on from that space, Andrew is going to look at what are the risks and consequences in terms of breaching employee data protection rights.

Andrew:  Sure. So having had an overview of the compliance obligations in this space, particularly when considering and implementing new technologies, it's kind of good to get a sense of what the DPC's powers in this space are and the risks that could materialise from non-compliance with those kinds of obligations.

And to get a sense of that, the headline news has always been that the GDPR provides for fines of the greater of up to €20 million or 4% of worldwide annual turnover.

Obviously, not all fines are within that bracket, but there are quite chunky examples that we'll explore in a minute of fines that are well within those kind of higher ranges from just the past few years that relate to employee monitoring and workplace data processing practices.

But as well as that, it's also useful to kind of take note of the Data Protection Commission's investigatory powers. So the Data Protection Commission has powers to conduct inquiries and it can require written reports from controller organisations that process personal data.

It can also require the attendance of authorised persons before the DPC, so that people have to come into the DPC and actually explain the processing activities in an in-person interview.

And the DPC has audit powers, so it can enter and inspect premises and require the production of documents to kind of make sense and further investigate data processing activities.

The DPC also has powers to issue information notices. So, actually, in just the past few days, the Data Protection Commission announced it's going to be part of a coordinated enforcement action with the EDPB, in which they're going to issue information notices to certain controllers in the public and private sectors in Ireland relating to the right to access that we were looking at earlier.

So this is to investigate an organisation's compliance and its means of complying with and facilitating exercise by data subjects of their right of access.

So the DPC's other powers as well include the pursuit of enforcement notices. So this is a notice that goes to an organisation mandating it to bring its processing into compliance with the GDPR and the Data Protection Act.

And lastly, the DPC publishes news articles and updates and blog posts and an annual report on its key decisions throughout the year. So there is an element of name and shame, where an organisation can find itself facing unwanted publicity from featuring in findings by the Data Protection Commission.

Beyond the Data Protection Commission's powers, the GDPR introduced in Article 82 an entitlement for individuals to compensation for material and non-material damage.

So by introducing language on non-material damage, it opened up the possibility for damages from more kind of abstract harms, like psychological upset or distress that could arise from unlawful data processing practices. And there's some recent case law in that space that Emer is going to explore.

Emer:  The case of Österreichische Post has actually specifically looked at Article 82, as Andrew said, which introduced that there doesn't have to be material damage.

So in the court's view in that case, they said that Article 82 requires establishing damage, which is either material or non-material, an actual infringement of the GDPR, and that there's a causal link between the two. But the CJEU also ruled that the right to compensation in GDPR can't be made contingent upon individuals satisfying a certain seriousness threshold.

They noted that the GDPR doesn't contain any rules for determining the amount of damages to be paid out to claimants once they've established their claim. And the CJEU held that national courts in the EU can apply existing domestic rules when deciding the amount of compensation to award to successful claimants.

So I think that's the one that employers have been looking at. If there's a case taken against us, what's the potential award in it? And Kaminski v Ballymaguire Foods is instructive on that one.

Kaminski was employed with Ballymaguire Foods, and during a training exercise to highlight poor work practices in the organisation, CCTV was shown to a group of employees and the plaintiff was identifiable in the footage. And he alleged the processing and use of the CCTV footage amounted to unlawful processing of his data in breach of GDPR.

He allegedly suffered anxiety, embarrassment, and sleep disturbance. And the court held that there was an infringement of his GDPR rights and that non-material damage resulted from that infringement.

But the financial sum that they awarded him in compensation was €2,000 for non-material loss, and that was on the basis that it went beyond mere upset and it resulted in insecurity lasting a short period of time. It's one of the only decisions in Ireland at the moment, and it's looking like any awards are going to be low.

And then in addition, Section 117 of the Courts and Civil Law (Miscellaneous Provisions) Act 2023 came into force in January. And that provides for a concurrent jurisdiction, so the District Court can determine claims alleging breaches of the GDPR. The limit in the District Court is €15,000. So that in itself is going to keep any claims down at the lower end in terms of costs.

There are a number of cases that have actually had a stay put on them, and I've mentioned one of them there. So in Garry Cunniam v Parcel Connect trading as Fastway Couriers, there's a stay on that case, which is pending determination by the CJEU of a number of references to it under Article 82.

So one of them was Österreichische Post, and actually that's been determined. There are about three still left. So it's kind of a "watch this space now to see whether anything changes there" going forward.

But Andrew has found some interesting recent cases that kind of touch on the space of monitoring of employees.

Andrew:  Yeah, sure. So this particular decision came in from the French regulator, which is the CNIL. So that's like the equivalent of the DPC in France. But because the GDPR is European-wide law, the principles espoused there and the logic followed holds true EU-wide. So it's highly relevant for our present purposes.

It's also one of the more prominent cases in this space in terms of highlighting how the principles under the GDPR stack up and how some of those obligations we were talking about earlier feed into each other, and how a given processing operation, if deemed unlawful, can breach several different obligations. That can add up incrementally to increase the severity and gravity of the fine.

So what this decision relates to was Amazon, and particularly their French warehouse operation. They had a practice of requiring employees working in the warehouse to use scanners. These scanners identified each employee and they received tasks via the scanners.

But the scanner also monitored performance of the tasks in terms of time metrics, so how quickly they were scanning each item through for delivery to a given truck to deliver to a specific location.

It collected a very granular, extensive level of detail on the staff's activities. And this came to the attention of the French regulator, the CNIL, and so they investigated.

And so the CNIL found the monitoring by the scanners was excessive and it breached the GDPR and imposed a fine of €32 million, which is one of the much more significant fines under the GDPR Europe-wide.

Amazon had argued that this processing was based on legitimate interests, that legal basis we were talking about earlier, and that it was necessary for the purposes of quality and safety assurance, warehouse and workload management, work planning, employee evaluation, and coaching, which are all kind of fairly sound and credible claims as legitimate interests.

However, the CNIL found that the level of processing was excessive and disproportionate. And so even if these were the interests being pursued, they weren't legitimate interests because the level of processing of personal data of these individuals was far too excessive, and so the balance wasn't struck.

And if we look at the next slide, we can break down a little bit more how the CNIL sort of analysed different obligations through the GDPR and found Amazon to be in breach in this instance.

So the performance metrics included whether the data subjects were scanning too fast. It would record a time of 1.25 seconds, and if a package was scanned in less time than that, it was considered to be scanned too fast.

It also monitored inactivity down to the minute. And so any inactivity exceeding a number of minutes was reportable and recorded against the person.

All this data was also retained and reviewed over a 31-day period as part of an ongoing performance review process. And the CNIL found that this was a breach of the data minimisation principle, which provides that you should only process as much data as you need to pursue the purpose you're pursuing.

So when we look at those earlier purposes that Amazon argued or pursued here, they found that this micro level of data collection was far too in excess of that.

They also found that it was disproportionate and therefore inappropriate, and so this legal base of legitimate interest did not stand up.

Another factor feeding into this was that the employees would not reasonably expect their activities using the scanners to be monitored in this way. And furthermore, they hadn't been provided with a data protection notice explaining this processing before Amazon had begun it. And so they were never fully informed of the process. So that's a breach of the transparency principle and a breach of Articles 13 and 14, the information requirements of the GDPR.

And lastly, kind of a more adjacent issue was that there weren't sufficient security measures in place to protect this personal data. So there weren't appropriate technical and organisational measures.

Access to the software with the overview, so where one could actually view the employee's performance as against these metrics, was not sufficiently secure and it wasn't sufficiently limited to a limited number of authorised personnel.

Also, the access passwords were deemed not to be sufficiently secure. And the access accounts were shared . . . each individual account was shared between several users. So this made for a significant accumulation of security defects, which would mean it was much more difficult to trace how a given personal data breach could have occurred if it were to happen, because there was a combination of different security shortcomings.

And that, again, contributed to the accumulation of breaches of data protection law, which informed this very chunky fine imposed by the CNIL.

That kind of speaks to a need for implementing appropriate policies and training, as well as having a sound legal basis underpinning your processing. The policies and training are something Emer can speak to now.

Emer:  When we're talking about policies and training, there's actually a recent case out of the ICO in the UK involving an employer, Serco, who have had enforcement notices issued against them because they were using facial recognition and biometric fingerprint scanning to track employees.

It's basically tracking their attendance at work, and it's across I think 38 sites. So it was being done on a significant scale.

And Serco did have a DPIA and a legitimate interest assessment carried out. They did make submissions to the ICO that they needed to put this system in place because employees have been abusing previous systems for tracking them. But it was still deemed as excessive by the ICO and they've issued enforcement notices against them that they have to cease using this data.

It's special category data, number one. So, again, as we've warned, it's much more difficult to deal with special category data. You have to have your very significant bases for processing it.

And also then what they looked at in the ICO decision was whether what was being done was proportionate. Were there other ways that employees could be tracked? Could employees opt out of it? And really, it wasn't realistic that they could, because they feel under pressure. "This is my attendance being monitored. If I don't let them do this, I'm not going to stay in employment".

And also, where they did object to it, there was no system in place as to alternatives or clear guidance as to what the alternatives were.

So I think it's clear from that . . . I mean, that may influence what the Irish Data Protection Commissioner might look at in this space. And that's why I'm saying to watch out for any policies they might issue on it. It's something to keep an eye on, be aware of.

For example, I know I've got CCTV there on the slide, as I mentioned, but make sure you've got clear policies and procedures for any processing that you're going to do that contains employee personal data.

Have IT and acceptable use policies, use of personal devices policies, remote work policies. If you're using tracking systems, make sure that you've got policies covering them and letting employees know what data you are collecting.

And be aware employees can make complaints about it. They can raise it with the DPC, and it can be looked into further. So try and get your ducks in a row and make sure you've done your due diligence before you implement any systems.

Have good data retention policies in place, with clear guidelines on how long data will be held for. And I suppose with the introduction of new technologies, as you've seen in that Serco case, it's important that employers review their policies on a regular basis. Look at them and really consider whether they are fit for purpose. Are they covering what needs to be covered?

And before you introduce any new technology, take a step back and look at it and say, "Look, is this going to be something that we can implement, or are we going to fall foul of GDPR? Are there other ways that we can measure these metrics?"

In Serco, they could have had sign-in sheets. They said it was subject to employee abuse, but the ICO didn't say that was a sufficient reason for them to then have to introduce biometric testing.

I suppose the key is managing employees' expectations. They want to know what you're gathering on them. And quite often, these things don't come up until you are ending up in a disciplinary space. But you should have everything in order before you even get there.

Check your policies, keep everything updated, make sure things are easily accessible for employees and that you keep them informed, and that should avoid many of the pitfalls that could come forward.

Julie:  Okay. Thanks very much, Andrew and Emer. That was a whistle-stop tour through GDPR and there's so much information there. Thanks very much for the cases as well.

Again, everybody, we'll send you those slides and then you can take a look at some of those things in more detail.

So you just talked, Emer, about Serco Leisure and about the issues that they had with introducing a new system. We've had someone in the audience as well that's had difficulty with an HRIS.

They thought it was GDPR-compliant. Some of the workforce who are unionised complained that some of the members are unhappy because information was shared with that third-party provider of the HRIS and also to the external HR consultant that was working on this project with them. So they're asking, "Has the employer done anything wrong, or what advice would you give in regards to that?"

Emer:  That's difficult to say without seeing their policies. So they need to go back and look at what policies they have in place. Have they clearly set out that they're using a third-party provider for processing this data? What use is the third-party provider going to make of that data, and have they informed the employees of that in advance?

So I think your first step is to go back, look at your policies, see what's covered in your policies, and then take it from there.

Julie:  Great.

Andrew:  Especially the transparency information. So what transparency information was presented to these employees? Were they informed of the processing at the outset, and did that information cover this sharing with third parties?

Also, take stock of what exactly is the personal data that's transferred and what's the legal basis underpinning transferring it. If it's special category data, biometric data, or something like that, there's probably a bigger question mark over it, because you can't rely on legitimate interests; instead you need an Article 9 legal basis, the most useful of which is probably explicit consent.

But explicit consent is not really strong in the employment context because of that inequality of bargaining power. It's hard to say it's validly given or freely given in that context.

Julie:  All right. Thanks, Andrew, and thanks, Emer. I hope that that helped our audience member as well.

Then there have been quite a few questions about retention periods. So one is about different types of employee leave, including force majeure. And then another question is about recruitment, specifically people that haven't been shortlisted, how long you would keep those documents for. So either of you, if you can . . .

Emer:  I can hop on, on that one. So for parental leave, force majeure leave, the retention period should be . . . we're recommending about eight years. That's covered under Section 27, I think it is, of the Parental Leave Act 1998.

When you're thinking of recruitment records, so application forms and interview notes for unsuccessful candidates, you'd be thinking about Section 77 of the Employment Equality Act. So, generally, we recommend 12 months after the date of the appointment or selection of the successful candidate, that being the period within which the person could make a complaint to the WRC under that Act. So they would have 6 months, which could be extended up to 12. Ideally, hold for 12 to 13 months.

Julie:  Okay. That's great. Thank you very much. Now, those are most of the questions at the moment. And I think you've answered quite a few of those during your presentation as well about aspects about where people can go for more resources, too.

Thank you very much, everybody, for joining us today. And thanks very much, Emer and Andrew, for a really informative session. There was lots in there, so lots to get our teeth into as well.

Thanks, everybody, and see you next time at our next webinar. And thanks very much to A&L Goodbody.

Emer: Thank you.



Disclaimer The information in this article is provided as part of Legal Island's Employment Law Hub. We regret we are not able to respond to requests for specific legal or HR queries and recommend that professional advice is obtained before relying on information supplied anywhere within this article. This article is correct at 20/03/2024