Under the GDPR, one of the requirements for compliance is completing a DPIA, or Data Protection Impact Assessment. The ICO mandates that DPIAs be carried out for processing operations that are “likely to result in a high risk.”
But what exactly does “likely to result in a high risk” mean? How can organizations form a solid definition of this term to use in their DPIA and overall GDPR compliance plan? Even though the GDPR is vague on this point, there are plenty of examples of what a high-risk action or process looks like. In this guide, we’ll break down everything you need to know about what is likely to result in a high risk under the GDPR, along with some examples of what to look out for.
A data protection impact assessment (DPIA) helps you identify and minimize a project’s data protection risks. It is an assessment carried out for a specific processing activity to ensure that any privacy risks have been identified and that safeguards have been put in place where appropriate.
A DPIA is required for processing that is likely to pose a high risk to individuals. This includes a few specific categories of processing, and a DPIA should also be conducted for any other major project involving the processing of personal data.
Data controllers must conduct a data protection impact assessment in compliance with Article 35 of the GDPR if a processing operation, particularly one involving the use of new technologies, is "likely to result in a high risk to the rights and freedoms of natural persons." Neither "high risk" nor "DPIA" is defined in the GDPR itself. However, some criteria for what constitutes "high risk" are set out in laws and regulations, which are explored in depth in Section 4.
In brief, high risk may be associated with processing large volumes of data, processing sensitive or special categories of personal data, or processing operations that use new technologies for which the controller has not yet performed a DPIA.
To get a better grasp on what is considered high risk under the GDPR, let’s take a look at some examples.
This criterion applies to processing that involves innovative technology or novel applications of existing technology. When combined with any other criterion from WP248rev01, a DPIA is required for any planned processing operation involving innovative technology or the use of novel technological and organizational solutions.
Depending on the particulars of the processing, examples of this include artificial intelligence, machine learning and deep learning, connected and autonomous cars, intelligent transportation systems, smart technologies, market research including neuro-measurement, and various IoT applications.
Invisible processing is the processing of personal data that has not been obtained directly from the data subject, in cases where the controller believes that complying with the transparency requirements of Article 14 would prove impossible or involve a disproportionate effort, as described in Article 14.5. A DPIA is required for any planned processing operation where the controller relies on Article 14.5.
List brokering, direct marketing, online tracking, advertising, data aggregation, and the reuse of publicly accessible data are a few examples of this processing.
Tracking is processing that involves monitoring a person's geolocation or behavior, including but not limited to the online environment. Any planned processing operation involving tracking or geolocation data must first undergo a DPIA. Examples include social media platforms, software applications, online advertising, web and cross-device tracking, eye tracking, data processing in the workplace, and data processing in the context of home and remote working.
The processing of genetic data covers any processing of genetic data other than that performed by an individual doctor or other health care professional for the purpose of providing care directly to the data subject. Any planned processing operation involving genetic data should be subject to a DPIA. DNA testing, medical research, and medical diagnostics are a few examples.
Another criterion is data matching, which involves combining, comparing, or matching personal data obtained from multiple sources. Examples include federated identity assurance services, direct marketing, tracking personal use of statutory services or benefits, and fraud prevention.
Denial of service covers decisions about a person's access to a product, service, opportunity, or benefit that are based wholly or partly on automated decision-making, or that involve the processing of special-category data.
Credit checks, mortgage or insurance applications, and other pre-check processes connected to contracts are some of the most common examples of denial of service.
This risk applies to any large-scale profiling of individuals carried out by an organization. Examples include data processed by smart meters or IoT applications, fitness or lifestyle tracking hardware and software, social media networks, and the integration of AI into existing processes.
This criterion applies to any processing of biometric data for the purpose of uniquely identifying an individual. When combined with any other criterion from WP248rev01, a DPIA is required for any planned processing operation that uses biometric data to uniquely identify a person.
Examples of processing involving biometric data include facial recognition systems, workplace identity verification and access control systems, and access control or identity verification for hardware and applications (including voice recognition, fingerprint, and facial recognition).
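The criteria above follow two patterns: most of them trigger a DPIA on their own, while innovative technology and biometric identification trigger one only when combined with another WP248rev01 criterion. As a rough illustration of that decision logic (not a legal tool — the criterion names and the simple pairing rule here are our own illustrative simplification), a screening helper might look like this:

```python
# Illustrative DPIA screening sketch based on the criteria discussed above.
# The flag names and the pairing rule are simplified assumptions, not an
# official ICO or WP248rev01 checklist.

# Criteria that require a DPIA on their own
STANDALONE = {
    "invisible_processing",
    "tracking",
    "genetic_data",
    "data_matching",
    "denial_of_service",
    "large_scale_profiling",
}

# Criteria that require a DPIA only when combined with at least one
# other criterion (the innovative-technology and biometric examples)
PAIRED = {"innovative_technology", "biometric_identification"}

def dpia_required(criteria: set[str]) -> bool:
    """Return True if the flagged criteria indicate a DPIA is needed."""
    if criteria & STANDALONE:
        return True
    # A "paired" criterion triggers a DPIA only alongside another criterion
    return bool(criteria & PAIRED) and len(criteria) > 1

# Biometric identification alone does not trigger a DPIA,
# but combined with tracking it does.
print(dpia_required({"biometric_identification"}))             # False
print(dpia_required({"biometric_identification", "tracking"}))  # True
```

A real screening process would, of course, also weigh volume, data sensitivity, and context rather than simple boolean flags; the point is only that some criteria stand alone while others must be paired.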
If you are still unsure whether certain activities are “likely to result in a high risk,” it is best practice to carry out a DPIA anyway. It is always better to err on the side of caution when identifying and mitigating the risks associated with processing personal data.