INTRODUCTION
Article 22 of the General Data Protection Regulation (GDPR) governs “Automated individual decision-making, including profiling.” It gives individuals the right not to be subject to decisions based solely on automated processing, including profiling, which produce legal effects concerning them or similarly significantly affect them. In practice, this means individuals have the right to request human intervention and to express their point of view when such decisions are made.
AUTOMATED DECISION MAKING AND PROFILING
Automated decision-making is the process of making a decision by automated means, without any human involvement. Such decisions can be based on factual data as well as on digitally created profiles or inferred data.
It relies on technology, such as algorithms and artificial intelligence, to reach decisions without human intervention. Under the General Data Protection Regulation (GDPR), automated decision-making is subject to specific requirements designed to protect individuals’ rights and freedoms.
For example, many organizations in the recruitment and hiring sector use applicant tracking systems and algorithms to process job applications and screen candidates.
Suppose Abhishek applies for a job at company “XYZ” and emails a manually drafted resume along with a link to his LinkedIn profile. Company “XYZ” uses an AI-driven algorithm that analyzes resumes, cover letters, and other application materials to identify candidates who match specific criteria. Based on this automated processing alone, the system rejects Abhishek’s application against predetermined parameters, without any human involvement.
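To make the scenario concrete, here is a minimal, hypothetical sketch of what such an automated screening rule might look like. The keywords, threshold, scoring scheme, and the `screen_application` function are assumptions made for this example, not the actual system of any real employer:

```python
# Hypothetical sketch of solely automated applicant screening (illustrative only).
# The required skills, threshold, and scoring scheme are assumptions for this example.

REQUIRED_SKILLS = {"python", "sql", "data analysis"}
MIN_YEARS_EXPERIENCE = 3
SCORE_THRESHOLD = 0.7

def screen_application(resume_text: str, years_experience: int) -> dict:
    """Score an application and decide with no human in the loop."""
    text = resume_text.lower()
    skill_hits = sum(1 for skill in REQUIRED_SKILLS if skill in text)
    skill_score = skill_hits / len(REQUIRED_SKILLS)
    experience_ok = years_experience >= MIN_YEARS_EXPERIENCE

    score = skill_score * (1.0 if experience_ok else 0.5)
    decision = "proceed" if score >= SCORE_THRESHOLD else "reject"

    # A rejection produced solely by this function would be a decision
    # "based solely on automated processing" with significant effects (Article 22).
    return {"score": round(score, 2), "decision": decision}

# Example: a resume mentioning only one required skill is rejected automatically.
print(screen_application("Experienced in SQL reporting and dashboards", 4))
```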
In this context, if a job application is automatically rejected based solely on the output of the automated system, it falls under the category of automated decision-making under the GDPR. The GDPR requires that individuals have the right to request human intervention, express their point of view, and obtain an explanation of the decision when automated processes are utilized to make significant decisions about them.
This example demonstrates how automated decision-making is not limited to a specific industry and can have implications across various sectors, necessitating compliance with GDPR requirements to protect individuals’ rights and ensure transparency and accountability in decision-making processes.
Profiling refers to any form of automated processing of personal data intended to evaluate certain personal aspects relating to an individual. This can include analyzing or predicting aspects concerning a person’s performance at work, economic situation, health, personal preferences, reliability, behavior, location, or movements.
For example, when a person browses online for a specific product, such as a new smartphone, data points such as search history, website visits, and demographic information may be collected and analyzed to create a profile indicating their interest in technology products. The individual may then start seeing targeted advertisements for smartphones and related accessories across websites and social media platforms, reflecting the profile built from their online activities.
This example illustrates how profiling is utilized to create personalized experiences for individuals in the digital advertising space, and emphasizes the importance of transparency, consent, and accountability in the collection and use of personal data for profiling activities, in accordance with GDPR’s provisions.
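As a rough illustration of what such behavioural profiling can look like in code, the sketch below aggregates hypothetical browsing events into interest scores. The event structure, category mapping, and the `build_profile` function are assumptions for this example, not a real ad-tech implementation:

```python
# Hypothetical sketch of behavioural profiling from browsing events (illustrative only).
from collections import Counter

# Assumed mapping from visited page categories to profiled interests.
CATEGORY_TO_INTEREST = {
    "smartphone_reviews": "technology",
    "phone_accessories": "technology",
    "running_shoes": "fitness",
}

def build_profile(events: list[dict]) -> dict:
    """Turn raw browsing events into an interest profile usable for ad targeting."""
    interests = Counter()
    for event in events:
        interest = CATEGORY_TO_INTEREST.get(event["category"])
        if interest:
            interests[interest] += event.get("dwell_seconds", 0)
    total = sum(interests.values()) or 1
    # Normalise dwell time into per-interest scores between 0 and 1.
    return {interest: round(seconds / total, 2) for interest, seconds in interests.items()}

# Example: browsing dominated by smartphone pages yields a "technology" profile.
history = [
    {"category": "smartphone_reviews", "dwell_seconds": 300},
    {"category": "phone_accessories", "dwell_seconds": 120},
    {"category": "running_shoes", "dwell_seconds": 60},
]
print(build_profile(history))  # e.g. {'technology': 0.88, 'fitness': 0.12}
```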
Article 22(1) GDPR Right to Explanation and Intervention:
Article 22(1) grants individuals the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. In practice, this means individuals can ask to be informed about the logic behind such decisions and can request human intervention.
For example, if a bank uses an automated system to determine credit scores and deny a loan based solely on that score, the individual has the right to request an explanation for the decision and to have a human review the case.
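A minimal sketch of what a compliance-aware credit decision might look like is shown below. The threshold, reason codes, and the `decide_loan` and `contest_decision` functions are hypothetical; the point is only that an automated decision can record the logic it applied and route contested cases to a human reviewer, in line with the safeguards Article 22 envisages:

```python
# Hypothetical sketch of an automated credit decision that records its reasons
# and supports a route to human review. Illustrative only; thresholds are assumed.
from dataclasses import dataclass, field

APPROVAL_THRESHOLD = 650  # assumed credit-score cut-off

@dataclass
class LoanDecision:
    approved: bool
    reasons: list[str] = field(default_factory=list)
    human_review_requested: bool = False

def decide_loan(credit_score: int, monthly_debt_ratio: float) -> LoanDecision:
    """Automated decision that keeps the reasons it relied on."""
    reasons = []
    if credit_score < APPROVAL_THRESHOLD:
        reasons.append(f"credit score {credit_score} below threshold {APPROVAL_THRESHOLD}")
    if monthly_debt_ratio > 0.4:
        reasons.append(f"debt-to-income ratio {monthly_debt_ratio:.0%} above 40%")
    return LoanDecision(approved=not reasons, reasons=reasons)

def contest_decision(decision: LoanDecision) -> LoanDecision:
    """Flag a contested decision for review by a human credit officer."""
    decision.human_review_requested = True
    return decision

decision = decide_loan(credit_score=610, monthly_debt_ratio=0.45)
print(decision.reasons)                                    # explanation for the applicant
print(contest_decision(decision).human_review_requested)   # True
```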
Article 22(2) Exceptions to the Right: This right does not apply if the automated decision is necessary for entering into or performing a contract between the individual and a data controller, is authorized by Union or Member State law that lays down suitable measures to safeguard the individual’s rights, freedoms, and legitimate interests, or is based on the individual’s explicit consent.
Article 22(3) Safeguards for Significant Decisions: Where automated decision-making is permitted on the basis of a contract with the individual or their explicit consent, the data controller must implement suitable measures to safeguard the individual’s rights, freedoms, and legitimate interests. At a minimum, this includes the right to obtain human intervention, to express their point of view, and to contest the decision.
Article 22(4) Special Categories of Data: Decisions permitted under Article 22(2) must not be based on special categories of personal data referred to in Article 9(1), unless the individual has given explicit consent or the processing is necessary for reasons of substantial public interest, and suitable safeguards are in place. In addition, under Articles 13 to 15, individuals have the right to be informed about the existence of automated decision-making, including profiling, and to receive meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing.
Data profiling under the GDPR carries significant implications and consequences for both individuals and organizations. Some of these include:
- Violation of Privacy Rights: Data profiling can infringe upon individuals’ right to privacy by collecting, analyzing, and using personal data without their knowledge or consent, leading to potential privacy breaches.
For instance, a social media platform might use profiling to track users’ online activities and behaviour, potentially revealing sensitive personal information without explicit consent.
- Discriminatory Practices: Profiling can lead to discriminatory practices if it results in unfair treatment based on personal characteristics such as race, ethnicity, religion, or sexual orientation. This can lead to social and ethical implications and damage an organization’s reputation.
An example of this is an insurance company using profiling to set higher premiums for individuals residing in specific postal codes, potentially discriminating against certain demographic groups.
- Lack of Transparency: If organizations fail to provide transparent information about their profiling activities, individuals may not be aware of how their data is being used, leading to a lack of trust and confidence in the organization.
For instance, a retail company might use data profiling to personalize product pricing without disclosing this practice to customers, leading to distrust and scepticism.
- Potential for Error and Inaccuracy: Profiling processes may not always produce accurate results, leading to incorrect assumptions and decisions about individuals. This can result in unfair treatment or negative consequences for individuals based on flawed profiling algorithms.
An example is an automated recruitment tool that profiles job candidates based on historical data, potentially leading to biased hiring decisions and unequal opportunities.
- Legal and Financial Repercussions: Non-compliance with GDPR requirements on data profiling can lead to severe financial penalties for organizations. Fines for violations can be substantial, reaching up to 4% of annual global turnover or €20 million, whichever is higher (a short calculation sketch follows this list).
For instance, a telecommunications company was fined by the Information Commissioner’s Office (ICO) for using profiling to make unwanted marketing calls to individuals without their consent, resulting in a substantial penalty.
- Loss of Customer Trust: Misuse or mishandling of personal data through profiling can result in a loss of trust from customers and stakeholders, leading to reputational damage and potential loss of business.
- Data Security Risks: Profiling activities may increase the risk of data breaches and unauthorized access to sensitive personal information, leading to potential harm to individuals and regulatory penalties for organizations.
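As a rough illustration of the penalty ceiling mentioned above, the sketch below computes the Article 83(5) maximum fine for a few hypothetical turnover figures; the turnover values are invented for this example and the result is the statutory ceiling, not the fine a regulator would actually impose:

```python
# Illustrative calculation of the GDPR Article 83(5) maximum fine:
# the higher of EUR 20 million or 4% of total worldwide annual turnover.
FIXED_CAP_EUR = 20_000_000
TURNOVER_RATE = 0.04

def max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Return the statutory ceiling, not the fine actually imposed."""
    return max(FIXED_CAP_EUR, TURNOVER_RATE * annual_global_turnover_eur)

# Hypothetical turnover figures (EUR): the 4% cap only exceeds EUR 20m above EUR 500m turnover.
for turnover in (100_000_000, 500_000_000, 2_000_000_000):
    print(f"turnover {turnover:>13,} -> max fine {max_fine_eur(turnover):>13,.0f}")
```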
To mitigate these negative implications and consequences, organizations must adhere to GDPR requirements regarding data profiling, implement robust data protection measures, ensure transparency and consent in data processing, and conduct thorough impact assessments to safeguard individuals’ rights and privacy.
Organizations and businesses subject to the GDPR must comply with Article 22 when engaging in automated decision-making and profiling, so as to protect individuals’ rights and freedoms in relation to their personal data.
Conclusion
Overall, Article 22 aims to ensure that individuals are not solely at the mercy of automated decision-making processes and can understand, question, and challenge these decisions when necessary. This is crucial in protecting individuals from potentially unfair or discriminatory outcomes resulting from the automated processing of their personal data.
In conclusion, profiling within the scope of GDPR is a critical aspect of automated decision-making, and organizations need to ensure that individuals are not subjected to detrimental effects without the opportunity for human intervention or the chance to contest the decisions made through automated profiling.
Pranav Kumar
This article has been authored by Pranav Kumar, data privacy intern at Zedroit.