Opinion 08/2024 of the EDPB of April 17, 2024 on valid consent within the framework of the “Consent or Pay” models implemented by large online platforms
The opinion mainly addresses the widespread “Consent or Pay” model, whereby large online platforms attracting a significant number of users offer two options: either consent to the use of one's personal data for advertising purposes, or pay a fee to avoid that use. The European Data Protection Board (EDPB), acting under Article 64(2) GDPR, considers that in many cases this model does not satisfy the criteria for valid consent. According to the EDPB, forcing users to choose between consenting to the processing of their data and paying a fee is problematic, as data protection should never be reduced to a financial transaction. The EDPB recognizes that companies are free to set their prices and structure their revenue models, but this flexibility must be balanced against individuals' fundamental right to the protection of their personal data. Data protection authorities, responsible for ensuring compliance with the GDPR, must verify that the requirements for valid consent are met. They also have the power to assess the extent to which a fee may limit users' freedom of choice, and although companies may determine the amount of that fee, supervisory authorities can impose corrective measures.
EDPB Data Protection Guide This guide for small businesses is now available in 18 languages, including French, English and German. The document helps small business owners in their efforts to comply with data protection rules. It aims to raise awareness of the GDPR and to provide SMEs with practical information on GDPR compliance in an accessible and easily understandable format.
CJEU decision in joined cases C-17/22 and C-18/22 dated September 12, 2024 The decision highlights tighter restrictions on the sharing of personal data, particularly those involving contact details. The Court examined the conditions under which data processing can be considered lawful, even without explicit consent.
The Court addressed several key points:
- Processing necessary for the performance of a contract: to be justified, the processing must be “objectively indispensable” to the performance of the contract; processing that is merely useful or convenient is not enough. The Court found that the transmission of business partners' contact details did not meet this requirement, particularly where the contract prohibits such disclosure.
- The legitimate interests of the controller or a third party: even where a legitimate interest of the controller or a third party can be recognized, the Court ruled that the fundamental rights of the data subjects take precedence over those interests.
- Compliance with a legal obligation: the Court recalled that any obligation to process personal data must be established by clear, precise and foreseeable national law.
This decision underlines that the consent of individuals remains central to justifying data processing under the GDPR. Although alternative legal bases such as legitimate interest exist, in the absence of consent they must be interpreted strictly and applied with caution by companies.
Guidelines 01/2023 on Article 37 of the Law Enforcement Directive adopted on June 19, 2024
These guidelines set out the legal standards that competent authorities must respect when transferring personal data to countries outside the EU or to international organizations. They describe in detail the steps necessary to assess whether the safeguards in these third countries are adequate. This includes an in-depth analysis of the associated risks, taking into account the specificities of each situation and the potential impact on the rights and freedoms of the persons concerned.
Emphasis is placed on the importance of putting in place legally binding safeguards to protect personal data. For transfers to third countries or international organizations, these safeguards must ensure a level of protection essentially equivalent to that in force within the European Union.
Data controllers are required to maintain heightened transparency and accountability, in particular by informing and cooperating with data protection authorities on the transfers made. They must also carry out regular reviews of the protection measures in place, ensure that transfers are secure and comply with legal requirements, and guarantee appropriate safeguards throughout the process.
Investigation into Google's artificial intelligence, September 12, 2024 The Irish Data Protection Commission (DPC) has launched an investigation into Google's Pathways Language Model 2 (PaLM 2), an AI model used for natural language processing. The aim is to check whether Google conducted a data protection impact assessment (DPIA), as required by the European Union's General Data Protection Regulation (GDPR). In particular, the investigation seeks to determine whether Google properly assessed the risks of collecting and using personal data from residents of the European Economic Area (EEA) to train its AI models, and whether this could compromise the fundamental rights and freedoms of individuals. Article 35 of the GDPR requires an impact assessment to be carried out where data processing, especially processing based on new technologies, presents a high risk to the rights and freedoms of the data subjects, taking into account the nature, scope, context and purposes of the processing. The DPIA is an essential tool for ensuring GDPR compliance, enabling data controllers to identify and mitigate the risks associated with high-risk processing. If the DPC concludes that Google has violated the GDPR, the company could face financial penalties of up to 4% of its global annual turnover. The investigation is part of a broader trend of increased scrutiny of how big tech companies use personal data to develop sophisticated AI systems.
LLM chatbots The Hamburg Commissioner for Data Protection and Freedom of Information, in a document published on July 15, affirmed that LLMs (large language models) do not store personal data. Therefore, it is the companies deploying these LLM chatbots, and not the providers of the LLMs, who should be responsible for handling data subjects' requests, in particular with regard to the data captured and generated by these chatbots.
It is first worth recalling that LLM chatbots are sophisticated AI systems designed to understand and produce language in a way that mimics human communication. Based on the principle of fairness enshrined in Article 5(1)(a) GDPR, as highlighted by the EDPB ChatGPT Taskforce, it is essential that controllers do not transfer the risks of using these technologies onto data subjects. In other words, companies must take responsibility for the risks associated with operating these systems. LLM chatbot providers must therefore recognize that users will, sooner or later, insert personal data into their interactions with chatbots. It would not be acceptable for these providers to attempt to shift this responsibility onto users by claiming that users are responsible for the data they enter. Consequently, companies cannot simply deploy new technologies without fully assuming the consequences and responsibilities that result from them.
CJEU decision of October 4, 2024 On October 4, 2024, the Court of Justice of the European Union delivered a very important judgment in the Lindenapotheke case, C-21/23, in which it found that the information entered by customers when ordering medicines online (whether or not subject to medical prescription), such as their name or address, constitutes data concerning health within the meaning of the GDPR.
Opinion 22/2024 of the EDPB of October 7, 2024 The EDPB has published two important documents for professionals, including an opinion on the obligations relating to processors and sub-processors. This document covers the checks to be carried out by the data controller, the necessary documentation, and the allocation of responsibilities between controller and processor. The controller must always have access to the list of processors and monitor transfers of data outside the EEA. Sub-processors must provide relevant information proactively, and contracts must specify the cases in which they may process data beyond the instructions received, particularly in the event of a legal obligation.