The processor: between weak link and strategic player.
Legal Watch No. 92 – February 2026.
On February 9, the CNIL presented its 2025 report on sanctions and corrective measures: cookies, employee monitoring and data security were the main grounds for the sanctions imposed in 2025, whose fines total 486,839,500 euros.
Among these sanctions, a significant number of cases involve processors that failed to comply with their obligations regarding the data entrusted to them.
The CNIL reminds us that the processor must:
- Process data only on the instructions of the data controller;
- Implement appropriate technical and organizational measures to ensure an adequate level of security;
- And delete the data at the end of the contractual relationship with the data controller.
The processor must also assist the controller in meeting certain obligations under the regulation: data protection impact assessments, data breach notification, security, and contribution to audits.
It must also keep a record of processing activities, and in certain cases appoint a data protection officer.
Article 28 of the GDPR, which requires a specific contract between controller and processor, specifies the mandatory provisions that contract must include.
This may be based, in whole or in part, on standard contractual clauses such as those approved by the European Commission or one of the EU Member States.
It is advisable to clearly define the respective responsibilities of the controller and the processor in these clauses. These details may prove useful in the event of a default by one of the parties.
It is worth remembering that many recent data breaches, affecting both the public and private sectors, were caused by a security failure on the part of a processor.
Often, the cause is human error, and more specifically, the theft of an employee's credentials.
Two-factor authentication and the use of complex passwords remain essential protections against such attacks, as we noted in our commentary on the CNIL's recent "France Travail" sanction.
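For readers curious about what the most common form of two-factor authentication actually computes, here is a minimal, illustrative sketch (not taken from any of the decisions discussed here) of a time-based one-time password per RFC 6238, built on the HOTP algorithm of RFC 4226, using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the 8-byte big-endian counter (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # "dynamic truncation": low nibble picks the offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    # A TOTP code is simply HOTP applied to the current 30-second time window
    return hotp(secret, int(time.time()) // step, digits)
```

Because the shared secret never travels over the network and each code expires within seconds, a stolen password alone is no longer enough to log in, which is precisely the failure mode behind several of the breaches discussed in this issue.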
Processors do indeed have specific obligations regarding security, confidentiality and documentation.
In order to guarantee effective data protection, adapted to the risks, they must put in place all useful measures from the design of the service or product and by default ("privacy by design & by default").
Furthermore, any data breach must be notified to the data controller within the time limits stipulated by law.
Finally, deleting data at the end of the contract is not only an obligation; it is also the best protection against data breaches once the contractual relationship has ended, as the one-million-euro fine imposed at the end of December by the CNIL on Mobius Solutions reminds us.
The company had retained a copy of the data of more than 46 million Deezer users after the end of their contractual relationship, despite its obligation to delete all of this data at the end of the contract, making a major data breach possible.
The CNIL published an updated version of its data protection tables on March 2nd.
These tables gather and organize the essential case law and decision-making practice regarding the protection of personal data at the national and European level.
At the end of February, a France 2 report warned of a breach of medical data, and more specifically of the hacking of "MonLogicielMedical", software published by Cegedim.
The attack allowed a hacker to collect the medical data of 11 to 15 million French people, accessible on the darknet.
The CNIL had already sanctioned the company in September 2024 with a fine of 800,000 euros, notably for processing health data without authorization.
On February 18, 2026, the Directorate General of Public Finances (DGFiP) reported in a press release that illegitimate access to the national bank account file (FICOBA) had been detected, following the theft of an authorized agent's credentials.
The hacker had obtained the account credentials prior to the cyberattack: the Ministry of Economy had not implemented two-factor authentication.
The data in question relates to approximately 1.2 million accounts and includes, in particular, the identity of the account holders, their address, their bank and their IBAN.
01net reports on the assessment carried out by the company Surfshark regarding data leaks in France.
"In 2025, 425.7 million accounts were compromised worldwide. The most affected country is none other than the United States, which accounts for 34 million of the breached accounts. France comes second, followed by India, Germany and Russia. In 2025, French internet users experienced, on average, nearly one account compromise per second."
France also stands out with "a density of violations 12 times higher than the world average". The lack of investment in security, whether by companies or public entities, is being singled out.
Data brokers, who scour the web to resell the personal data thus collected, are also a prime target for hackers.
European institutions and bodies
The European Commission's "Omnibus" proposal to simplify the GDPR suffers a setback at the EU Council.
In a document dated February 20, the Council intends to remove the proposal to amend the definition of personal data.
Furthermore, according to Euractiv, the Council also wants to remove the Commission's proposal to extend its own powers, which would allow it to determine what constitutes sufficiently pseudonymized data.
The compromise text, prepared by the Cypriot Presidency of the Council, will serve as the basis for negotiations between the national governments.
The Council is expected to examine the text on February 27.
On February 11, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) had already adopted a long-awaited joint opinion on the data protection aspects of this Omnibus proposal.
In their press release, the regulators insist that simplification must not come at the expense of protecting people.
They urge the co-legislators not to adopt the proposed amendments to the definition of personal data, as they would significantly reduce its scope and would be in contradiction with the case law of the CJEU.
“We must ensure that any amendments to the GDPR and the EUDPR truly clarify obligations and provide legal certainty while preserving trust and a high level of protection for individual rights and freedoms.”
The joint opinion also underlines the need for clear control by data protection authorities regarding cookie banners and the reuse of public sector data.
The EDPB has published a report on its coordinated action regarding the application of the right to be forgotten.
This right is one of the most frequently exercised.
It has given rise to numerous complaints and an increasing number of decisions by data protection authorities.
The report highlights seven major challenges for organizations, including the lack of internal procedures. It also provides practical recommendations to help organizations address these challenges.
The Committee also published a report on the event held on December 12, 2025 concerning anonymization and pseudonymization.
The report summarizes feedback from around one hundred participants gathered to discuss the application of the GDPR to these techniques following the CJEU ruling in the EDPS v SRB case.
It highlights the practical uncertainties in determining when data remains "identifiable" depending on the actors and contexts, and the question of the "means reasonably likely" to be used to re-identify a person.
The document does not provide definitive guidelines, but identifies the key issues that will guide the EDPB's future directions on anonymization and pseudonymization in its work program.
The EDPB has adopted its work program for 2026-2027.
It will work on several templates aimed at facilitating GDPR compliance: data breach notifications, data protection impact assessments, legitimate interest assessments, records of processing and privacy notices/policies.
The Court of Justice ruled on February 10, in Case C-97/23 P, WhatsApp Ireland v. European Data Protection Board, that the action brought by WhatsApp against a decision of the EDPB was admissible.
As the General Court has not yet examined the merits of the case, the Court of Justice annuls the contested order and refers the case back to the General Court.
The European Court of Human Rights ruled on February 17, in Green Alliance v. Bulgaria (no. 6580/22), that Bulgarian regulations authorizing the national security agency to use "undercover agents" violated Article 8 of the European Convention on Human Rights, as they allowed covert surveillance of organizations without sufficient safeguards or controls.
News from the member countries of the European Union.
In Germany, according to Heise.de and Table.Media, the Bundestag is working intensively on a thorough restructuring of its digital architecture. Parliament wants to free itself from the technological grip of American companies like Microsoft in order to act more resiliently and, above all, independently of third countries in times of crisis.
The Austrian Data Protection Authority (DPA) has fined a data controller €1,500 for illegally filming a public sidewalk with a CCTV camera and publishing the images of a suspected thief on social media, thereby violating the principles of data minimization and transparency, as well as the rules governing the processing of judicial data.
The Danish Data Protection Authority (DPA) has issued a reprimand to 51 municipalities and simultaneously warned them about their use of Google products in primary and lower secondary schools. Specifically, it found that the municipalities had not adequately demonstrated that they ensured an appropriate level of protection for personal data processed outside the EU.
The Irish Data Protection Agency (DPA) announced on February 17 that it had opened an investigation into Elon Musk's company, X. The investigation concerns the creation and publication, on the X platform, of "potentially harmful, intimate and/or sexually explicit images, without consent, containing or involving the processing of personal data of EU/EEA data subjects, including children," using a generative AI feature associated with the Grok language model. Several countries have already decided to completely ban Elon Musk's AI chatbot, and the European Commission launched an investigation on January 26.
In the Netherlands, a court ruled that the Dutch Data Protection Authority (AP) had not sufficiently justified its dismissal of a complaint against a cinema that no longer accepted cash payments. According to the court, the AP had not assessed whether the requirement for card payments pursued a sufficiently concrete and justified objective under the GDPR.
In Slovakia, the Constitutional Court invalidated a law requiring NGOs to publish data relating to their contributors, finding that it infringed on privacy, informational self-determination and freedom of association, while imposing excessive burdens.
The UK's data protection authority (ICO) has fined Reddit £14.47 million (approximately €16.6 million) for failing to comply with its obligations regarding the privacy of children's data.
The ICO believes that Reddit's age verification is insufficient and that the platform "therefore did not have a legal basis for processing the personal information of children under 13."
Furthermore, Reddit "did not conduct a data protection impact assessment (DPIA) to assess and mitigate risks to children before January 2025."
The ICO also announced on February 5 a fine of £247,590 against MediaLab.AI, Inc., the company behind the image sharing platform Imgur.
MediaLab allowed children to use Imgur without implementing the basic safeguards required by UK data protection legislation.
India decided in February 2026 to extend the use of Aadhaar, the world's largest digital identity system, to everyday privacy through a new application and an offline verification system.
The changes should allow individuals to prove their identity without a real-time check against the central database, by integrating private verification services such as Google Wallet and Apple Wallet.
A civil society campaign argues that the offline verification system risks reintroducing private sector use of Aadhaar, which has already been condemned by the Supreme Court.
The system also allows "Indian states and the police to link all sorts of personal information to the Aadhaar number: GPS coordinates, telephone numbers, social networks, voter registration card, passport, loans, social benefits, sometimes even the names of relatives, or even the names of partners on certain police forms."
According to a CNBC report dated February 19, Accenture links promotions of its senior executives to the regular use of its AI tools.
Associate directors and senior managers were reportedly informed that "regular use" of AI would be required to access leadership positions.
The company's AI strategy "requires the adoption of the latest tools and technologies in order to serve our customers in the most efficient way possible," a spokesperson told CNBC.
On February 11, 2026, the California Attorney General announced a record $2.75 million settlement with the Walt Disney Company to resolve allegations that the company did not adequately respect consumers' objections to the processing of their data, under the California Consumer Privacy Act.
In a February 13 article, the New York Times reports that Meta, Facebook's parent company, plans to add facial recognition functionality to its smart glasses, which it manufactures in collaboration with the owner of the Ray-Ban and Oakley brands, as early as this year.
This feature, internally called "Name Tag," would allow smart glasses users to identify people and obtain information about them via Meta's artificial intelligence assistant. In an internal memo published last year, Meta reportedly stated that political unrest in the United States would divert attention from critics of the feature's release.
On February 23, 2026, a joint statement on AI-generated images was published by 61 data protection authorities.
This joint statement addresses concerns about AI systems capable of generating realistic images and videos depicting identifiable individuals without their knowledge or consent.
On February 19, 2026, the Organisation for Economic Co-operation and Development (OECD) published new guidelines to help organisations implement the OECD Guidelines for Multinational Enterprises and the OECD Principles on AI.
These Guidelines are intended to help organizations manage AI-related risks and comply with international standards for responsible business conduct.
The development of AI agents, while intended to facilitate many administrative tasks, also raises serious security questions.
OpenClaw, the latest, is an open source project that revolves around an autonomous AI agent capable of piloting a computer instead of the user.
Recently acquired by OpenAI, OpenClaw is now the target of serious security attacks.
More than 30,000 OpenClaw instances were compromised by cybercriminals in the space of a few weeks, with fake scripts enabling the installation of viruses and malware capable of recording everything typed on the keyboard on Windows computers.
“More and more startup leaders are banning autonomous agents on company devices, fearing leaks of confidential data. Security experts are raising serious concerns about the lack of auditing, perspective, real safeguards, and transparency surrounding AI, while more and more employees are already integrating it into their daily tasks in a quest for productivity.”
These warnings were relayed on February 12 by the Dutch Data Protection Authority (AP), which cautions users and organizations against using OpenClaw and similar experimental systems due to critical security vulnerabilities.
The International Association of Privacy Professionals (IAPP) published an updated version of its global directory of privacy laws and data protection authorities on February 5.
The IAPP notes that data protection regimes continue to develop and mature worldwide. “In the past 12 months, India has introduced new enforcement rules to implement the Digital Personal Data Protection Act, Bangladesh and The Gambia have adopted or are considering adopting new comprehensive laws, and other countries, including Ecuador and Indonesia, have created new agencies to interpret and enforce existing privacy laws.”


