Five years after Brexit: a look at the situation in the United Kingdom
Legal Watch No. 87 – September 2025.
Since the United Kingdom left the European Union, British data protection law has been slowly but surely moving away from the European regulatory framework.
While there is a trend on both sides of the Channel towards a simplification of standards, the United Kingdom is ahead of the European Union.
Following Brexit, the Data Protection Act 2018 (DPA) was amended to remain complementary to the "UK GDPR", the British version of the GDPR which came into force on January 1, 2021.
The United Kingdom has also adopted the Privacy and Electronic Communications Regulations (PECR), which implement the European ePrivacy Directive.
The Data (Use and Access) Act (DUAA), which received royal assent on June 19, 2025, is a major reform of the UK's personal data regime.
It amends and supplements the UK GDPR, the DPA and the PECR, and marks a strategic turning point by which the UK moves away from a fundamental-rights-centered vision towards a more pragmatic, pro-innovation approach.
Its main objectives are to:
- Propose a broader legal basis of "recognized legitimate interests" for certain processing activities, including archiving in the public interest, public safety, or taxation;
- Give companies the option to suspend the response time for access requests;
- Simplify the rules on cookies;
- Facilitate international data transfers;
- Encourage innovation in digital services and AI by relaxing the framework for automated decisions;
- Impose "privacy by design" in digital services targeting children;
- Enable the implementation of a digital identity;
- Simplify the rules concerning police and intelligence services.
However, some point out the risks of these reforms, and in particular the weakening of individual rights.
In the area of automated decision-making, they point, for example, to the fiasco of the 2020 A-level exam results, in which an algorithm awarded wildly inaccurate grades, and to the errors in the Department for Work and Pensions (DWP) fraud detection tools, which caused significant harm to those concerned.
The digital identity project, announced by the government in early October, has provoked a storm of protest, with millions of British citizens signing a petition demanding the project's withdrawal.
For some years now, the British government has also been pushing for mechanisms allowing access to the encrypted content of communications; the public debate focuses on controversial technical approaches such as client-side scanning.
It should be noted that the European Union is currently discussing a related topic: the CSAR proposal aimed at preventing and combating sexual abuse committed against children is again on the agenda of the European Council on October 14.
This proposal, which provides for the possibility of scanning communications on the user's terminal before they are sent, is considered by many scientists and civil society to be both ineffective and particularly dangerous for fundamental rights and encryption itself.
The European Union is also currently moving to simplify the regulatory framework on data through its "digital omnibus package".
The Commission has launched a call for contributions, which closes on October 14, covering areas of data legislation, including rules on cookies and other tracking technologies, the reporting of cybersecurity incidents, and certain aspects of the law on artificial intelligence.
With regard to the GDPR more specifically, the issues at stake include, for example, limiting the obligation to maintain a register to organizations with more than 500 employees, compared to the current limit of 250. The European Union is not currently challenging the core principles of the regulation.
On both sides of the Channel, the announced benefits for industry relate to the reduction of compliance costs.
However, the costs of implementing the reform itself, the costs of errors or disputes, and the costs to consumer confidence are less well understood. This argument was also raised during the "implementation dialogue" organized by the European Commission in mid-July. The private sector indicated that it had "invested in compliance and that a general reopening could create uncertainty, particularly in the context of international data transfers."
Today, the benefit/risk ratio of the British reforms and their impact on international partnerships, and, more decisively, on the maintenance of the UK's EU adequacy decision, remain uncertain.
The European Commission's draft adequacy decision is, however, in favour of recognising the adequacy of the United Kingdom's level of protection.
It must still be submitted to the European Data Protection Board for its opinion and discussed in the Council.
Let us hope that the final decision will be particularly transparent about the criteria taken into account, given the substantial changes to British law.
This appears essential to ensure sufficient legal certainty in the future for both European companies and those established outside the EU.
France
On September 18, 2025, the CNIL fined the company Samaritaine SAS 100,000 euros for concealing cameras in the store's storage areas.
These cameras were disguised as smoke detectors and were capable of recording sound.
The CNIL recalled that an employer may install hidden cameras only in exceptional circumstances, and provided that a fair balance is struck between the objective pursued (the protection of property and persons) and the protection of employees' privacy.
Such a system could, for example, be authorized provided that it is temporary and deployed after documented analysis of its compatibility with the GDPR and in view of exceptional circumstances.
In this case, the company did report thefts committed in its storage areas and explained that the system was temporary, but it neither carried out a prior GDPR compliance analysis nor documented the temporary nature of the installation.
The CNIL has also published guidance on the management of inactive accounts for audiovisual and video game professionals, on video devices in schools, and on the geolocation of children.
The National Assembly adopted the cybersecurity resilience bill in a special committee on September 9.
Philippe Latombe, chairman of the committee, had an amendment adopted enshrining end-to-end encryption in article 16 bis: "Encryption service providers, including qualified trust service providers, may not be required to integrate technical devices intended to intentionally weaken the security of information systems and electronic communications, such as master decryption keys or any other mechanism allowing non-consensual access to protected data."
This text, which transposes the European NIS2, DORA and CER texts into French law, should significantly raise the general level of cybersecurity.

European institutions and bodies
On September 10, Ursula von der Leyen delivered a State of the Union address in which she reaffirmed that Europe would remain sovereign in defining its own rules and standards, thus rejecting transatlantic criticism.
On the same day, 39 European industry leaders and associations signed the European Declaration on AI and Technology, committing to invest in Europe's technological sovereignty.
Ms. von der Leyen reiterated the key priorities of the EU's AI strategy, including the future regulation on cloud and AI development, the "Quantum Sandbox" initiative and significant investments in European AI "gigafactories".
The future regulation on cloud and AI development could include data sovereignty and localization measures aligned with the Union's future data strategy.
The Commission President also highlighted the objectives of simplifying European regulation, emphasizing recent reform proposals that could impact data protection, such as reducing data register obligations and streamlining incident reporting requirements in digital legislation.
Finally, she addressed the issue of children's online safety, announcing that the Commission would seek expert advice by the end of the year, possibly drawing inspiration from the Australian approach to social media restrictions.
The Data Act (Data Regulation) became applicable on September 12, 2025.
This text gives users of connected products (businesses or individuals who own, rent or lease such a product) greater control over the data they generate, while maintaining incentives for those who invest in data technologies.
It also defines the general conditions applicable to situations in which a company has a legal obligation to share data with another company.
The European Data Protection Board has published guidelines on the interplay between the Digital Services Act and the GDPR, which are open for public consultation until the end of October.
The Court of Justice of the EU has ordered the European Commission to pay €50,000 in compensation to a researcher implicated in a press release from the European Anti-Fraud Office (OLAF).
Although the press release mentioned no name, contextual clues allowed the researcher to be identified. The case, OC v Commission (C-479/22 P), had been cited in the recent EDPS v SRB case, which is now generating strong reactions regarding the scope of the identifiability of personal data.
News from the member countries of the European Union
In Germany, the Federal Labour Court ruled that a company had illegally transferred the personal data of its employee to its parent company in order to test human resources management software.
This transfer could not be justified by a legitimate interest, as fictitious data would have sufficed. The court awarded the employee €200 in damages for emotional distress.
In Austria, a court ruled that a data subject's right of access ends at death and does not pass to a legal successor. It thus overturned a decision by the data protection authority finding a violation of the right of access, the data subject having died during the appeal proceedings.
The Austrian consumer rights organization VSV has launched a class action lawsuit against Meta in Austria and Germany, seeking up to €5,000 in damages for individuals over 18. The lawsuit concerns Meta's professional tools, which VSV alleges are used for "illegal surveillance" of users' private lives.
The Belgian data protection authority has fined the owner of a student house €9,700 for illegally operating a video surveillance system that was necessary neither for the protection of the property nor for monitoring compliance with the house rules, and that could not be based on a legitimate interest or on the performance of the rental contract.
The Spanish Data Protection Agency (AEPD) has imposed a fine of €1,800,000 on a company that processed the personal data of self-employed workers collected by a public authority without a legal basis.
It considered that, although the Spanish Tax Agency (AEAT) was legally entitled to communicate certain data to the Chamber of Commerce, the subsequent transfer to the company Camerdata and the use of this data for commercial and marketing purposes had no valid legal basis.
The legitimate interest invoked by the company did not outweigh the risks to the rights and freedoms of the workers whose data had been disclosed.
In Estonia, the pharmaceutical company Allium UPI was fined €3,000,000 for failing to implement adequate technical and organizational measures, such as multi-factor authentication, which resulted in a data breach affecting 750,000 people, including children and vulnerable groups.
In Finland, the data protection authority fined a bank (S-Pankki Oyj) €1,800,000 for failing to implement sufficient technical and organizational measures for a new login feature in its application, which allowed customers unauthorized access to other customers' accounts.
The Italian data protection authority (Garante) ruled that a company's customer has the right to withdraw her consent to the use of her image in the company's advertising. It clarified that consent may be withdrawn regardless of any negative economic consequences for the data controller.
With a Senate vote on September 17, Italy became the first European country to adopt a law on AI.
After defining the framework of general principles, the law pays particular attention to crucial sectors such as labor, health and justice.
It also regulates the use of AI by minors and includes criminal provisions.
The Dutch Data Protection Authority (DPA) is conducting an investigation, in collaboration with the Italian, Luxembourg and Hungarian DPAs, into how connected TVs process personal data.
Its report notes in particular that connected TVs send a significant amount of data during installation, daily use, standby mode and even when switched off, and that they operate within an opaque Internet ecosystem involving many parties: manufacturers, operating system providers, application developers, etc.
It points out that users often have no choice but to accept privacy policies and have difficulty identifying those responsible for data processing.
Pre-installed applications (sometimes impossible to remove) raise questions regarding data minimization and user rights.
Following the launch of Apertus, the Swiss open-source chatbot, another major open-source language model backed by a public institution in Europe has emerged: TildeOpen LLM.
This is a foundational language model designed to compensate for the weaknesses of existing LLMs with regard to Nordic and Eastern European languages, which are currently underrepresented.
This 30 billion parameter model was developed with funding from the European Commission and trained on the LUMI supercomputer.
The developers of both models state that they comply with the European AI regulation.
The 47th Global Privacy Assembly, held last September in Seoul, reaffirmed that data privacy and security are among the crucial issues facing the world today.
The annual event brings together regulators as well as companies and organizations from around the world.
Twenty data protection authorities adopted a joint declaration on this occasion to build a governance framework for trustworthy AI.
They advocate integrating data protection principles from the design stage, establishing robust data governance, and anticipating risk management.
The statement also highlights the increasing complexity of data processing in this context and underlines the diversity of actors involved, as well as the need for a regulatory framework adapted to technological progress.
In the United States, the government "shutdown" has significant consequences for data protection.
Some federal agencies are almost entirely shut down, including the Federal Trade Commission (FTC), the primary body responsible for enforcing data privacy.
Within the FTC, the Bureau of Consumer Protection is the most affected by the furloughs.
Except in cases where the harm is deemed extremely serious, consumer protection matters will be suspended, whether they involve preliminary investigations, administrative proceedings, or lawsuits in federal courts.
US users of ChatGPT Plus, Pro and Free can now buy directly from the Etsy sales platform, and many other sellers will soon be accessible as well.
For ChatGPT, as for other agentic AI systems, this implies that users give third parties access to their personal information, in particular banking information, with all the data-vulnerability issues this entails.