Simplification of European digital rules: what can we expect?
Legal Watch No. 89 – November 2025.
How far will the European Commission go in its desire to simplify the European legislative framework?
Since the publication of the Draghi report in September 2024, and against a turbulent international economic backdrop, the European executive has been rolling out a series of measures in favor of industry.
The recent publication of the "Digital Omnibus" package is intended to reassure the private sector about legal constraints in the digital sector, much to the dismay of civil society, which fears an unprecedented erosion of fundamental rights.
The Commission officially published two proposals on November 19, the "Digital Omnibus" and the "Digital Omnibus on AI", and launched a "Digital Fitness Check" of the EU's digital rulebook.
The proposal concerning digital technology introduces amendments to both the GDPR and the AI regulation, as well as the ePrivacy Directive. The text is dense; here are some key points.
- Regarding the ePrivacy Directive:
The Commission is proposing, in particular, changes to the rules on cookies. The aim is to reduce how often consent banners appear and to allow users to give their consent with a single click and to save their preferences via the central settings of their browsers and operating systems. This provision is among the least contested.
The text also provides for a unified reporting interface enabling companies to meet all their incident notification obligations via a single secure portal, with the aim of streamlining the overlapping obligations of the NIS2 Directive, the GDPR and the Digital Operational Resilience Act (DORA).
A relaxation of legal obligations regarding documentation, sanctions and cloud-switching rules is planned for small and medium-sized enterprises (SMEs) and small mid-cap companies.
- Regarding the GDPR:
The proposal restricts the definition of personal data by introducing a subjective approach to data identifiability, which could exclude certain pseudonymous data or identification numbers from the scope of the regulation.
The definition of sensitive data would be narrowed to cover only data that "directly" reveal information concerning health, racial or ethnic origin, etc., at the risk of excluding data from which such information could be inferred, for example by an algorithm.
The proposal also seeks to restrict individuals' right of access to their data to requests made for "data protection purposes".
A new definition of "scientific research" could also lead to wide-ranging exemptions from the GDPR benefiting the private sector: "any research likely to support innovation, such as technological development and demonstration" could thus be exempt from the obligations of information and purpose limitation.
The proposal includes changes facilitating automated decision-making, and allowing the use of personal data to train and operate AI systems based on legitimate interest.
The text also provides for a six-month transitional period for Article 50(2) of the AI regulation (transparency obligations).
The Digital Omnibus package remains open for comments for a period of eight weeks, extended day by day until the proposal is available in all EU languages; the deadline currently stands at 29 January 2026.
The proposal will then follow the ordinary EU legislative procedures in the European Parliament and the Council, where it will certainly be the subject of lively debate.
With regard to the GDPR more specifically, some countries are open to a review of the Commission's proposals, while others, including Slovenia, Estonia and Austria, have already indicated that in their view, the GDPR "does not require any further changes at this time".
The process will also include the views of the European Data Protection Board and the European Data Protection Supervisor, two positions which, although non-binding, are expected to influence further discussions.
After a first report assessing whether French people were willing to pay for online services without targeted advertising, the CNIL published on November 17 the second part of its study, on French people's attitudes towards the monetization of their data.
65% of respondents said they were willing to sell their data. The most common valuation, chosen by 28% of these respondents, was between 10 and 30 euros per month.
On the other hand, 35% of individuals do not wish to sell their data, whatever the price, expressing a principled rejection of the monetization of personal data.
In this regard, the CNIL points out that, while it is possible to transfer a right of use over one's data, "a practice of 'monetizing' one's personal data, in the sense of transferring ownership rights over it, is not possible within the current legal framework since one cannot waive one's rights over one's data."
On November 20, 2025, the CNIL fined the French company Les Publications Condé Nast 750,000 euros for failing to comply with the applicable rules on cookies placed on the devices of users visiting the website "vanityfair.fr".
The company had already been issued a formal notice following a 2019 complaint from the noyb association, but failed to comply.
The fine takes into account the company's lack of response, the number of people involved, and the various shortcomings: failure to obtain consent, failure to inform users, and failure of the mechanisms for refusing and withdrawing consent.
On November 27, 2025, the CNIL also imposed a fine of 1.5 million euros on the company American Express Carte France for non-compliance with applicable rules regarding cookies.
On November 26, ANSSI published a report on the threat landscape for mobile phones.
The report highlights the exploitation by attackers of vulnerabilities that can target networks, the operating system or applications, and identifies a specific threat: espionage and surveillance operations carried out by state actors.
Mobile phones are also a prime target for cybercriminals, who, by compromising them, manage to steal money from their victims.
Mobile phones are also being misused for private surveillance or destabilization operations.
The report also provides security recommendations for users.
These include:
- regularly turning the phone off and on again rather than using the restart function;
- not clicking on links or opening files in unsolicited messages;
- being vigilant when opening links transmitted via QR codes;
- applying operating system updates;
- disabling Wi-Fi and Bluetooth interfaces when not in use;
- avoiding connecting to public Wi-Fi networks.
European institutions and bodies
On November 26, representatives of EU member states agreed on the Council's position regarding the regulation aimed at preventing and combating sexual abuse of children.
The text aims to impose on digital-sector companies an obligation to prevent the dissemination of child sexual abuse material and the solicitation of children.
The competent national authorities will have the power to compel companies to remove and block access to certain content or, in the case of search engines, to remove certain search results.
The regulation also provides for the creation of a new European agency, the EU Centre on Child Sexual Abuse, tasked with helping Member States and online service providers implement the law.
The Council also wishes to make permanent a currently temporary measure that allows companies to voluntarily scan their services for child sexual abuse.
Although less intrusive than its previous version, the text is still considered by its detractors as an attack on end-to-end encryption and the confidentiality of communications.
The project still needs to be discussed in trilogues with the Commission and the European Parliament.
On December 5, the European Commission fined X 120 million euros for failing to meet its transparency obligations under the Digital Services Act (DSA).
The shortcomings include the deceptive design of its blue checkmark for verified accounts, the lack of transparency of its advertising repository, and the failure to give researchers access to public data.
In its Russmedia decision of December 2, the Court of Justice of the European Union (CJEU) held that the publisher of an online marketplace is a controller for the processing of advertisements published on its platform and must check, before publication, whether they contain sensitive data and whether their processing complies with the GDPR.
1/ The operator of a marketplace is a controller for the processing of the published advertisements; for the CJEU, several elements justify this classification:
- The advertisement is only accessible online because of the service provided by the platform.
- The publisher pursues its own purpose, notably commercial and advertising, and is not limited to a technical service.
- It determines essential means: presentation, duration of online posting, sections, classification, methods of distribution.
- The operator and the advertiser are therefore joint controllers of the processing carried out on the advertisements.
2/ As the data controller, the operator must identify the potential risks of data processing and implement measures and safeguards appropriate to the identified risk: the CJEU indicates that the operator must, upstream:
- Detect if an advertisement contains sensitive data.
- Check whether this data relates to the advertiser or whether the advertiser has an exemption, in particular an explicit consent from the person concerned.
- Refuse publication if these conditions are not met.
As part of its security obligations, the operator must also implement measures to limit the illegal copying and reproduction of advertisements containing sensitive data.
On November 20, the CJEU ruled in the Policejní prezidium case on the Czech police practices of collecting and indefinitely storing the biometric and genetic data of all persons suspected of having committed intentional offences.
The aim was to determine whether the European "Police" directive requires a case-by-case assessment of the need for retention, whether indefinite retention periods are permitted, and what legal safeguards should govern the processing of this sensitive data.
The Court does not find a prohibition in principle but imposes a series of conditions: the data controller must comply with all the principles and specific requirements applicable to the processing of sensitive data, and national legislation must set appropriate time limits for a periodic review of the strict necessity of retaining this data.
The European Union Agency for Fundamental Rights published a report on December 4th addressing the protection of fundamental rights in the use of AI in high-risk areas. It highlights a lack of awareness regarding these rights.
News from the member countries of the European Union
The Vienna Regional Court for Civil Matters held that legal action against a data controller not established in the EU must be served on the controller and not on its representative in the EU.
A notification to the representative under Article 27 of the GDPR is insufficient, unless national procedural law provides for this option.
On November 14, the Croatian data protection authority imposed an administrative fine of EUR 4.5 million on a telecommunications operator, in its capacity as controller, for transferring personal data to a third country in violation of the GDPR.
The transfer to a Serbian processor was carried out without a valid legal basis and without transparent information to the data subjects; the authority also found that copies of employees' identity cards and criminal-record certificates had been processed without a legal basis and without appropriate prior checks on the processor.
The International University of Valencia in Spain has been fined €750,000 for using facial recognition and AI to identify exam participants without a proper legal basis.
Also in Spain, the data protection authority fined a medical clinic €30,000 for breaching Article 5(1)(f) of the GDPR by disclosing the phone numbers and health data of around 90 clients in a WhatsApp group without prior consent and without adequate confidentiality measures.
A court has awarded over €480 million in damages to 87 Spanish media outlets after concluding that Meta had illegally used personal data collected on its social networks to create detailed user profiles and offer more effective personalized advertising than its competitors, thereby gaining an unfair competitive advantage.
The court considered that the processing of personal data is a key competitive factor in the digital economy and that violations of the GDPR can constitute unfair competition when they confer a significant advantage.
The Italian data protection authority has fined the province of Bolzano €32,000 for illegally operating a network of traffic-monitoring cameras. The province lacked a valid legal basis for processing personal data, particularly license plate numbers.
International news
In the United Kingdom, 73 academics, lawyers, data protection experts and non-governmental organizations have called in a letter to the House of Commons for an inquiry into the British data protection authority (ICO), after what they describe as a "collapse of enforcement measures" following in particular the Afghan data breach scandal.
They warn of "deeper structural failures" beyond this data breach.
The scandal involved a particularly serious leak of information concerning Afghans who had collaborated with British forces before the Taliban took control of the country in August 2021, endangering the lives of the 100,000 people whose names were disclosed by the Ministry of Defence.
The ICO is criticized for not having initiated formal legal proceedings against the ministry, despite repeated shortcomings.
In Argentina, some courts have ruled in recent months on the use of AI in lawyers' pleadings and briefs.
The cases involved lawyers who included case law quotes that turned out to be false or inaccurate due to AI hallucinations.
As in the United States, Argentine courts are beginning to assess the extent of lawyers' professional responsibility: even when lawyers act in good faith, submitting briefs that cite non-existent case law undermines the fundamental principles of the profession, including honesty, loyalty and sincerity, as set out in the codes of ethics of the various Argentine jurisdictions.
In the specific cases examined, the courts decided not to impose direct sanctions on the lawyers.
Nevertheless, they deemed it appropriate to inform local bar associations to raise awareness of the risks and responsibilities associated with the use of AI and to promote a wider debate on the responsible use of AI in legal practice.
The Australian government is the latest to unveil a roadmap for AI.
After considering a security-focused strategy, the government ultimately chose to emphasize investment and the economy.
Rather than establishing mandatory safeguards in high-risk environments, Australia will build "on existing robust legal and regulatory frameworks."
The National Plan announced on December 2 aims to strengthen Australia's position as a destination for AI investment and sets targets for widespread AI adoption across the country, particularly in public services.
It also describes the role of the AI Safety Institute in testing and sharing information on AI capabilities, risks and dangers.
Also in Australia, from December 10, children and teenagers under the age of 16 will no longer be allowed to use social media.
Platforms are required to implement age verification measures.
Failure to comply could result in fines of up to 28 million euros.
This ban could be followed by others: on November 26, the European Parliament called for setting the minimum age at 16 across the European Union for social networks, video sharing platforms and AI assistants, while allowing access for 13-16 year olds with parental consent.
While expressing their support for the Commission regarding a European age verification app and a European digital identity wallet (eID), MEPs insist that age verification systems must protect the privacy of minors.
In the United States, only two out of five commissioners remain on the Federal Trade Commission (FTC).
A third resignation, that of Melissa Holyoak, was announced by the FTC on November 17, 2025.
This resignation follows President Trump's dismissal of two Democratic commissioners, leaving only two Republican commissioners in place.
The dismissals are now before the Supreme Court, whose decision could have broader implications for the president's power over independent agencies.
Travelers from countries such as Britain, France, or South Korea, countries eligible for the U.S. Visa Waiver Program, may soon have to submit to a review of their social media activity going back up to five years, according to a proposal filed on December 9 by U.S. Customs and Border Protection (CBP).
CBP also plans to ask applicants for a long list of personal data including their email addresses for the past ten years, as well as the names, dates of birth, places of residence and places of birth of their parents, spouses, brothers, sisters and children.
On November 13, 2025, the Indian Ministry of Electronics and Information Technology promulgated the implementing rules of the 2023 Digital Personal Data Protection Act.
According to the firm Nishith Desai Associates, they specify the requirements for transparency, consent and registration, the obligations to notify in the event of a data breach, the rights of data subjects, and details concerning the Data Protection Board of India.


