European data sovereignty: wishful thinking?
Legal Watch No. 83 – May 2025.
Microsoft's blocking of the email account of the Prosecutor of the International Criminal Court (ICC), Karim Khan, raises the crucial question of the digital sovereignty of France, and of Europe more broadly, in the face of major tech players.
According to the AP, Mr. Khan's email account was blocked without warning, forcing him to use the services of the Swiss internet provider Proton.
Because the court is heavily dependent on service providers such as Microsoft, its work has been significantly slowed down.
This decision by Microsoft is a direct consequence of the measures taken by US President Donald Trump against the Hague Court in February, after a panel of ICC judges issued arrest warrants for Israeli Prime Minister Benjamin Netanyahu and his former Defense Minister Yoav Gallant for war crimes in the Gaza Strip.
The presidential executive order specifically prohibits the provision of funds, goods or services to the ICC Prosecutor, as well as any transaction that would circumvent these prohibitions, on the grounds that the actions of the ICC constitute an "unusual and extraordinary threat" to the national security and foreign policy of the United States.
A Dutch publication from May 31st notes in this context that, given the number of American servers used by the public sector, "the US government could block or manipulate more than 650 Dutch government and critical business websites at the touch of a button, including those of De Nederlandsche Bank and the police, but also the website (…) of the Ministry of Foreign Affairs. Just like the Crisis.nl website, a reference site for Dutch people in the event of a disaster."
Another example of the potential consequences of our technological dependence can be found in a recent case between OpenAI and the US justice system: the company is currently fighting against a decision ordering it to retain all ChatGPT user logs – including deleted and sensitive chats – recorded through its commercial API offering.
The decision originated with press organizations pursuing copyright claims, who accused OpenAI of destroying evidence.
In its response, the company argues that the court is relying solely on an allegation by the New York Times and other media plaintiffs, and that it is "without any valid reason preventing OpenAI from respecting its users' privacy decisions."
Without taking a position on the legitimate issue of respecting copyright, the case leads us to question the control that foreign powers, in this case the United States, can exercise over our daily working tools, once they alone decide which national-interest grounds justify such control.
This seems particularly worrying in a world where the rule of law is increasingly under attack and where political alliances fluctuate daily.
In this gloomy context, where Europe is often accused of lagging behind, there is encouraging news: in mid-May, the Secure Data Access Centre (CASD) became the first data hosting service in Europe to obtain official GDPR certification, issued under Article 42 of the GDPR on the basis of the Europrivacy standard.
This certification is officially recognised by the data protection authorities of 30 countries – all EU and EEA member states.
CASD already holds several certifications (ISO 27001, ISO 27701) and is also, notably, a certified Health Data Host (HDS).
We also know that Europe is currently investing in AI, both financially and with the aim of simplifying regulations (see news items below). The IMA (Innovation Makers Alliance) Manifesto, supported by players such as OVHcloud, Hexatrust, and Mistral AI, also presented 33 proposals in March targeting AI, cloud computing, cybersecurity, and public procurement, with a view to an urgent rebalancing. At the same time, Microsoft offered European governments a free cybersecurity program.
European initiatives will only reach their full potential if our reflexes evolve, not only in strategic choices regarding AI tools and hosting services but also in digital hygiene. Before transmitting or storing data, it is essential to process only strictly necessary data from the outset, and to implement, in both the public and private sectors, practices and charters limiting the risk of losing control over this data.
On May 15, 2025, the CNIL fined the company Solocal Marketing services 900,000 euros for soliciting prospects without their consent and transmitting their data to partners without a valid legal basis.
On the same day, it fined the company Caloga 80,000 euros for the same practices.
At the end of May, MP and CNIL member Philippe Latombe questioned the Minister of Economy, Finance and Industrial and Digital Sovereignty via a parliamentary question about the choice of the ALAN mutual insurance company by many public bodies to provide supplementary health insurance to their employees.
He points out that "this French unicorn (...) hosts its data with Amazon Web Services (AWS) and is therefore subject to the extraterritoriality of American law."
The MP asks the government whether it is considering, "in order to protect the sensitive data of its agents and to be consistent with the directives it issues, asking ALAN to migrate to a sovereign cloud."
In a decision dated May 5, the Council of State referred a preliminary question to the Court of Justice of the European Union (CJEU) regarding the scope of consent: the case concerns the company Canal+, sanctioned by the CNIL in October 2023 for using data collected by its internet service provider partners for electronic marketing purposes, even though the partners were not explicitly identified when consent was given.
The Council of State essentially asks the CJEU whether the consent given to a main data controller (such as an ISP) for processing by "its partners" can be considered sufficient to authorize each of these partners to carry out marketing campaigns, even if they are not identified at the time of collection.
On April 25, the Council of State (CE) refused to suspend a decision by the CNIL which authorized the European Medicines Agency (EMA) to implement data processing for a study on the incidence and prevalence of diseases within the framework of the “DARWIN EU” project.
The applicant argued that the measure was urgent because the proposed processing concerned the health data of 10 million French people, which would be hosted by Microsoft, requiring transfers to the United States where the safeguards would be insufficient.
The CE held, first, that "while it cannot be totally excluded that the data from the processing (...) may be subject to access requests by the authorities of the United States, via the parent company of the host, and that the latter may not be able to oppose it, this risk remains hypothetical at the current stage of the investigation."
Secondly, in addition to Microsoft Ireland holding the "Health Data Hosting" (HDS) certification (...), the project's implementation is surrounded by guarantees and security measures, notably the fact that the data will be pseudonymized several times and will not be directly identifiable, so that the CNIL considered the risk reduced to a level that did not justify refusing the requested authorization, whose duration it moreover limited to three years.
The Bordeaux Court of Appeal on May 13th annulled a contract concerning the development of a website due to breaches of regulations on the protection of personal data, and in particular with regard to obligations concerning information and consent relating to cookies.
The cryptocurrency sector is in turmoil after several attempted kidnappings.
A director of the Paymium platform, targeted in a recent attempt, points to regulatory developments that would extend to the cryptocurrency sector rules aimed at lifting anonymity, which already apply to the banking sector in the fight against money laundering and drug trafficking.
These rules notably target national and European mechanisms designed to make funds untraceable.
These concerns are echoed by some cybersecurity experts who argue that anonymity can be used in a legitimate context, but they are also put into perspective by others who point out all the data protection guarantees already applicable to the financial sector.
European institutions and bodies
On May 21, the European Commission published a proposal to amend the GDPR aimed at easing the obligations of SMEs and extending them to mid-sized companies employing fewer than 750 people.
The most important changes concern the requirements relating to the record of processing activities (RPA).
The current exemption in Article 30(5) would be extended to all such undertakings.
This new exception would apply unless the processing is likely to result in a "high risk" to the rights and freedoms of the persons concerned (as opposed to the current "risk").
The text would also add a recital specifying that the processing of special categories of data necessary for the application of labour and social security law under Article 9(2)(b) does not, in itself, trigger the obligation to maintain an RPA.
The proposals would also amend the provisions relating to codes of conduct (Article 40) and certification schemes (Article 42) to extend them to companies employing fewer than 750 people.
These articles currently require Member States, Data Protection Authorities (DPAs), the Commission and the EDPB to "encourage" the development of codes and certifications that take into account the "specific needs" of SMEs.
To inform the future EU data strategy, the European Commission is opening a consultation on the use of data in AI, the simplification of data rules and international data flows.
The objective is to enable the creation of high-quality, interoperable and diverse datasets needed for AI, while ensuring consistency between data-related policies, infrastructures and legal instruments.
The consultation is open until July 18th.
The Commission has also published a consultation on draft guidelines concerning the Digital Services Act (DSA), open until June 10.
The text includes guidelines on protecting minors online. The Commission plans to publish the final guidelines this summer. It is also working on an age verification app.
On May 27, the European Commission announced that it had opened investigations to protect minors from pornographic content under the Digital Services Act (DSA).
Investigations into Pornhub, Stripchat, XNXX and XVideos focus on the risks associated with the lack of effective age verification measures.
In parallel, member states, meeting within the European Digital Services Council, are taking coordinated action against small pornographic platforms.
The European Data Protection Supervisor published an opinion on May 28 on the proposal for a regulation establishing a common system for the return of third-country nationals residing irregularly in the EU.
While acknowledging the need for more effective enforcement of existing migration and asylum legislation, he stresses that "data protection – as a fundamental right enshrined in the Charter – is one of the last lines of defense for vulnerable people, such as migrants and asylum seekers approaching the EU's external borders."
The NGO noyb sent Meta a letter of "cease and desist" on May 14, as a qualified entity under the new European directive on collective redress, concerning Meta's AI training without user consent.
A preliminary injunction had also been filed in Germany by the Verbraucherzentrale NRW, which was rejected by the court at the end of May (see below).
News from the member countries of the European Union
The German Federal Data Protection Authority (BfDI) has published an AI compliance questionnaire designed to help organizations validate their AI initiatives and ensure they comply with the GDPR.
The Cologne Regional High Court ruled on May 23 that Meta had not breached the GDPR or the European Digital Markets Regulation by using user profile data to improve its AI system, noting that this assessment is consistent with the assessment made by the Irish Data Protection Commission (DPC) which is competent in Europe with regard to Meta.
TikTok is facing a class action lawsuit in Germany. The Dutch NGO Stichting Onderzoek Marktinformatie (Somi), which filed a claim for damages in a Berlin court, alleges that "TikTok collects and analyzes highly personal and intimate data of its users to an extent that goes far beyond what is necessary."
The organization claims that the platform's algorithm creates "a system of manipulation and addiction," particularly for children. The NGO is seeking damages of up to €2,000 per person.
The Belgian Data Protection Authority (APD) has carried out an evaluation of the FATCA agreement concluded between the Belgian State and the United States and has determined that it is not compatible with the GDPR.
The complaints concerned the transfer of personal data relating to complainants, including Belgian "accidental Americans", to the American tax authorities.
The APD reprimanded the Federal Public Service Finance, finding violations of the principles of purpose limitation and data minimization, as well as of the rules governing data transfers.
This decision has implications that extend beyond Belgium, as similar agreements have been concluded by the United States in other European Union countries.
The Spanish Data Protection Agency (APD) has fined an Andalusian company 1,200 euros for asking its teleworking employees to provide their personal phone number to add them to the company's WhatsApp group.
Employees could theoretically consent to or choose the email alternative, but the company encouraged them to use WhatsApp for the smooth flow of communication.
The APD reiterated that consent is generally not a valid legal basis in an employer-employee relationship, given the imbalance of power between the parties.
Also in Spain, the company Carrefour was fined 3.2 million euros by the APD for failing to protect access to its customers' accounts.
The company was sanctioned even though the credentials used in the attack had not been stolen directly from it: it had failed in its duty of care, as its security systems did not allow it to detect massive attacks originating from a very large number of IP addresses.
The Italian Data Protection Authority (APD) has fined the American company Luka Inc, provider of the "virtual companion" "Replika AI", 5 million euros for illegal processing of personal data, violation of transparency rules, and for not implementing effective mechanisms for verifying the age of users.
The Polish Data Protection Authority (APD) has fined the Polish Minister of Digitalization 100,000 PLN (€23,448.50) and the Polish Post Office 27,124,816 PLN (€6,444,174) for illegally processing the personal data of approximately 30 million citizens to facilitate postal voting in a general election.
The Australian government has published standard clauses relating to AI.
These clauses apply to the terms of purchase of AI systems, ensuring that they are purchased and implemented responsibly, ethically, and safely. They aim to mitigate risks and promote transparency and accountability in AI deployment.
A report published on June 5 by Privacy Laws and Business indicates that many African countries are adopting personal data protection laws, in a socio-economic environment completely different from that of Europe or North America.
“Africa is the leading continent for mobile money use, with over 1.1 billion registered accounts, representing more than half of the global total. Consequently, privacy policy priorities in African countries are necessarily different from those in European countries.”
For the author, this different socio-economic context means that EU "adequacy" assessments for Kenya and other countries in Africa, Asia and Latin America should, to some extent, take national circumstances into account.
The President of the United States signed the "Take It Down Act" on May 19, a bill whose purpose is to block non-consensual intimate images, and which also covers AI "deepfakes".
“Take It Down” is the acronym for “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act”.
Google has agreed to pay $1.375 billion to the state of Texas to settle two lawsuits accusing the company of tracking users' locations, "incognito" searches, and voice and facial data without their permission.
The company reportedly stated that it was settling the legal proceedings without admitting fault or responsibility, and without having to modify its products, and that its practices had evolved since the events.
Meanwhile, research findings published on June 3 show that Meta and the Russian company Yandex are tracking the web browsing of Android users, even in incognito mode or with a VPN, by exploiting a local port to link browsing activity to the connected identity.
Google says it is investigating this abuse, which allows Meta and Yandex to convert ephemeral web identifiers into persistent identities of mobile app users.


