Augmented cameras: the CNIL sets new guidelines.
Legal Watch No. 84 – June 2025.
On July 11, the CNIL considered that the use of "augmented" cameras to estimate the age of customers in tobacco shops in order to control the sale of products prohibited to minors is neither necessary nor proportionate.
These cameras are presented as a decision-support tool. They rely on an artificial intelligence algorithm, activated by default, which scans the face of everyone in its field of vision to estimate whether they are minors or adults.
The CNIL (French Data Protection Authority) notes that since the "augmented" camera only performs an estimation, tobacconists must systematically request proof of age from their customers to comply with their obligations. "Consequently, prior facial analysis by a camera to estimate age does not appear necessary: it would only add to the checks required by law."
The CNIL considers this use disproportionate to the objective pursued: the system films everyone, even those who are clearly adults, and prevents individuals from exercising their right to object.
It also believes that deploying cameras in everyday spaces such as tobacco shops contributes to a risk of trivializing, and habituating people to, a form of surveillance reinforced by the proliferation of such tools.
This is not the first time the regulator has taken a stance on this issue. At the end of May, it drew the ire of the mayor of Nice by prohibiting the deployment of augmented cameras in front of the city's schools, highlighting the collection of personal data and reiterating the need to keep surveillance of people in public spaces to a bare minimum.
In some contexts, however, the use of augmented cameras finds favor with the CNIL: it has considered that deploying these cameras at supermarket self-service checkouts could serve a legitimate interest in limiting revenue losses caused by errors or theft, under certain conditions: the system must be necessary for the objective pursued, must not disproportionately infringe on individuals' rights, and the objective must not be achievable by less intrusive means.
Data minimization measures must be implemented, such as limiting the capture area to self-service checkouts, and restricting the capture duration, resolution, and frequency. Furthermore, users must be informed about the system and provided with a camera-free alternative.
The CNIL intends to regulate the use of these systems by ensuring their necessity and proportionality on a case-by-case basis. It also recommends applying the principle of privacy by design.
It is worth remembering that algorithmic video surveillance is an evolving area, governed by a complex regulatory framework.
This framework differs from that of biometric cameras, which systematically process data relating to the physical characteristics of people, with the aim of uniquely identifying or authenticating them.
These processing operations, involving sensitive data, are prohibited except in exceptional circumstances.
Regarding "augmented" cameras, there is currently no specific text except for the experimental framework provided for by the law on the Olympic and Paralympic Games of May 19, 2023.
The CNIL reminds us that "mechanisms likely to affect the fundamental guarantees provided to citizens for the exercise of public freedoms can only be deployed if a law authorizes and specifically regulates them."
For other systems, strong safeguards must be put in place.
The experimentation period under the Olympic Games law was recently extended until 2027, despite an evaluation described as controversial.
This law authorizes the experimental use of augmented camera devices to ensure the security of certain major sporting, recreational and cultural events, under very specific conditions: only the events specified by the law can be subject to detection, and no facial recognition can be carried out.
From a more political point of view, the CNIL calls on the public authorities to draw a line between what should or should not be allowed in a democratic society: not everything that is technically feasible is necessarily desirable from an ethical and social point of view.
In a context of increased awareness of the fight against discrimination, the CNIL publishes a recommendation on measuring diversity in the workplace.
It stresses that such measurement is a delicate exercise, requiring employers to strictly comply with the Constitutional Council's decision of November 15, 2007, which is regularly and wrongly interpreted as an absolute ban on statistics related to origin.
The commission stresses in particular that investigations must remain optional and employees or agents must be properly informed and their rights respected.
It also recommends prioritizing anonymous surveys and limiting the data collected through closed-ended questions.
The CNIL also published two FAQs on the use of AI in schools, and a manga aimed at raising awareness among young people about the protection of their personal data.
It has also published recommendations on internet audience measurement tools and a self-assessment tool to evaluate their compliance with the legal framework.
It has published recommendations on the development of AI systems, which specify the conditions for using legitimate interest, particularly in the case of data harvesting (web scraping), and on June 12 opened a public consultation on its draft recommendation concerning the use of tracking pixels in emails.
The objective is to help actors who use these trackers better understand their obligations, particularly regarding obtaining user consent.
According to a June 30 publication by the cybersecurity media outlet Cybernews, 16 billion stolen usernames and passwords are currently accessible online.
The compromised information includes identifiers, such as usernames and email addresses, as well as passwords.
The directories also contain access tokens, login cookies, and metadata.
This is reportedly not a new data leak but an aggregation of several past leaks, which can nonetheless increase the risk of data theft by facilitating the work of malicious actors.
Google risks a record fine of 525 million euros for its handling of cookies and advertising in Gmail inboxes.
In its draft decision, the CNIL accuses Google of violating the rules implementing the European "privacy and electronic communications" (ePrivacy) Directive by failing to obtain user consent both for placing cookies when a Gmail account is created and for displaying, in Gmail inboxes, advertisements that look like emails.
If the fine is upheld by the CNIL's restricted committee, it will be the highest fine in the CNIL's history and the highest fine ever imposed for a violation of the ePrivacy Directive. The final decision will be announced in a few weeks.
Non-compliance with the GDPR is grounds for contract termination, particularly in the field of digital communication development services: continuing a line of earlier decisions, the Bordeaux Court of Appeal, in its judgment of June 11, 2025, found that a reCAPTCHA protection mechanism had placed several cookies without the end user's consent.
Given the continued contractual breaches by the service provider, the client is entitled to request the termination of the contract pursuant to Articles 1610, 1217 and 1224 of the Civil Code.
European institutions and bodies
The timetable for the implementation of the European regulation on AI was published in mid-June.
Following rumors of a possible grace period, the Commission clarified that the text would apply in accordance with the prescribed deadlines.
The regulation applies to AI systems according to the risks they present and governs general-purpose AI (GPAI) models according to their capabilities.
The rules concerning GPAI models come into effect on August 2, 2025; models already on the market must comply by August 2, 2027.
In addition, the General-Purpose AI Code of Practice was published on July 10.
It comprises three chapters.
The chapters on transparency and copyright give all providers of general-purpose AI models a means of demonstrating compliance with their obligations under Article 53 of the AI Act.
The chapter on safety and security concerns only a small number of providers of the most advanced models, which are subject to the obligations laid down in Article 55 of the AI Act for providers of general-purpose AI models presenting a systemic risk.
The European Commission's proposal for a "digital omnibus" is currently expected by December 10, according to an internal document seen by MLex.
It will be part of a package including the Digital Networks Regulation, the revision of the Cybersecurity Regulation and the European electronic wallet.
Regulations concerning cloud and AI development are tentatively scheduled for the following week.
At the end of June, the European institutions adopted a common position on additional procedural rules relating to the implementation of the GDPR. The text must now be formally approved by a vote of the European Parliament.
At a two-day meeting in Helsinki on July 1st and 2nd, the European Data Protection Board (EDPB) announced it would help organizations better understand their GDPR obligations by publishing simplified guidelines. In its statement, the board's chairwoman indicated that "through concise and timely guidelines and ready-to-use tools, such as a common data breach notification template, checklists, practical guides, and FAQs, we will continue to make GDPR compliance achievable and accessible for all."
Transatlantic data flows remain valid for the time being under the "Data protection framework", despite measures taken by the Trump administration which weaken the data protection framework in the United States.
The European Commission confirmed in mid-June, in response to a parliamentary question, that the dismissal of members of the PCLOB (the Privacy and Civil Liberties Oversight Board) does not affect the validity of the EU–US data protection framework, as the PCLOB remains able to function.
News from the member countries of the European Union.
The Data Protection Authority (DPA) of the State of Berlin concluded in a decision dated June 27 that DeepSeek's data transfers to China are illegal and asked Google and Apple to block the application.
DeepSeek was reportedly unable to provide the DPA with convincing evidence that the data of German users is protected in China at a level equivalent to that of the European Union.
"Chinese authorities have extensive access rights to personal data held by Chinese companies. Furthermore, DeepSeek users in China do not have enforceable rights and effective legal remedies as guaranteed in the European Union."
On June 26, the Belgian DPA (APD) rejected 16 complaints filed by the NGO noyb in 5 different cases, on the grounds that the persons concerned had not given a real (rather than fictitious) mandate.
In its press release, the APD highlights the difference between Articles 80(1) and 80(2) of the GDPR and the fact that the Belgian legislator chose not to allow consumer rights organizations to file complaints without a mandate.
It concluded by stating that "given the importance of these organizations, [it] is in favor of a legislative amendment that would also allow this option in Belgium."
In Denmark, an amendment to the copyright law will grant citizens a right over their voice, face and body, even when these are digitally reproduced by generative AI.
The Danish Minister of Culture stated in this regard that "human beings risk being turned into digital photocopiers and used for all sorts of abusive purposes, and I am not prepared to accept that."
In Spain, the DPA (AEPD) has fined Carrefour €3,200,000 following a series of data breaches.
It found that Carrefour had not implemented adequate security measures and had not notified the people concerned of the breach.
The APD also imposed a fine of €12,000 on a subcontractor for contracting with a secondary subcontractor without the controller's authorization, in violation of Article 28(2) of the GDPR.
Microsoft faces the first class action lawsuit in Ireland: the Irish Council for Civil Liberties (ICCL) initiated proceedings before the High Court in Dublin at the end of May. This legal action, brought under the new European directive on collective redress, alleges that the real-time bidding (RTB) system used by Microsoft to deliver targeted online advertising is incompatible with the GDPR.
Documents relating to access requests cannot be kept indefinitely: following an investigation initiated by a user, the Lithuanian DPA ordered a medical service provider to set retention periods for documents related to the processing of access requests.
Beware of tracking pixels: the Norwegian DPA imposed a fine of NOK 250,000 (€21,600) on the municipality of Kristiansand for illegally processing children's personal data via Snap and Meta pixels on its child abuse helpline website.
It also considered that information about visits to pages of a website containing content on specific medical issues constituted sensitive data, and reprimanded a company for illegally processing this information via the Meta pixel.
In the United Kingdom, the Data Use and Access Act (DUAA) received royal assent on June 19th. This law includes provisions aimed at promoting the development of digital verification services, new smart data programs such as Open Banking, and a new national register of underground assets.
It also includes significant changes to UK data protection legislation.
The DUAA will not replace the UK's GDPR, but it will make some changes "to simplify the rules for organizations, encourage innovation, help law enforcement agencies fight crime and enable responsible data sharing while maintaining high standards of data protection."
In the United States, the recent Supreme Court decision in Free Speech Coalition v. Paxton has sent shockwaves through the digital landscape. An article from the IAPP raises concerns about how the ruling reshapes age verification and freedom of expression, and about its implications for privacy.
The June 27 ruling upholds a Texas law requiring websites offering adult content to verify visitors' ages using potentially intrusive techniques such as biometrics. While this decision aims to protect minors from explicit content, it also raises questions about the risks associated with collecting sensitive personal information.
In the United States as well, Congress voted in early July to adopt the "One Big Beautiful Bill Act", codifying President Trump's national agenda, with one key exception: the moratorium on AI regulation initially included in the bill was stripped out. That moratorium would have halted more than 1,000 AI regulation bills that have been moving through state legislatures since January.
Meta, the owner of WhatsApp, announced on June 16 the launch of new features in the "Updates" tab of WhatsApp, including targeted advertising and a subscription model. The company stated that these features would be gradually rolled out to users "over the next few months." To this end, Meta will use the "advertising preferences and information" from users' Facebook and Instagram accounts that are linked to WhatsApp.
The Irish Data Protection Commission (DPC) said it had been informed by WhatsApp that its advertising model would not be rolled out in the EU until 2026 and that it would be discussed with other data protection authorities so that they could raise concerns as European regulators.
According to the newspaper L'Express of May 26, Russia plans to implement, starting September 1, 2025, an experimental project requiring foreigners on temporary stays in Moscow and its region to use a mobile geolocation application and submit to biometric checks. Users will need to register in the application, "agree to the collection of their personal data, including geolocation, indicate their place of residence to the Ministry of the Interior, and update it within three days if they move."

