Legal Watch

Telegram, Signal, WhatsApp… How good are instant messaging services in a professional context?

Legal Watch No. 74 – August 2024


There have been numerous reactions in the media to the arrest in France of Pavel Durov, the founder of the messaging app Telegram.

Regardless of the political debates on the subject, the case gives us the opportunity to take stock of the reliability of instant messaging services.

To what extent is the information exchanged truly confidential? Are these applications comparable, and can they be used in a professional context?

Most instant messaging services today boast about encrypting their communications.

However, the practice differs depending on the messaging service.

Many Telegram users have thus learned in recent days that their communications are not protected by default.

Only direct communications between two participants who have manually activated encryption in their conversation options are protected.

It should be noted that Telegram does not use open-source encryption such as the Signal protocol, but relies on a "homegrown" technology that cannot be independently verified.

Group discussions (or chat channels) cannot be encrypted, and Telegram can therefore access their content as well as information about the members and administrators of these groups.

The exchanges here are more akin to those of a social network than to those of a messaging service.

Telegram has been criticized for years for its lack of moderation of discussion channels, for ignoring requests from many governments to remove illegal content and for refusing to share information about potential offenders.

These obligations of moderation and cooperation with law enforcement are provided for by French law and the European regulation on digital services.

It is in this context that its founder was arrested for lack of cooperation and complicity in organized crime.

This obligation to cooperate nevertheless has limits: thus, end-to-end encrypted communication remains confidential and cannot be intercepted by a third party, whether it be the messaging provider, a government or a hacker.
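The end-to-end property described above can be illustrated with a deliberately minimal sketch: the relay (the messaging provider) only ever handles ciphertext, so confidentiality does not depend on trusting it. This is a toy one-time-pad example, not any real messaging protocol; in practice (e.g., the Signal protocol) the shared key is derived through a key exchange, which is assumed here to have already happened.

```python
import secrets


def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))


# Hypothetical shared key: a real protocol would derive it via a key
# exchange between the two participants; here we simply generate it.
message = b"Patient file ready for review"
key = secrets.token_bytes(len(message))  # one-time pad, same length as message

# The sender encrypts; only the ciphertext is handed to the relay server.
ciphertext = xor_bytes(message, key)

# The relay (messaging provider) stores and forwards the ciphertext but,
# lacking the key, cannot recover the plaintext.

# The recipient, who holds the shared key, decrypts.
assert xor_bytes(ciphertext, key) == message
```

The point of the sketch is structural: interception at the relay yields only ciphertext, which is why end-to-end encryption resists access by the provider, a government, or an attacker alike.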

Some states are concerned about the widespread use of encryption, and are lobbying for regulatory changes that would allow it to be circumvented.

This is particularly the case in the context of the proposed European regulation on child sexual abuse (CSA).

This pressure to weaken the confidentiality of communications has provoked numerous reactions from defenders of individual liberties and from data protection authorities, for whom it would amount to widespread surveillance of private communications, would undermine digital security by breaking encryption, and has not been shown to achieve its stated objective of protecting children.

Given the current state of law and technology, it is strongly advised to use end-to-end encrypted messaging to exchange professional messages, especially when the content of the message is sensitive (e.g., medical or financial data).

And in this respect, not all applications are created equal. While the message content may be protected, encryption does not prevent the collection of certain user identification and/or connection data.

Particular caution is recommended in all cases when using discussion groups.

The use of WhatsApp groups to exchange sensitive data in a professional context has thus already led to a data controller being sanctioned by a data protection authority.

Signal is generally recognized as providing a high level of protection, but it is not the only one.

Threema and Olvid, for example, can also be mentioned. These services have the advantage of being European, but the disadvantage of still being relatively unknown. Within a company, such messaging systems can nevertheless represent an interesting alternative.

NB / Numerous analyses detail the strengths and weaknesses of messaging applications. For example, see the one by Orange Cyberdefense: https://www.orangecyberdefense.com/fr/insights/blog/data/securite-et-vie-privee-comparatif-des-apps-de-messagerie-instantanee

For a more comprehensive analysis of what law enforcement can obtain from a messaging service, see this FBI training document obtained through a Freedom of Information Act request by Property of the People, an American non-profit organization dedicated to government transparency: https://propertyofthepeople.org/document-detail/?doc-id=21114562


SOS médecins indicated on September 2 that the CNIL had given its approval to the association's creation of a data warehouse, named "Contact".

According to SOS médecins, Contact is "dedicated to unscheduled care and improving access to care, (and) will promote research and innovation in health".

It will securely and confidentially consolidate the data of millions of patients treated each year within the federation's 64 associations.

According to the federation, this data should be reused for the benefit of patients to improve practices and optimize care.

Paris-Saclay University announced that it was targeted by a ransomware attack on August 11. This incident took place a week after another cyberattack against more than 40 cultural institutions in France: cybercriminals reportedly infiltrated the computer system common to the "Réunion des musées nationaux – Grand Palais", which centralizes all the financial data of these institutions, and threatened to release the data within 48 hours if the ransom was not paid.

According to L'Usine Digitale, the 36 shops managed by the network were affected and their systems were taken offline. An investigation has been opened by the Paris Public Prosecutor's Office.

 

European institutions and bodies

The European Commission has opened a consultation, running until September 30, on the protection of minors online, particularly in the context of the Digital Services Act (DSA) and the fight against child sexual abuse material (CSAM).

Under the DSA, the Commission must develop guidelines to help the online platforms concerned comply with the requirements for the protection of the privacy and safety of minors.

“These guidelines will provide a non-exhaustive list of good practices and recommendations for online platform providers to help them reduce risks related to the protection of minors. The guidelines will also help the Commission and digital services coordinators to supervise platforms and enforce the DSA.”

The European Commission will publish its first report on the functioning of the "Data Privacy Framework" (DPF) between the European Union and the United States this autumn. A bilateral assessment meeting was held on this subject in July, and the Commission has also opened a public consultation allowing comments to be submitted until September 6.

China and the EU have begun discussions on data transfers under a new "cross-border data flow communication mechanism". According to the European Commission, the mechanism aims to find ways to facilitate cross-border transfers of non-personal data for European companies, as well as their compliance with Chinese data laws.

 

News from European member countries

In Germany, the Frankfurt Court of Appeal ordered Microsoft to refrain from placing and storing cookies on the data subject's devices without their consent, even if this means that Microsoft must stop placing tracking cookies altogether.

This decision appears to be in line with a recent ruling by a Dutch court that prohibited Microsoft, LinkedIn and Xandr from placing tracking cookies without user consent and imposed a penalty of 1,000 euros per company for each day of non-compliance with the decision (via GDPRhub).

The Belgian Data Protection Authority (DPA) has rejected a complaint filed by a data subject following a data breach. It considered that the technical and organisational measures taken by the data controller under Article 32 of the GDPR were appropriate.

The Danish DPA considered that, for consent to the use of facial recognition for access to a fitness center to be free and explicit, the data subject must be offered alternative verification methods that do not involve the processing of biometric data.

In Spain, the Data Protection Authority (DPA) fined a data controller €145,000 after the theft of an unencrypted USB drive containing personal data relating to a criminal case. It found a breach of the principle of confidentiality, even though there was no evidence that the data had been accessed.

In Italy, the DPA imposed a fine of €20,000 on a municipality that had unlawfully published the names of unsuccessful applicants in a public selection process. Furthermore, the municipality had not entered into a binding agreement with its processor, which constitutes a violation of Article 28(3) of the GDPR.

The DPA also fined a telecommunications provider one million euros for contacting its customers for commercial prospecting purposes without obtaining valid consent. It considered that a data controller cannot rely on legitimate interest to make marketing calls to its customers.

On July 22, in cooperation with the CNIL, the Dutch Data Protection Authority (DPA) imposed a fine of 290 million euros on Uber BV (based in the Netherlands) and Uber Technologies Inc. (based in the United States) for transferring personal data outside the EU without sufficient safeguards.

The CNIL (the French Data Protection Authority) had received a collective complaint from the Ligue des droits de l'Homme (Human Rights League), representing more than 170 drivers on the platform. The Dutch DPA found that the processing of drivers' personal data, for which Uber BV and Uber Technologies Inc. are joint controllers, involves transfers to the United States. Between August 6, 2021, and November 21, 2023 (the date Uber was added to the Data Privacy Framework), these transfers were not covered by appropriate safeguards. The DPA concluded that there was a breach of Article 44 of the GDPR. Uber has announced its intention to appeal the decision, which it considers unfounded.

The Dutch DPA also imposed a €30.5 million fine on Clearview AI on September 3, along with a further €5 million penalty for non-compliance. Clearview had, in particular, built an illegal database containing billions of facial images, including those of Dutch citizens. The DPA also imposed a ban on using Clearview's services. As the company has shown no willingness to change its practices, the DPA indicated that it is exploring various legal avenues, including the possibility of holding the company's executives personally liable for these violations.

The Norwegian University of Science and Technology (NTNU) has published a report entitled "Piloting Copilot for Microsoft 365" as part of the Norwegian Data Protection Authority's regulatory sandbox.

The university tested Microsoft's Copilot artificial intelligence service and, in early summer 2024, published eight key findings that are useful to consider before adopting this new service.

The Slovenian DPA considered that a school may partially refuse to comply with an access request submitted by a parent if the disclosure of certain data could harm the best interests of the minor.

On August 14, Switzerland adopted an adequacy decision for the United States, allowing the transfer of data to companies certified under a Swiss-US "data privacy framework".

Switzerland thus joins the European Union and the United Kingdom, which allow organizations to transfer the personal data of their residents to the United States under similar conditions.

The NGO noyb alerted the media on August 12 about the social network "X", which has started illegally using the personal data of more than 60 million users in the EU/EEA to train its AI technology ("Grok") without their consent.

Unlike Meta (which recently had to suspend its AI training in the EU), X, according to the NGO, did not inform its users in advance. The Irish Data Protection Commission has initiated legal proceedings against X, and noyb has filed complaints in nine European countries to stop these practices. X ultimately committed to stop using this data, but only once the new version of Grok was already available; according to the NGO, the AI model was thus trained in the meantime using EU data.

 

In the UK, a data controller was reprimanded for failing to conduct a data protection impact assessment (DPIA), as required by Article 35 of the UK GDPR, before deploying a facial recognition system in a school. The data controller had also failed to obtain valid consent.

The United Nations and the International Labour Organization have published a report entitled "Mind the AI Divide – Shaping a Global Perspective on the Future of Work".

Among other findings, the report mentions that technological advances are jeopardizing jobs in sectors such as call centers and other types of business process outsourcing that are widespread in some developing countries.

While some tasks in these professions could potentially be automated, the document adds that most still require human intervention. "Such partial automation could lead to efficiency gains, allowing humans to dedicate more time to other areas of work."

A US federal judge ruled on August 5th that Google held an illegal monopoly on internet search. The court concluded that "Google violated Section 2 of the Sherman Act by maintaining its monopoly in two product markets in the United States—general search services and general text advertising—through its exclusive distribution agreements."

Nigeria published its "national AI strategy" last August.

It states, in particular, that: “Nigeria and the wider African continent possess some of the most distinctive and compelling challenges and opportunities that AI could address. From optimizing agriculture in diverse climates to improving public health infrastructure, locally developed AI solutions, tailored to local realities, are far better equipped to tackle these challenges than externally imposed models created for a completely different context and people. (…) Many datasets in Nigeria suffer from inaccuracies, incompleteness, and a lack of standardization. Data quality must be improved to ensure the reliability and effectiveness of AI algorithms, which require clean and accurate data to function optimally.”

"X" has suspended its operations in Brazil after months of conflict with a Brazilian Supreme Court judge.

On August 31, the judge ordered a ban on X in Brazil and a freeze on the assets of Starlink, Elon Musk's satellite internet company. This decision follows a months-long investigation into the dissemination of false and defamatory information via the social network, as well as a separate investigation into Musk for alleged obstruction of justice.
