Data insights: A discussion of the latest developments in privacy and data

Insight from Shoosmiths’ first Data conference including helpful discussion of EU digital regulation, privacy frameworks, AI risks, cyberattacks and class actions.

GRIP attended Shoosmiths’ inaugural “Data Insights” conference on January 24, 2024, hosted by Shoosmiths’ head of Privacy and Data, Sherif Malak. The event, attended by many data protection officers (DPOs) and compliance professionals, offered cutting-edge and thought-provoking discussions on the latest developments in privacy and data.

The day kicked off with a year in review by the Privacy and Data Team. The team highlighted the new EU digital regulation, privacy frameworks, risks for businesses and class actions for data breaches.

EU Digital Regulation

Hamish Corner, partner and commercial, IT and data privacy specialist, told the audience that privacy regulation has grown from regulating consumer data to regulating data as a whole. The EU’s two new digital acts aim to constrain anti-competitive practices by very large businesses.

EU Digital Markets Act (DMA)

The Act became broadly applicable in May 2023 and will be fully in force by March 2024. It aims to make the market for digital services in the EU fairer, more innovative and more competitive. It is hoped that it will level the digital services playing field and allow challenger providers to gain a footing in the market, ultimately benefitting consumers.

The DMA has an extra-territorial effect, applying to companies that provide services in the EU regardless of where they are based.

If a gatekeeper does not comply with the obligations laid down by the DMA, the Commission can impose fines of up to 10% of the company’s total worldwide turnover, rising to 20% for repeated infringements.

EU Digital Services Act (DSA)

The Act came into effect on August 25, 2023, for very large online platforms and very large online search engines. It becomes fully applicable to other entities on February 17, 2024.

The DSA gives the European Commission the power to request information to determine whether social media platforms and other services that carry user-generated content have failed to comply with content moderation duties. In particular, the Commission will be scrutinizing VLOPs’ policies and actions regarding notices on illegal content, complaint handling, risk assessment and measures to mitigate the risks identified.

The Commission can impose fines of up to 6% of worldwide annual turnover for non-compliance.

AI risks to business

Monika Markiewicz, Shoosmiths’ internal-facing head of Privacy and Data Protection, warned the conference about the risks of using AI, pointing to the alleged Samsung data leak and a US lawyer’s use of AI to research a case as cautionary examples.

Samsung employees were reported to have inadvertently leaked sensitive data via OpenAI’s chatbot ChatGPT. This provides a timely lesson on preventing future breaches involving Large Language Models (LLMs): because LLMs generate responses to questions based on the data they have ingested, they may unintentionally divulge confidential information.

Companies may ban AI chatbots outright but this is “like playing whack-a-mole”. A better option could be to limit the output by controlling what type of data is fed into a model and who in the organization can access the model. Some businesses are developing their own models so they can feed in sensitive information because the outputs can only be used internally.

A US lawyer admitted to using AI to research a case. A judge said the court was faced with an “unprecedented circumstance” after a filing was found to reference legal cases that did not exist. The lawyer who used the tool told the court he was “unaware that its content could be false”. ChatGPT creates original text on request but does come with a warning that it can “produce inaccurate information”. These AI inaccuracies are now known as “AI hallucinations”.

Challenges to EU-US Data Privacy Framework

Nick Holland, an international data privacy lawyer, expects challenges to the EU-US Data Privacy Framework (DPF), particularly as both the European Data Protection Board and the European Parliament heavily criticised the framework.

Despite the European Commission’s confirmation that the DPF “introduces new binding safeguards to address all the concerns raised by the European Court of Justice”, it is certain that the framework will still be challenged. Opponents of the DPF, such as Max Schrems’ NOYB, stated publicly even before the adequacy decision was announced that they do not agree the new safeguards adequately address the concerns raised in Schrems II.

Holland noted in his recent article that on July 10, 2023, NOYB released a press release stating its intention to challenge the adequacy decision. NOYB criticized the DPF as being largely similar to the Privacy Shield and said that there is little change in US law – in particular, that the fundamental problem with s702 of FISA was not sufficiently addressed. The ultimate question, therefore, is whether the DPF will get the approval of the CJEU, as opposed to just the European Commission’s sign-off.

Cyberattacks – ransomware

The panel said the conference could learn lessons from the recent ransomware cyberattack on the British Library. The attackers, a group known as Rhysida, demanded a “double extortion” ransom of £600,000 ($761,736) in bitcoin for the return of stolen data and the restoration of access to the Library’s systems. The Library refused to pay. The incident was investigated with assistance from the UK’s National Cyber Security Centre (NCSC) and law enforcement.

If it is established that the British Library was at fault, it is possible that the Library may face regulatory penalties from the Information Commissioner’s Office (ICO).

Many organizations pay the ransom in order to get back to “business as usual”, but paying may not have solved all of the Library’s problems in this instance.

It would also only have encouraged the attackers to target other institutions. Organizations should heed this tale: update security systems, have back-up and recovery plans in place, regularly test and monitor systems, and train employees on phishing scams and password hygiene.

Class litigation actions

Malak quoted Shoosmiths’ 2024 Litigation Trends report which found that 51% of large company GCs said class actions would be the biggest threat over the next 1 to 3 years, with 32% believing these actions would relate to data breaches.

‘Class’ actions for data breaches in the UK are a complex and evolving area. The claimants have to prove they have the same interest in the claim. They also need to show a loss (not necessarily a financial one). Malak suggested that many felt a “loss of control should be enough” in order to hold companies accountable in situations where millions of people’s data protection rights were infringed but no individual suffered a loss. He also advised attendees to keep watch on the Austrian Post case, noting that many claims have been stayed pending that decision.

This is a timely reminder that compliance with cybersecurity and privacy regimes is no longer simply a regulatory concern – it may also present a class action risk, particularly in the wake of high-profile cyber incidents and data breaches.