
The evolving challenge of data protection laws

Practice Notes

By Emily Carter, Partner, Public Law, Kingsley Napley

Emily Carter explores anticipated developments in the realm of data protection

The impact of data protection law on individuals, businesses and other organisations continues to grow in depth and breadth. The General Data Protection Regulation (GDPR) was intended to be technology-neutral and future-proof, but it is being stretched to its limits by the speed and direction of technological development.

As technology advances, data protection laws face increasing challenges and complexities. The volume of data held by organisations is increasing exponentially. New channels of communication, data formats and external storage solutions are being introduced rapidly. Large parts of the workforce are moving from email to more fluid, collaborative platforms (such as Teams) for day-to-day communication, and mobile instant messaging apps (such as WhatsApp) are now common in the workplace. We have seen the rapid expansion of technology relating to fingerprints, facial recognition and other biometric data, as well as greater data interactivity, for example within geolocation technology.

Consequently, data housekeeping has become increasingly complex and the boundaries between private and work communications have become blurred. It is increasingly difficult to pin down responsibility for data protection obligations, including security, retention, transparency, accuracy and access. The privacy concerns inherent in any form of monitoring demand specific attention in line with emerging regulatory guidance and legislative change.

Meanwhile, new arrangements were introduced in October 2023 to enable data sharing with US organisations via the ‘UK-US Data Bridge’. Certainty is welcome after numerous changes in this business-critical area over recent years.

Data breaches

Organisations continue to worry about the increasing prevalence of data breaches and cyber-attacks. There have been a number of high-profile attacks, including those on Capita’s infrastructure services in March and on the Electoral Commission’s voter registration details in August. The risks have shifted in line with international political unrest, as well as the use of generative AI tools to create highly targeted and carefully crafted phishing emails.

Human error with existing technology continues to create enormous risk. In August, information relating to more than 10,000 officers erroneously appeared within a hidden table in a spreadsheet published online by the Police Service of Northern Ireland in response to a Freedom of Information Act 2000 request. Similar breaches were subsequently reported by Norfolk and Suffolk Constabularies, Cumbria Constabulary and Southend-on-Sea City Council.

The ICO's role

2023 was a demanding year for the Information Commissioner's Office (ICO). With 35,000 complaints received in its last financial year, it welcomed confirmation from the Court of Appeal in Delo v Information Commissioner [2023] EWCA Civ 1141 in October that the regulator retains full discretion concerning the investigation of these complaints.

In accordance with a policy announced in June 2022, the ICO has continued to rely upon reprimands in the public sector, with formal enforcement action reserved for the most egregious cases. While the ICO published 34 reprimands, it imposed no fines against public sector bodies in 2023.

In the private sector, with the exception of February 2023’s £12.7m fine against TikTok, fines have been reserved for the “low-hanging fruit” of direct marketing breaches. In 2023, there were 17 such fines, amounting to a total of £1.6m.

The ICO became further embedded within national and international privacy networks. Within the UK’s Digital Regulation Cooperation Forum, it is focusing on AI and online safety. Although Clearview AI successfully appealed the ICO’s decision to fine the US company for scraping UK data, the ICO is seeking to appeal that ruling. In August, the ICO issued a joint statement with other regulators internationally condemning unlawful data scraping.

Data Protection reform

The Data Protection and Digital Information (No. 2) Bill (DPDI) is expected to become law in spring 2024. Although this legislation has been under consideration since 2021, the DPDI is unlikely to significantly decrease the day-to-day regulatory burden for organisations.

The DPDI intends to move organisations from ‘tick box’ to risk-based compliance and provides some welcome clarification and internal consistency. For organisations struggling with the volume and complexity of Data Subject Access Requests, there is little relief in sight: while they may refuse to respond to excessive and unfounded requests, they will also be required to introduce a complaints system.

Data Protection Officers (DPOs) will no longer be required by statute, but organisations engaged in “high risk processing” will need to appoint a Senior Responsible Individual (SRI).

Information Commissioner

In 2024, the ICO will continue to focus on protecting the vulnerable, including children, and on novel uses of technology. The ICO has already issued a preliminary enforcement notice against Snap with respect to its ‘My AI’ chatbot, and we can expect to see more high-profile enforcement action concerning the interface between social media, AI and children, particularly in light of the new Online Safety Act.

In the wake of significant recent data breaches, all eyes will be on the ICO’s enforcement action in 2024, including the extent to which it exercises the new power, granted by the DPDI, to conduct mandatory interviews. The question of whether the UK diverges too far from acceptable standards of data protection to retain EU adequacy status remains live, and regulatory enforcement is a critical component of this assessment.

Broader conversations

Data protection regulation will truly become a global undertaking. Given international data flows and the extra-territorial reach of national data protection regulation, organisations will be challenged by legislative conflicts and national regulators will need to grapple with enforcement against entities beyond national borders.

At the same time, a growing privacy movement of academics, civil society organisations and community groups is becoming more vocal and creative in protecting individual privacy rights. The spotlight will fall on public sector projects such as the recent appointment of Palantir to build the NHS Federated Data Platform for patient data. With vast social benefits to be realised set against ever-increasing threats to individual privacy, conversations will continue concerning the importance of public trust and the role of data stewardship.

Fairness will be a touchstone for data controllers and a yardstick for regulators, with the rapidly evolving field of data ethics informing how fairness relates to data processing. Rather than being a field of dry and technical regulation, data protection legislation will have an increasingly critical role at the interface between technology and every aspect of our day-to-day lives. And with appropriate human guidance, we can expect AI to play a role in solving some of the problems concerning discrimination, anonymisation and unlearning that it has created.

Emily Carter is a partner at Kingsley Napley