

Deepfakes: is the law capable of protecting individuals in light of new technology?

Opinion

Hanna Basha, Mark Jones and Francesca Sargent discuss the application of the law to seek civil and criminal remedies for deepfakes

Fake audio purporting to be Sadiq Khan ‘does not constitute a criminal offence,’ according to the Metropolitan Police. This poses the question – can the law protect individuals in light of new artificial intelligence (AI) and ‘deepfake’ technology?   

The development of AI has accelerated rapidly in recent years and, as with all new technology, UK law has struggled to adapt at a similar speed to protect individuals. Despite this, there are civil and criminal routes available to those who wish to take action when their likeness or voice has been used without their consent.

Not ‘a criminal offence’

The Metropolitan Police’s announcement may give the impression that criminal proceedings can never be brought in cases involving deepfakes. However, this is not entirely true; it depends on the circumstances. New legislation in the form of the Online Safety Act 2023 has criminalised the sharing of deepfakes where the distribution is non-consensual and the deepfake is intimate in nature.

While this protects potential victims of online image abuse, it does not assist those whose likeness has been replicated in deepfakes of a non-sexual nature. Even then, however, the Online Safety Act may offer some protection: large social media platforms must now remove content posted in breach of their own terms and conditions, and many platforms’ terms would prevent the posting of deepfakes. How useful this provision will be depends on each platform’s specific terms and how responsive the platform is.

Other criminal routes are also worth considering. If the content of the fake audio of Sadiq Khan had been a malicious communication – such as information which is grossly offensive, false or menacing – its distribution would constitute an offence for the purposes of the Malicious Communications Act 1988 and/or the Communications Act 2003. The creation and distribution of deepfakes could also amount to harassment – a criminal offence under the Protection from Harassment Act 1997.

A civil claim

Deepfake technology can produce startlingly realistic results. This raises understandable concerns regarding privacy, data processing and reputational damage.  

There is a good chance that Sadiq Khan has a claim in wrongful data processing. This sort of claim is likely to cover the widest range of deepfake scenarios because, in order to create a deepfake, there must be some identifiable information concerning the individual. Altering the image or voice connected with the individual to create the deepfake is likely to be unlawful data processing – unlawful because it is difficult to see an exemption that would allow this conduct. The practical difficulties in bringing a claim are likely to be identifying the right person to sue and enforcement. However, it is always possible to bring a claim against ‘persons unknown’ and to use this to remove the deepfake from various sites and stop it from spreading further.

Coupled with wrongful data processing, claims often relate to the misuse of private information. This is likely to turn on the information used, although there are also protections around intrusion. Importantly, the fact that some of the information is false or fake is not necessarily a hurdle to the success of the claim. 

Another avenue to consider is a claim in defamation. Such a claim is unlikely to succeed for Sadiq Khan, as he would need to establish that the fake audio caused or is likely to cause serious harm to his reputation. The most obvious hurdle in this case is the relatively rapid and widespread publication of articles confirming the audio was not in fact him. However, it is easy to see how a defamation claim could be established in other circumstances.

Depending on the circumstances, civil claims in harassment, copyright or passing off are also worth exploring.   

While it may well be true that the fake audio purporting to be Sadiq Khan ‘does not constitute a criminal offence,’ the law can protect individuals. The application of the law to seek civil and criminal remedies for deepfakes is still in its infancy, but it is clear that there are some protections available.

Hanna Basha and Mark Jones are both partners and Francesca Sargent is an associate at Payne Hicks Beach
phb.co.uk