Deepfakes (defined in the Oxford English Dictionary as “media that has been digitally manipulated to replace one person’s likeness convincingly with that of another, often used maliciously to show someone doing something that he or she did not do”) made headlines recently when faked sexually explicit images of popstar Taylor Swift were circulated on X (formerly known as Twitter). This resulted in international calls for greater regulation of AI, particularly as it relates to the creation of deepfakes.
The United Kingdom is one of the few countries with laws against deepfakes, in the form of the Online Safety Act 2023. In this article we consider the history of the fight against online sexual offences, and how new legislation attempts to account for the threat posed by AI-generated images.
The law as it concerns the sharing of explicit images or film without consent has had to scramble to keep up with the rate and nature of offending. From 2015, section 33(1) of the Criminal Justice and Courts Act 2015 (“the 2015 Act”) criminalised the distribution of private sexual photographs or films without the consent of the individual(s) appearing in them, but only if it was done with the intention of causing the individual(s) distress. As such, if the goal was, for example, sexual gratification or financial reward, the offence was not made out.
Before the 2015 Act, however, the options for criminal action against those who shared intimate images without the consent of the person(s) appearing in them were extremely limited where the person depicted was an adult (explicit images of children being covered by the Protection of Children Act 1978).
If it could be shown that the behaviour formed part of a course of conduct, then the perpetrator could be charged with harassment or stalking under the Protection from Harassment Act 1997. If accompanied by a demand for money (e.g. in exchange for the pictures not being shared or being removed from whatever platform they had been put on) then this could give rise to a charge of blackmail under s.21 of the Theft Act 1968.
In some cases, prosecutions were brought under the Malicious Communications Act 1988 or the Communications Act 2003; however, neither of these Acts was drafted to cope with these specific factual scenarios. In many cases, victims were simply told that they would need to use the civil courts to secure an injunction – a costly and time-consuming process.
The first reading of what was then the Criminal Justice and Courts Bill took place in February 2014, and it is worth recalling the social context at the time. In 2012, now-convicted felon Hunter Moore started a website which published explicit photographs of people without their knowledge or consent – usually with their social media or contact details attached. Copycat websites swiftly followed, many of which not only actively encouraged people to upload such photographs or videos – particularly of their ex-partners following a breakup – but also charged the victims a fee to have them removed.
Several high-profile cases also made headlines in the UK: in 2012 the ex-boyfriend of singer Tulisa Contostavlos shared an explicit video of her. And in November 2014, 21-year-old Luke King was jailed after pleading guilty to harassment, which included the changing of his profile picture on WhatsApp to an intimate image of his ex-girlfriend.
Whilst “revenge porn” (now widely referred to as image-based sexual abuse) was becoming more well-known as a concept in the UK, there seemed to be a distinct lack of options, and scores of people were being told that there was nothing police could do. Suffice it to say that there were high hopes for the new provisions of the 2015 Act, which received Royal Assent on 12 February 2015.
Data released by the Metropolitan Police shows that the number of intimate image-based offences being reported climbed steadily in the years between 2015 (a total of 284 offences recorded) and 2021 (a total of 1093). In January 2023 the charity ‘Refuge’ published the results of multiple Freedom of Information (FOI) requests made to police forces in England and Wales: the data showed that a total of 13,860 intimate image offences, across 24 police forces, were recorded between 1 January 2019 and 31 July 2022; disappointingly, only 4% of these resulted in a charge.
Into this already concerning situation arrived the new threat of deepfakes. Now, an offender need not have actual intimate images in their possession or go to the trouble of obtaining them by hacking – they could simply be generated. This, combined with the somewhat fragmented resources available to tackle online sexual offences, resulted in the need for new legislation.
The Online Safety Act 2023
The Online Safety Act 2023 (“the 2023 Act”) received Royal Assent on 26 October 2023. Perhaps better known for its provisions concerning the removal of harmful online content than those relating to image-based sexual abuse, it nonetheless represents a significant step forward.
Part 10 of the 2023 Act concerns communications offences. Section 189 of the 2023 Act repeals sections 33 to 35 of the 2015 Act, and section 188 creates four new offences relating to image-based sexual abuse, by means of inserting three new sections (66B, 66C and 66D) into the Sexual Offences Act 2003.
The new offences are:
- Sharing an intimate photograph or film without consent (s.66B(1))
This offence is committed if a person (A) intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state, without B’s consent or a reasonable belief in B’s consent. It is a defence for A to prove that they had a reasonable excuse for sharing the photograph or film; pursuant to s.66B(6)(b), whether a belief is reasonable is to be determined having regard to all the circumstances including any steps A has taken to ascertain whether B consents.
A magistrates’ court or jury can find the defendant guilty of this offence if they are found not guilty of the more serious offences under ss.66B(2), (3) and (4). The offence is summary only and the maximum penalty on conviction is imprisonment for a term not exceeding the maximum term for summary offences, or an unlimited fine.
- Sharing an intimate photograph or film without consent and with intent to cause alarm, distress or humiliation (s.66B(2))
This offence is committed if a person (A) intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state, without B’s consent and with the intention of causing B alarm, distress or humiliation. Unlike the offence under s.66B(1), there is no provision for A to rely on a reasonable belief that B consented.
- Sharing an intimate photograph or film without consent and for the purpose of obtaining sexual gratification for the person doing the sharing, or another person (s.66B(3))
This offence is committed if a person (A) intentionally shares a photograph or film which shows, or appears to show, another person (B) in an intimate state, without B’s consent or a reasonable belief in B’s consent and for the purpose of A or another person obtaining sexual gratification.
No offence is committed under ss.66B(1), (2) and (3) if it can be shown that the photograph or film was taken in a place to which the public or a section of the public had access, that there was no reasonable expectation of privacy, and that the subject was in the intimate state voluntarily.
- Threatening to share an intimate photograph or film (s.66B(4))
This offence is committed if a person (A) threatens to share a photograph or film which shows, or appears to show, another person (B) in an intimate state, and where A (i) intends that B or someone who knows B will fear the threat will be carried out, or (ii) is reckless as to that result.
Exceptions exist in s.66C whereby no offence is committed if the photograph or film had previously been shared consensually. Exceptions also exist for cases where the subject of the photograph or film is under 16 and it is shared in connection with care by a healthcare professional.
The offences under ss.66B(2), (3) and (4) are triable either way. In the magistrates’ court the maximum penalty is imprisonment for a term not exceeding the general limit in a magistrates’ court. On conviction on indictment, the maximum penalty is imprisonment for a term of two years.
The inclusion of the words “shows, or appears to show”, which appear in respect of all the offences, ensures that all of them cover deepfakes. The repeal of sections 33 to 35 of the 2015 Act makes the 2023 Act a one-stop shop for image-based sexual abuse.
Whilst the 2023 Act closes the loophole left in the 2015 Act, which required there to be an intention to cause distress, the fight against this cruel criminality is by no means over. The data published by Refuge shows that, notwithstanding the increase in reported cases of image-based offending, the charging rate remains woefully low.
In its response to its AI white paper proposals published this month, the government acknowledged that new legislation will eventually need to be adopted to address the challenges posed by AI, but caveated: “However, acting before we properly understand the risks and appropriate mitigations would harm our ability to benefit from technological progress while leaving us unable to adapt quickly to emerging risks. We are going to take our time to get this right – we will legislate when we are confident that it is the right thing to do.”
We wait with interest to see whether this will result in any meaningful control on the creation of deepfakes, which would be another tool in the ongoing fight against image-based sexual abuse.