Posted at 15:42 on 25 Mar 2019 by Pandora Blake
After the Digital Economy Act passed in April 2017, implementation of age verification has been repeatedly delayed. We were initially told it would start being enforced in April 2018, but it was put back to the end of 2018. In November the Minister for Digital, Margot James, claimed that it would come into effect by Easter 2019, but as far as I know things aren't yet in place for the BBFC to begin enforcement.
This leaves website owners and viewers in a state of uncertainty, not knowing at what point they will need to start age verifying to access porn - or publish it. A recent YouGov survey showed that more than three-quarters of Brits have no idea this is even happening. Meanwhile, there are still irregularities and uncertainties with the policy, which I lay out in detail in my recent academic article: Age verification for online porn: more harm than good? My Patrons (at Ally tier and above) can access the full article here.
Which sites will have to comply?
Website owners are eager to know whether they will have to install (and possibly pay for) age verification software. The final version of the Commercial Basis Regulations states that sites will need to establish that visitors are over 18 if either a) the site is marketed as a pornographic website, or b) more than one third of the content on the website is pornographic.
I lobbied the Department for Digital, Culture, Media and Sport (DCMS) for two years, expressing my concern that independent sex workers’ websites would be obliged to comply. This would cause unintended harm: sex workers’ clients are (understandably) careful about their privacy, and are very unlikely to be willing to enter identifying details in order to view a sex worker’s website. That would mean a significant drop-off in the ability of independent sex workers to attract clients, which would force them into more exploitative working conditions to get enough work to earn a living - such as having to work for a manager in a criminalised workplace.
Previously, the age verification policy applied to any website where any payment or benefit had ever been transacted in either direction in association with making porn available online - which would cover anybody who'd ever paid for web hosting, regardless of whether they ever received any income. This was ludicrously broad. I proposed either a minimum income threshold or a minimum traffic threshold, below which websites would be considered too small to carry much risk of making porn available to large numbers of under-18s. The DCMS declined to follow this recommendation, and instead came up with a definition based on how a site is advertised and how much of its content is pornographic - apparently their best effort to constrain the previously very broad scope of the policy.
It’s really hard to understand how the "one third" rule will be applied by the BBFC. Is it one third of total pixels, bytes, images, web pages? It's extremely unclear, and obviously hasn't been designed by somebody who understands how the internet works. So it remains to be seen how this is going to be applied in practice, which leaves many small publishers uncertain about whether or not they are going to be obliged to age verify site visitors.
My recommendation to concerned website owners is to email the age verification regulator and ask them how they intend to apply this rule. Perhaps if enough people express concern, they will publish clearer guidance. My hunch is that as long as you aren't advertising your site as a source of hot sexy porn - perhaps instead promoting it as a community site, a personal site dedicated to exploring your erotic journey, or an informational site about services you offer - that should hopefully help.
Bear in mind you have to age verify if either the advertising rule or the one third rule applies. So even if your website is only 1% porn but you advertise as a porn site, you will be expected to age verify.
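To make the logic of the two triggers concrete, here is a minimal sketch in Python. The function name and parameters are hypothetical illustrations, not anything from the Regulations themselves - and note that the Regulations never define what the "fraction" of pornographic content is actually measured over (pages, images, bytes?), which is exactly the ambiguity discussed above.

```python
def must_age_verify(marketed_as_porn: bool, porn_fraction: float) -> bool:
    """Hypothetical reading of the Commercial Basis Regulations.

    Either trigger alone is sufficient: a site marketed as porn must
    age verify even if almost none of its content is pornographic,
    and a site that is more than one-third porn must age verify even
    if it is never marketed that way.
    """
    return marketed_as_porn or porn_fraction > 1 / 3
```

So a site marketed as porn with only 1% pornographic content is caught by the first trigger, while a half-porn site advertised as a "community site" is caught by the second.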
Another big unknown is around audio content. The Digital Economy Act defines pornographic content as still images, video with or without audio, and audio-only recordings - the first time that audio-only content has been included in the definition of pornographic material in UK law.
One would think that there would be some democratic regulatory process involved in establishing how the pornographic nature of audio content was going to be determined by the British Board of Film Classification, the new age verification regulator. However, this has not taken place.
Audio porn content is gaining in popularity, partly as the result of a laudable effort by producers to make their material more accessible to people with visual impairments. Some producers are starting to create audio description files to accompany visual content; see for example the access requirements of the London Porn Film Festival, which requires that all films be submitted with an audio description as well as subtitles.
ASMR videos and audio tracks are a growing phenomenon - as are erotic audio stories. Sex blogger Girl On The Net records herself reading her sexiest blog posts in audio-only format for people to enjoy. Funded via her Patreon, this increases the accessibility of her site, and creates sexy extras for her website visitors, whatever their access needs.
To assess what counts as "porn" for the purposes of age verification, the BBFC will be applying their film classification guidelines. Age verification will be required for content classifiable as 18 or higher and published ‘primarily for purposes of sexual arousal’ (which conveniently excludes 18-rated mainstream entertainment media such as Game of Thrones), and for all content classifiable as R18.
However, there are no classification guidelines for audio content. How will the BBFC determine the classification of audio porn? Will they be looking for audio that might accompany porn videos, complete with breathy squelchy noises? If so, does it matter how that audio is produced? What if somebody were to produce it using Foley or special sound effects? (For example, the audio porn workshop 'Fuck me in the ear' at the London Porn Film Festival will invite participants to create audio porn using a range of unlikely materials.) Or what about a video of someone erotically reading a sexy story - where the video which that audio might accompany could be them fully dressed, sitting in a chair at their computer reading a book into a microphone? Would that be classifiable as 18?
When I posed these questions to the BBFC in September, they told me that it didn't matter to them how the audio was produced; what they were interested in was the effect it creates. Just as when they classify violence in film, it doesn't matter whether it was created using special effects; the point is the impression it leaves on the viewer. Unfortunately they were not able to offer any clarity about how they would classify material, other than to say they would "know pornography when they heard it". This echoes US Supreme Court Justice Potter Stewart, who in 1964 described his threshold test for obscenity as "I know it when I see it". That attitude has since been criticised for being so obviously open to individual bias - a bias the BBFC seem happy to embrace.
All of this leaves people producing audio porn in a state of uncertainty about whether or not they will be required to age verify. It also potentially creates a discriminatory double standard. Someone producing a written erotic blog would not be expected to age verify viewers - it's only audiovisual media that is covered by the legislation. But exactly the same material turned into a spoken word audio recording might be expected to age verify - a ridiculous contradiction, which is ableist against visually impaired internet users.
The BBFC produced guidance in October which has been widely criticised by privacy campaigners, including the Open Rights Group. The most significant update is that the UK's data watchdog, the ICO, will be able to establish a set of guidelines to protect users’ data.
However, the ICO's guidelines will be voluntary - meaning any individual age verification company will get to choose whether to sign up. The government have conceded the argument that user privacy genuinely is threatened by age verification, but neither the BBFC nor the ICO will have any teeth in enforcing this voluntary privacy standard. Reporting an AV provider that is violating user privacy to the ICO will come too late to save the people whose privacy has been compromised. Once someone has been outed, the ICO can't put the genie back in the bottle. This voluntary privacy standard is being put together “entirely without external input from privacy groups and without a public consultation period” (SexTechGuide).
By refusing to enshrine robust privacy requirements in law, the Government has created a situation where we're completely dependent on the goodwill of international companies to respect users’ data. We've shone a light on Mindgeek’s shady business practices over the three years this policy has been under discussion, and they have clearly felt the pressure to claim they will take privacy seriously with their products. Bear in mind that they aren't just producing an age verification solution (AgeID) - they're also producing a VPN (VPNhub) which will enable viewers to circumvent their own solution, all while hoovering up yet more data about what sites people visit! Nick Cowen has cogently expressed how this puts Mindgeek at a competitive advantage.
You'll find their quotes about how it won't even be possible for them to see this data, let alone store it, in most coverage of this topic. But we should not be dependent on the goodwill of an international company to protect data of this sensitivity. Mindgeek are not legally obliged to protect privacy, and there's no guarantee that they'll continue to do so over the next few years. Age verification will force over-18s who want to legally access porn to put their most private and sensitive data into the hands of the pornographic equivalent of Facebook, and to trust it to protect that data out of goodwill alone, with no legal obligation to do so.
Data relating to an individual's sexuality or sexual orientation is sensitive information that is meant to get special protection under GDPR. The government doesn't seem to have realised that data relating to an individual's porn browsing history counts as data pertaining to their sexuality. They haven't legislated any mandatory requirements for age verification software providers to put in adequate privacy and security protections to keep this data safe.
There is an extraordinary risk of data leaks revealing which individuals have visited which websites, which could lead to serious harm. Such data is not only a tempting target for blackmailers; it could conceivably lead to loss of life. The Ashley Madison hack led to several suicides. Or imagine a queer 19-year-old living at home whose porn browsing history is leaked, and who is kicked out by their parents, leaving them homeless and extremely vulnerable.
This risk is simply not worth it. Without mandatory privacy protections age verification will do more harm than good. The web blocking sanction for non-compliant websites sets an extraordinarily dangerous precedent, creating a mechanism which will allow future governments to block any sites they don't like.
There's a flurry of press activity about age verification at the moment, based on the suggestion that enforcement would begin in April. That no longer seems to be happening, but the conversation is buzzing.
Last week I was interviewed for a thorough primer on age verification which goes right back through the whole history of the policy, and is well worth a read. For example, did you know that the UN Special Rapporteur on freedom of expression, David Kaye, wrote an open letter to the UK Government in Jan 2017 stating that age verification "falls short of the standards of international human rights law"? I didn't. I also didn't know this:
Jerry Fishenden, former Chair of the Cabinet Office’s Privacy and Consumer Advisory Group, resigned in May 2017, citing how “advice and offers of help [were] repeatedly ignored by officials who should know better”, and accused the government of taking a “half-baked approach” to the security of personal data when formulating the Digital Economy Act.
In the last week, I've consulted for Open Rights Group on an informational website they're putting together on age verification, been quoted in an article about age verification at the Huffington Post, and recorded a programme for Radio 4’s World at One and PM. As I post this, I've just got home from BBC Broadcasting House, where I participated in an extended discussion on Radio 5 Live. I'll post the links to both those shows online once they're up. As if that wasn't enough, I'm also being filmed by BBC Newsnight tomorrow for a TV segment about age verification.
I’m glad to have this platform to discuss the very real issues with this policy, but it's incredibly frustrating that all this press interest is only coming up now. Some media attention would have been much more useful when the bill was still going through parliament, and we were desperately lobbying parliamentarians to mitigate the harms of this policy. Now that it's on the brink of being enforced it may make a compelling topic for discussion, but it's too late to do anything about it.
My pro bono work as a researcher and advocate is funded by Patreon. Join my campaign to help me keep doing it.