Posted at 10:11 on 12 Feb 2020 by Pandora Blake
The ICO has released "Age appropriate design", a new statutory code of practice which aims to keep children safe online in the age of social media. It affects websites likely to be accessed by children in the UK, which will be obliged to account for the "best interests" of the child, and to grant special protection to how children's data is used.
A lot of these standards are sensible and useful. For instance, websites which provide parental controls are obliged to send a clear signal to the child if parents are monitoring their online activity or tracking their location. This is an important protection of a child's right to privacy.
Previous drafts of the code included compulsory age checks for websites (sound familiar?), prompting a campaign led by Open Rights Group. The final code has been revised, and does not introduce age verification for all websites, including social media sites, thank goodness.
What it does have is this:
Age appropriate application: Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.
On my reading, it seems like it would be no bad thing for the standards in this code to be applied to all users. They are mostly concerned with making information clearly available to users, protecting users' privacy, and controlling how users' data may be misused. If this makes it harder for websites to pressure users into sharing their data, track them with cookies, monitor their locations and so on, then that is a win for everyone's privacy. AdTech is a huge problem for user privacy, and the GDPR (General Data Protection Regulation) is inadequate to hold companies to account.
With Grindr and OKCupid the latest culprits in violating consent and sharing intimate details, perhaps it would be better if all of us were entitled to the high standards of privacy this code affords to children?
However, the Open Rights Group has called for an impact assessment to be published, to investigate the potential consequences of this clause if websites do decide to adopt age checks for their site visitors.
Age Verification demands could become a barrier to adults reaching legal content, including news, opinion and social media. This would severely impact free expression. The public and Parliament deserve a thorough discussion of the implications.
Since age verification necessarily infringes users' privacy, it's unclear to me how age checks could be conducted in a way that respects the interests of the child. The code therefore seems contradictory and unworkable in its current form.
Meanwhile, Baroness Howe has introduced a Private Members' Bill in the House of Lords, attempting to re-start the #AgeVerification provisions of the Digital Economy Act 2017. (Thanks Alex Haydock for the tip-off.)
Nothing has changed since age verification was found technically and socially unworkable last time. It's not going to magically turn into a usable idea just because a few months have passed. If you want to help kids, go away and come back with a bill to improve sex education.
And just to increase the drama, four age verification companies have lodged a judicial review at the High Court challenging the Culture Secretary’s decision to shelve the scheme to impose age checks on all porn sites viewed in the UK.
Age verification seems to be Schrödinger's policy. But unlike the cat, I'm hoping it's dead.