It is a well-established principle, firmly rooted in the UN Convention on the Rights of the Child, that children require special legal safeguards in every aspect of their lives. That principle extends to the privacy and protection of children’s data, with European data protection law setting out provisions aimed specifically at children. It is in this light that the UK’s Information Commissioner’s Office (ICO) has published its code of practice for the age appropriate design of online services. The code is not law, but it is required by the Data Protection Act 2018 and carries more weight than a simple piece of ICO guidance.
The code, which should be approved by Parliament later this year, is expected to come into force in autumn 2021. It follows a period of increasing public and political scrutiny of the impact of the digital world (a world largely designed with adults in mind) on children’s development and mental health. Read alongside the Government’s April 2019 Online Harms White Paper (which, among other things, sets out plans to force social networks to block the publication of harmful material), the code makes the direction of travel in this area of privacy law and regulation clear.
The ICO’s code is built around 15 standards of age appropriate design, to be considered in relation to (for example) new games, apps, websites, streaming services and even connected toys likely to be used or accessed by children. The standards focus on key issues such as making children’s best interests a primary consideration in design and development, minimising the collection of children’s data, and ensuring that children are not encouraged to weaken their privacy protections.
The UK Information Commissioner, Elizabeth Denham, has acclaimed the code as a transformational and necessary response to children being “datafied” online by companies and organisations. By contrast, the Coalition for a Digital Economy (which represents UK tech start-ups) has criticised it as a “textbook example of bad regulation” that risks entrenching the dominant position of big tech companies, by stifling start-ups, which can less easily afford to develop multiple versions of their products for different audiences in response to increased regulation.
Ultimately, one thing is certain: this area of privacy law, with its broad range of vested interests and its capacity to trigger strong public and political reactions, is complex. The ICO’s code is nonetheless intended as a useful tool for encouraging the compliant development of age appropriate online services, and it is helpful that the ICO will enforce the code in a proportionate and risk-based manner.
That said, the way in which online services are governed is undoubtedly evolving, and this code marks just a small step in the seemingly inevitable move towards more stringent protection of children’s online privacy.