THE LENS
Digital developments in focus

Children’s privacy in focus: ICO fines Reddit £14.47m

Whereas 2025 saw the ICO’s penalties firmly aimed at data security and cyber breaches, this year has seen two significant fines issued to online services, Imgur (£247,590) and Reddit (£14.47m), highlighting the regulator’s long-standing focus on children’s privacy and online safety. In this blog we draw out some key lessons from these fines and consider the outlook for enforcement in this area.

High potential penalties

The £14.47m fine for Reddit is the ICO’s highest in nearly three years, since the £14.5 million penalty issued to TikTok in April 2023, also for children’s privacy failings (see this blog). We don’t yet have the full monetary penalty notice for the Reddit action, but in its Reddit announcement the ICO notes the following as relevant factors in setting the penalty at this amount: the number of children impacted, the degree of potential harm caused, Reddit’s global turnover and the duration of the infringements. Similar factors were also listed for Imgur.

T&Cs and ‘self-declaration’ of age are insufficient

A key failure in both the Reddit and Imgur cases is that the platforms relied solely on a prohibition or restriction in their terms and conditions (T&Cs) on access by children under 13, although Reddit subsequently added age assurance measures in July 2025. The ICO has made clear that relying on T&Cs alone is not sufficient to prevent children’s access to a service; other measures, such as an effective age assurance mechanism, are needed.

In relation to Reddit, the ICO emphasises that age verification by self-declaration alone is also insufficient. Reddit had only required users to declare their age upon opening an account, with no further checks to verify this. The ICO previously warned Reddit that this approach was easy for children to bypass. In his statement accompanying the fine, Information Commissioner John Edwards confirms that the ICO will now be focusing on platforms relying on self-declaration age assurance. Industry should therefore take note and update practices accordingly.

Child-focused DPIAs are essential

In both the Imgur and Reddit actions, the ICO highlights that the services had not carried out a data protection impact assessment (DPIA) focused on the privacy risks to children, despite children between 13 and 18 being allowed to use the platforms. Carrying out a DPIA to specifically identify and address risks to children is a requirement under the ICO’s Children’s Code where children are likely to access the service.

Lawfulness

Both the Reddit and Imgur announcements identify that the controllers lacked a legal basis for processing the personal data of children under 13. Neither service obtained parental consent for the processing. In the Reddit statement, the ICO reasons that Reddit’s failure to implement any robust age assurance mechanism resulted in the platform having no lawful basis for processing under-13s’ data.

Outlook

The fines come hot on the heels of the ICO publishing a progress update on its Children’s Code strategy (the Progress Update) in December last year, which confirms that safeguarding children’s privacy is a key priority for the regulator, as reiterated by the Information Commissioner at the IAPP conference in London earlier this week. The Progress Update outlines the industry engagement work the ICO has been doing in this space, including:

  • securing improvements or confirming good practice in 10 platforms’ approaches to children’s privacy settings, including Twitch, Vibe and Hoop;
  • reviewing age assurance practices of platforms popular with children, including Discord, Pinterest and X; and
  • initiating engagement with Snap and Meta about the processing of children’s geolocation data.

These developments, and the recent fines, confirm the importance of ongoing compliance with the UK GDPR regime (and the ICO’s statutory Children’s Code guidance) for organisations operating in the online space whose services are ‘likely to be accessed’ by children — a category which is, of course, much broader than services targeting children.

Similarly, in the EU, the EDPB chose children’s data as the topic to highlight in its Data Protection Day press release earlier this month, noting that it is working on guidance on processing children's data.

The ICO’s and EU DPAs’ work in this area runs in parallel with Ofcom’s enforcement of the Online Safety Act (OSA) in the UK and with enforcement of the Digital Services Act (DSA) in the EU. The ICO has confirmed that it continues to work closely with Ofcom to coordinate its approach to protecting children online, whilst in the EU the EDPB has a seat at the relevant working group of the Digital Services Board (the EU forum to facilitate co-operation and consistent enforcement of the DSA). Ofcom has outlined that its OSA enforcement will focus on age assurance, child protection and risk assessments in 2026 (see this blog).

With two active regulators in the UK highlighting their focus in these areas, and age assurance technologies advancing apace, organisations should be monitoring developments closely and considering whether their practices need refreshing, especially given that further enforcement seems likely.


Tags

data, digital regulation, dp, dsa, osa