ICO brings enforcement action against social media platforms misusing children’s personal data

Organisations using video content that is accessed by children must comply with strict safeguarding measures or face significant penalties.

Research by the Information Commissioner’s Office (ICO) found that 96% of children between the ages of 3 and 17 regularly watch videos on video sharing platforms.

Therefore, it’s unsurprising that the ICO’s focus for the rest of 2024 and into 2025 is to clamp down on social media and video sharing platforms that do not have the necessary safeguarding measures in place for their core audience: child users.

Children warrant special protection when it comes to processing their personal data, as they are less aware of the risks involved. The ICO has published the Children’s Code, consisting of 15 standards that Information Society Services (ISS) are expected to comply with.

Such compliance would also help services conform with the General Data Protection Regulation (GDPR).

One concern for the ICO is that, under the GDPR, children under the age of 13 cannot legally consent to an ISS processing their personal data. For under-13s, consent must be obtained from an individual with parental responsibility for the child.

The ICO is concerned about how organisations are obtaining this consent, and about the potential harm to children who access content on these platforms where consent has not been obtained.

Evidently, children under the age of 13 are accessing social media platforms. The ICO found that there were 1.4 million users under the age of 13 with accounts on TikTok, a platform which, the ICO reported, “did not do enough” to check who was using it and to remove underage children.

The platform was ultimately fined £12.7 million for its non-compliance with the GDPR.

The ICO is warning organisations of all sizes that process children’s data to comply with the Children’s Code and the GDPR or risk facing enforcement action, such as penalties (fines), assessment notices, warnings, reprimands and enforcement notices.

What steps organisations can take to comply

Organisations first need to assess the likelihood of children using their platforms, even those designed for adults. They can implement age verification of users at sign-up: Instagram, for example, has implemented technology that can confirm a person’s age based on a video selfie.

The ICO recognises that, for other organisations, verification mechanisms may depend on the technology available to them, but putting measures in place remains crucial.

Further measures can include verifying the relationship between guardian and child, for example through third-party verification services.

An organisation must make reasonable efforts to obtain parental consent and is advised to carry out a data protection impact assessment (DPIA). At Napthens, we undertake DPIAs for organisations to assess their compliance and risk areas.

The ICO also suggests that organisations should minimise the data they collect, retain it no longer than necessary and protect it adequately.

Next steps

Napthens advises on all aspects of data protection compliance – contact us.

Shannon Adamson | Trainee Solicitor

Shannon Adamson is a trainee solicitor within the team, based in the firm's Preston office.