Children’s data compliance in the UK – not child’s play

Viewpoints
April 6, 2023
4 minutes

On 4 April 2023, the UK data protection regulator (the ICO) announced that it had fined TikTok 12.7 million GBP for misusing children’s data. This is the ICO’s third-highest fine issued to date; beyond reflecting the additional considerations required for the processing of children’s data, it may also be indicative of further regulatory action to come on children’s data compliance.

Background 

On 26 September 2022, the ICO issued TikTok with a notice of intent (a document that precedes a potential fine) setting out the ICO’s provisional view that TikTok had breached UK data protection laws from May 2018 to July 2020. According to the ICO, TikTok had potentially processed the data of up to 1.4 million UK children under the age of 13 without appropriate parental consent; failed to provide proper information to its users in accordance with transparency requirements; and processed special category data without an appropriate lawful basis. On this basis, the ICO was prepared to issue a fine of 27 million GBP. Subsequently, after taking into consideration representations from TikTok, the ICO decided not to pursue the provisional finding that TikTok had processed special category data without an appropriate lawful basis, and issued a finalized fine of 12.7 million GBP.

Commentary 

This fine is indicative of two enforcement trends by the ICO:

  • Higher initial fines in notices of intent compared to the final penalty. The finalized fine issued to TikTok is significantly lower than the fine initially proposed in the ICO’s notice of intent, a reduction of more than 50%. A similar pattern can be seen in other notable ICO fines: the fine imposed on Clearview AI was approximately 50% lower than the figure proposed in the corresponding notice of intent, and the fines imposed on Marriott and British Airways were approximately 80%-90% lower. The reduction in the TikTok fine may reflect the ICO taking into account the public scrutiny and pressure on the organization following publication of the proposed fine in the notice of intent; in the British Airways case, the ICO’s penalty notice treated the widespread media reporting and the adverse impact on brand and reputation as mitigating factors when determining the amount of the finalized fine. However, the ICO calculates each fine on a case-by-case basis, and it has not yet published its penalty notice against TikTok. Once published, it will be interesting to see how much of the fine was reduced on account of reputational impact, and what weight the ICO placed on the (now dropped) finding of unlawful processing of special category personal data, as opposed to more general processing.

  • Increased regulatory focus on children’s data. The TikTok fine follows statements by the ICO that indicate an increased regulatory focus on children’s data: in its 2021-2022 annual report, the ICO stated that it had “transitioned from encouraging compliance and working alongside industry to identifying areas which would benefit from closer scrutiny and potential enforcement”, and that it would continue to do so in the coming year. The ICO also reported that education and childcare was the industry sector generating the second-highest number of personal data breaches in the UK (after healthcare). It further indicated that, as of September 2022, it was examining over 50 online services for compliance with its age appropriate design code, with six ongoing investigations into organizations that have not “taken their responsibilities around child safety seriously enough.” Further regulatory action relating to children’s data compliance may therefore be seen in the year ahead.

Other considerations when processing children’s data

In the EU and UK, children’s data merit special protection under data protection law. The vulnerability of children means that organizations should implement additional measures to protect children’s data; such measures include:

  • incorporating principles of privacy by design and default (for example, organizations that provide online services or products to children should set privacy configurations to the highest setting by default);
  • conducting a data protection impact assessment to identify and mitigate any potential harm to children;
  • ensuring children are provided information in a concise, transparent and easily understood way; and
  • ensuring valid consent is obtained from children, or from the holder of parental responsibility.

Organizations should also consider jurisdiction-specific guidance on the processing of children’s data. Organizations that are established in the UK, or that process the personal data of UK children, should consider whether they can comply with the standards in the UK age appropriate design code; although the code is not legally binding, the ICO will take it into account when assessing whether an organization has complied with its obligations under data protection law. Similar guidance has been published by other data protection regulators, such as the Irish Data Protection Commission, and organizations should consider such guidance to the extent it applies to their processing.

Beyond data protection law, sector-specific regulations may also apply to the processing of children’s data. For example, the Advertising Standards Authority prohibits the marketing of age-restricted products (such as alcoholic drinks, gambling products, or electronic cigarettes) to children; organizations involved in such marketing should therefore exclude children’s data from their processing from the outset.