TikTok Fined £12.7m for breaching UK data protection laws and children's data rights

Last month the Information Commissioner's Office (ICO) issued a £12,700,000 fine to TikTok for a number of breaches of UK data protection law, including a failure to use children's data lawfully. The Information Commissioner stated that children's data "may have been used to track children and profile them, potentially delivering harmful inappropriate content with their next scroll."

The ICO estimated that up to 1.4 million UK children under 13 were permitted to use the TikTok platform, despite TikTok's own rules prohibiting children of that age from creating an account. TikTok effectively used personal data belonging to children under 13 without parental consent.

The ICO also found that TikTok failed to carry out adequate checks to identify underage children and prevent them from accessing and using its platform. The investigation found that this concern was raised with senior employees at the organisation, and that TikTok did not take adequate steps to remove children who were using the platform in breach of TikTok's own terms of use.

Businesses offering information society services to children under 13 must obtain consent from their parents or carers. They must also treat the 'best interests of the child' as the primary consideration, a concept derived from the United Nations Convention on the Rights of the Child (UNCRC).
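In practice, the under-13 consent requirement translates into an age gate at sign-up. The sketch below is a minimal, hypothetical illustration of that rule (the `SignupRequest` type, threshold constant and function names are assumptions for illustration, not any real platform's API):

```python
from dataclasses import dataclass

# Assumed threshold: UK GDPR sets 13 as the age below which an information
# society service needs consent from a parent or carer.
DIGITAL_AGE_OF_CONSENT = 13


@dataclass
class SignupRequest:
    age: int
    parental_consent: bool = False  # verified consent from a parent or carer


def may_create_account(req: SignupRequest) -> bool:
    """Allow account creation only if the user meets the age threshold,
    or a parent/carer has given verifiable consent on their behalf."""
    if req.age >= DIGITAL_AGE_OF_CONSENT:
        return True
    return req.parental_consent
```

The point of the sketch is that the default answer for an under-13 user without recorded parental consent must be "no": the check fails closed rather than open.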

Some factors online businesses offering services to children should consider include:

· What’s the age range of people who use our service?

· What do we know about the age of individual users?

· How much personal data do we need?

· Should we be sharing personal data?

· Is it fair to children to use their personal data in that way? and

· When we use children's personal data, how might it affect their privacy, health and wellbeing?

An assessment of the above factors should be undertaken by any organisation offering online services that could reasonably be used by children.

Providing services with default settings that limit data sharing and protect privacy can also help businesses conform to their data protection obligations. Developers can also enable settings for behavioural advertising and location tracking to be switched 'off'. These are just some of the ways businesses can put the best interests of the child first. The ICO has since published its Children's Code to help protect children in the digital world, giving businesses guidance on designing online services with children in mind. It translates the General Data Protection Regulation (GDPR) by helping businesses understand what is expected of them.
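The privacy-by-default idea above can be sketched as a settings object whose protective options all start 'off' and must be deliberately opted into. This is a minimal illustration under assumed names (`ChildPrivacySettings` and its fields are hypothetical, not drawn from any real product):

```python
from dataclasses import dataclass


@dataclass
class ChildPrivacySettings:
    """Privacy-protective defaults for accounts that may belong to children:
    data sharing, behavioural advertising and location tracking all start
    disabled, so any collection requires an explicit opt-in."""
    share_data_with_third_parties: bool = False
    behavioural_advertising: bool = False
    location_tracking: bool = False

    def enabled_features(self) -> list[str]:
        # List only the features that have been deliberately switched on.
        return [name for name, on in vars(self).items() if on]
```

A freshly created `ChildPrivacySettings()` reports no enabled features, which is the behaviour the Children's Code's "high privacy by default" standard asks for.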

Although not legally binding, businesses that do not follow the code's guidance could face fines of up to 4% of global turnover, as well as enforcement action including compulsory audits and stop-processing orders.

Conducting a data protection assessment to identify and mitigate any potential harm to children is vital. Beyond the GDPR, the Advertising Standards Authority prohibits the marketing of age-restricted products to children, and businesses involved in such marketing should exclude children's data from their processing from the outset. The UK Online Safety Bill is also being considered in Parliament at the Committee stage.

The stance taken by the ICO on the protection of children's personal data indicates the direction the ICO plans to take on the application and enforcement of the UK GDPR.

Data and privacy policies are often an area of business that is not weighed against risk factors, but as the information age continues to grow and data becomes an ever more valuable business asset, data protection laws should be considered and assessed.
