TikTok has been fined £12.7m by the Information Commissioner’s Office for not doing enough to protect the privacy of children.
The UK’s data watchdog found the online video platform had used the data of children under 13 without parental consent.
Despite setting a minimum age of 13, the ICO estimated that around 1.4 million UK children under that age were allowed to use the platform in 2020.
Information commissioner John Edwards said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.
“As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data.
“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had.”
While the fine is one of the largest the ICO has issued, it is significantly lower than previously anticipated: the notice of intent TikTok received from the watchdog had said the company could face a £27m fine for these breaches.
This is not the first time the online video platform has run into trouble over children’s privacy. In 2019, the US Federal Trade Commission handed the company a record $5.7m fine for improper data collection from children under 13.
In another case, in 2020, South Korea’s data watchdog found TikTok had collected data of children under 14 without the consent of legal guardians and fined the company 186m won (about £123,000).
In 2021, the European Commission gave TikTok one month to respond to allegations that it had failed to protect children from aggressive advertising on its platform.
In one of the most serious cases of children being endangered on the platform, Italy’s data protection authority imposed an immediate temporary block on TikTok’s access to data for any user whose age could not be verified. The move followed a ruling that a 10-year-old girl had died of asphyxiation while taking part in a dangerous user-generated TikTok challenge.
Responding to the fine, TikTok said:
“TikTok is a platform for users aged 13 and over. We invest heavily to help keep under 13s off the platform and our 40,000 strong safety team works around the clock to help keep the platform safe for our community. While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”
The company also stated that when someone fails to meet its minimum age requirement, it suspends their ability to re-create an account using a different date of birth.
TikTok further noted that it regularly releases figures on the number of accounts it removes that are suspected to belong to users under 13; this number exceeded 17 million in the last three months of 2022. The company also argued that consent is not an appropriate legal basis on which to offer its service, and that parental consent applies only when consent is the legal basis.