Privacy advocates have filed a federal complaint against TikTok, a mobile application popular among children, teenagers, and young adults.
According to Reuters, the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood both asked the Federal Trade Commission to investigate TikTok for violating consent decrees and children’s privacy online.
The complaint alleges that TikTok broke federal law when it failed to take down all videos made by children under the age of 13.
TikTok had in fact agreed to remove any such videos under the terms of a consent agreement signed with the FTC last winter. In approving the agreement, the FTC said that TikTok—then known as Musical.ly—knew that underage children were using its platform.
Federal law dictates that social media companies and marketers cannot collect the personal information of children under 13 without first obtaining parental consent. TikTok nevertheless collected children’s names, e-mail addresses, and other sensitive data.
The FTC, in response, levied a small fine against the company, requiring it to pay $5.7 million for privacy violations.
In a statement regarding the latest allegations, TikTok spokeswoman Hillary McQuaide maintains that the China-based company takes privacy “seriously.”
“We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users,” McQuaide said.
But The Center for Digital Democracy and its allies say that, despite signing off on the FTC’s consent agreement, TikTok never deleted personal information about users age 12 and younger.
“We found that TikTok currently has many regular account holders who are under age 13, and many of them still have videos of themselves that were uploaded as far back as 2016, years prior to the consent decree,” the groups said in a complaint.
Michael Robinson, a staff attorney at the Institute for Public Representation at Georgetown Law who’s representing privacy groups in the action, found that many children’s accounts are highly visible.
“We easily found that many accounts featuring children were still present on TikTok,” Robinson said. “Many of these accounts have tens of thousands to millions of followers, and have been around since before the order.”
TikTok, adds Reuters, does allow children under 13 to create accounts—but their accounts are restricted, both in terms of the content they can upload and how they are allowed to interact with other users.
The CDD and CFC have both said that letting children continue to make accounts, even if those accounts have limited functionality, is a breach of the Children’s Online Privacy Protection Act. That’s because TikTok still collects information from its underage users, which is then passed on to third-party marketers.
Both groups asserted that coronavirus mitigation efforts, like shelter-in-place orders, mean that more children are using TikTok more regularly.
“More than a year later,” the groups said, “with quarantined kids flocking to the site in record numbers, TikTok has failed to delete personal information collected from children and is still collecting kids’ personal information without notice to and consent of parents.”
Furthermore, it’s not particularly difficult for children to bypass TikTok’s age requirements—anyone under 13 can simply say they’re older.
“TikTok continues to be one of the most popular apps in the world, and it is widely used by children and teens in the United States, so it is especially important that the FTC promptly and thoroughly investigate TikTok’s practices and take effective enforcement action,” the groups said.
But neither the CDD nor the CFC has suggested how TikTok might efficiently verify young users’ ages.