
Coalition tells the FTC: Time is up for TikTok


The Parent Coalition for Student Privacy is one of twenty advocacy, consumer, and privacy groups that filed a May 14, 2020 complaint with the Federal Trade Commission (FTC), asking the agency to investigate and sanction TikTok, formerly known as Musical.ly, for continuing to violate COPPA, the Children’s Online Privacy Protection Act. The complaint argues that TikTok continues to collect and store children’s personal information without notice to parents or their consent, in violation of the FTC’s 2019 consent order.

If you are not familiar with TikTok, it is a very popular social media app, with 800 million users worldwide, many of them children. TikTok allows users to record and upload videos of themselves dancing and singing, and the app has more downloads than Facebook. As this Manchester Evening News piece points out, the recommended age is 12 and up, but “online safety experts say it has been designed with the young user in mind and has a very addictive appeal.”

Why this complaint is important

Because TikTok is a popular platform for children, parents worry that it is not safe and that it puts kids at risk of sexual predation. For example, this father warned other parents after his 7-year-old daughter was asked to send nude pictures of herself on TikTok. In another instance, a 35-year-old Los Angeles man was allegedly targeting girls by posing as a 13-year-old boy on TikTok and engaging in “sexual and vulgar conversations with at least 21 girls, some were as young as 9.” This February 2020 piece in Parents says, “TikTok allows users to contact anyone in the world, and this comes with its own host of hazards.” The Parents piece goes on to point out that “kids can be targeted by predators, it’s easy to encounter inappropriate content,” and “Even if you set your own account to private, you may still be exposed to sexual or violent content posted to the public feed.”

There are many more concerning examples of underage TikTok use cited in the complaint. And as the complaint notes, it is easy for a child to fake their date of birth and sign up for an adult TikTok account.

Data is money. Children’s data is valuable, predictive, and can be used to profile the user. As the complaint states,

“TikTok collects vast amounts of personal information including videos, usage history, the content of messages sent on the platform, and geolocation.  It shares this information with third parties and uses it for targeted advertising.”

Parents want to know how TikTok is using their children’s data. TikTok, which is owned by Bytedance, uses artificial intelligence (AI) and facial recognition. Per this 2018 Verge article,

“A Bytedance representative tells The Verge that TikTok makes use of the company’s AI technologies in various ways, from facial recognition for the filters through to the recommendation engine in the For You feed. ‘Artificial intelligence powers all of Bytedance’s content platforms,’ the spokesperson says. ‘We build intelligent machines that are capable of understanding and analyzing text, images and videos using natural language processing and computer vision technology. This enables us to serve users with the content that they find most interesting, and empower creators to share moments that matter in everyday life to a global audience.’”
TikTok also uses persistent identifiers to track kids, and its algorithms create profiles of children. Per the complaint,


“TikTok uses the device ID and app activity data to run its video-selection algorithm. When a child scrolls away from the video they are watching, TikTok’s algorithm uses artificial intelligence to make sophisticated inferences from the data TikTok collects to present the next video. The algorithm ‘entirely interprets and decides what the user will watch instead of presenting a list of recommendations to the users like Netflix and YouTube.’


Using personal information in this manner exceeds the limited exceptions for personalization of content. The COPPA Rule is quite clear that information collected to support internal operations may, under no circumstances, be used “to amass a profile on a specific individual.”


Yet TikTok does, indeed, amass a profile of each user—including child users—and draws upon that profile to suggest videos of interest to the user. That profile may be based in part on users’ overt behavior, such as liking videos. However, TikTok also appears to amass user profiles based on passive tracking. As reported in The New Yorker, ‘Although TikTok’s algorithm likely relies in part, as other systems do, on user history and video-engagement patterns, the app seems remarkably attuned to a person’s unarticulated interests.’ Another article observed that the algorithm ‘goes right to the source using AI to map out interests and desires we may not even be able to articulate to ourselves.’ The profiles that TikTok amasses on its users are designed to be used not only to curate which user-generated videos appear in each user’s stream, but also to assist with advertising.” [Emphasis added]
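The mechanics the complaint describes can be hard to picture, so here is a deliberately simplified sketch of how a persistent device identifier plus passively collected watch-time signals could accumulate into a per-device interest profile that drives what plays next. This is an illustration only, not TikTok’s actual code; every name and signal in it (the device ID string, the category weights, the “trending” fallback) is hypothetical.

```python
# Illustrative sketch only: a toy model of how a persistent device identifier
# plus passively collected viewing signals could be turned into an interest
# profile that selects the next video. NOT TikTok's actual code; all names
# and signals here are hypothetical.

from collections import defaultdict


class ToyProfileStore:
    """Keeps a running interest profile keyed by a persistent device ID."""

    def __init__(self):
        # device_id -> {video category -> accumulated engagement score}
        self.profiles = defaultdict(lambda: defaultdict(float))

    def record_view(self, device_id, category, watch_seconds, video_length):
        # Passive signal: no "like" or sign-up needed -- merely how long the
        # user watched, relative to the video's length, updates the profile.
        completion = min(watch_seconds / video_length, 1.0)
        self.profiles[device_id][category] += completion

    def next_category(self, device_id):
        # Pick whichever category the profile scores highest. A real system
        # blends far more signals, but the point is the same: the device ID
        # ties every session back to one accumulating profile.
        profile = self.profiles[device_id]
        if not profile:
            return "trending"  # cold start: show generic popular content
        return max(profile, key=profile.get)


if __name__ == "__main__":
    store = ToyProfileStore()
    # The same device ID persists across sessions, with no consent step.
    store.record_view("device-123", "dance", watch_seconds=14, video_length=15)
    store.record_view("device-123", "pets", watch_seconds=2, video_length=20)
    store.record_view("device-123", "dance", watch_seconds=15, video_length=15)
    print(store.next_category("device-123"))  # -> "dance"
```

The point of this toy example is the one the complaint makes: no account, “like,” or parental consent step is needed for the profile to grow; the persistent identifier alone ties every session back to the same child.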


It’s time the FTC uses its power to protect children and enforce COPPA. The FTC should investigate TikTok and ensure it is in compliance with COPPA and its consent decree. If TikTok is found in violation, the FTC should take action and sanction TikTok again, with a fine that is proportionate to the degree of its violations.


We are grateful to the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), the Institute for Public Representation at Georgetown University Law Center, and many others for their work on this complaint.


Here is CCFC’s full press release. Additional coverage of the TikTok complaint appeared in the New York Times, the Financial Times, Politico, Morning Tech, and Reuters.