Tag Archives: FTC

Google Lawsuit, COPPA, Investigating and Blocking Ad Trackers in Children’s Apps

Google cannot escape COPPA lawsuit

There was some big news last week on the children’s privacy front: a District Judge has ruled that Google cannot dodge a lawsuit brought by the New Mexico Attorney General. An earlier ruling had said the case couldn’t proceed, but thanks to this decision, Google will face claims that it and the ad networks it employed had actual knowledge that apps hosted in the “Designed for Families” section of the Google Play Store were targeting and marketing children’s data, in violation of COPPA, the Children’s Online Privacy Protection Act. The apps in question are owned by Tiny Lab Productions.

This court case will be significant in highlighting how apps use cookies and advertising tools to track children across the web. As the decision explains:

“Tiny Lab Productions (“Tiny Lab”), a Lithuanian company, is a developer of child-directed, mobile game apps including Fun Kid Racing, Candy Land Racing, Baby Toilet Race: Cleanup Fun, and GummyBear and Friends Speed Racing. AdMob [AdMob is owned by Google], Twitter/MoPub, InMobi/AerServ, Applovin, and ironSource (collectively, the “Ad Networks”) sold their proprietary software development kits (“SDKs”) to Tiny Lab for installation and use in its gaming apps. Id. ¶ 13. When a Tiny Lab app is downloaded onto a child’s device in New Mexico, the Ad Networks’ SDKs are also installed as app components. Id. ¶ 5. Once so embedded, while a child in New Mexico plays one of the apps, the Ad Networks’ SDK collects personal information about that child and tracks the child’s online behavior to profile the child for targeted advertising. Id. ¶¶ 43-46. This activity is invisible to the child and her parents” [emphasis added]

Think of an advertising SDK as carrying a unique tag, an “Identifier for Advertisers,” that identifies the user and follows him or her across the internet: it allows advertisers to see what sites the user visits, and it stays embedded on the device even after the original app is closed. Ad tracking tools like cookies, persistent beacons, and fingerprinting can be installed on a child’s device when they download an app or edtech platform, and none of this is transparent to the student, the teacher, or the parent. We know apps track us, but it is not always easy to see how, or what they do with our data.
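To make the mechanism concrete, here is a minimal Python sketch of the tracking pattern described above. Everything in it is invented for illustration (the `AdNetworkServer` class, `record_event`, the `IDFA-1234` identifier); it is not any real SDK’s code. The point it demonstrates is that because events from SDKs embedded in *different* apps all carry the same persistent device identifier, an ad network can stitch them into a single cross-app profile:

```python
from collections import defaultdict

class AdNetworkServer:
    """Hypothetical ad-network backend: one profile per persistent device ID."""

    def __init__(self):
        # device_id -> list of (app_name, event) pairs
        self.profiles = defaultdict(list)

    def record_event(self, device_id, app_name, event):
        # The same device_id arrives from SDKs embedded in different apps,
        # so events from unrelated apps accumulate under one identity.
        self.profiles[device_id].append((app_name, event))

    def profile_of(self, device_id):
        return self.profiles[device_id]

server = AdNetworkServer()
# A child plays two unrelated games; both embed the same ad SDK.
server.record_event("IDFA-1234", "Fun Kid Racing", "level_completed")
server.record_event("IDFA-1234", "Candy Land Racing", "ad_clicked")

# The ad network now sees one cross-app behavioral profile:
print(server.profile_of("IDFA-1234"))
# [('Fun Kid Racing', 'level_completed'), ('Candy Land Racing', 'ad_clicked')]
```

Neither the child nor the parent ever interacts with this server directly; the correlation happens entirely on the ad network’s side, which is why the tracking is invisible from the device.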

Several parents have asked us:

  • How often do apps use children’s information for marketing purposes? 
  • Do edtech apps use ad trackers? 
  • How would you know if your child’s app is using adware or ad trackers?
  • What can parents do?

Thankfully, others, including this bipartisan group of US Senators, are asking how edtech companies use children’s data. The Federal Trade Commission (FTC), which oversees COPPA, is also asking how online platforms use children’s data. In a move led by Commissioner Christine Wilson, the FTC announced in December 2020 that it is using its 6(b) authority to investigate several big tech companies that handle children’s data. A joint statement issued by the FTC says, “Despite their central role in our daily lives, the decisions that prominent online platforms make regarding consumers and consumer data remain shrouded in secrecy. Critical questions about business models, algorithms, and data collection and use have gone unanswered.”

We agree with Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, who told Bloomberg News: “These 6(b) studies will provide a much-needed window into the opaque data practices that have a profound impact on young people’s well-being.”

These FTC studies come at a time when many are also calling for COPPA to be updated. Currently COPPA only covers children 12 and under and is confusingly and inconsistently applied to schools. Through advisory guidance (though not regulation), the FTC has said that schools can consent in place of parents, but only if the app is used ONLY for educational rather than marketing purposes. [You can see the joint letter we sent the FTC with 23 organizations when they threatened to weaken COPPA, and you can also read our separate PCSP comments to the FTC here.]

COPPA says that websites and online services, including apps and general audience sites that have actual knowledge they are collecting data from children under 13, must get prior parental approval before collecting, using, or disclosing a child’s information. The FTC says this “includes a child’s name, address, phone number or email address; their physical whereabouts; photos, videos and audio recordings of the child, and persistent identifiers, like IP addresses, that can be used to track a child’s activities over time and across different websites and online services.” However, many agree that the “actual knowledge” standard should be updated to “constructive knowledge.” As the Google lawsuit above implies, constructive knowledge means the company had enough information that it knew, or reasonably should have known, that the app was directed towards children and that it was allowing the marketing of their personal data.

Why are companies allowed to use children’s data for advertising at all?  

Parents need transparency and control over how children’s data are collected and used. We believe children should be protected, not monetized or profiled by advertisers. We think that all advertising to children under the age of 18 by any app or program used in schools should be prohibited; any data gathered by these apps should be strictly used only for educational purposes.

Apple will prohibit automatic ad tracking

This idea of prohibiting ad tracking is not that novel. Last year Apple began requiring developers in its App Store to post Privacy Labels listing which types of data an app collects and how it uses that data. Now, Apple has announced a new transparency feature that will prevent apps from sharing your data with third parties unless you opt in. Apple’s white paper discussing the new policy and the prevalence of embedded trackers, entitled A Day in the Life of Your Data, is worth a look. As TechCrunch reports,

“The App Tracking Transparency feature moves from the old method where you had to opt-out of sharing your Identifier for Advertisers (IDFA) to an opt-in model. This means that every app will have to ask you up front whether it is ok for them to share your IDFA with third parties including networks or data brokers.”

“The feature’s most prominent evidence is a notification on launch of a new app that will explain what the tracker will be used for and ask you to opt-in to it. …app developers would have to ask users for permission in order to track and share their IDFA identifier for cross-property ad targeting purposes.”

This is how Apple describes the new system:

“Under Settings, users will be able to see which apps have requested permission to track, and make changes as they see fit. This requirement will roll out broadly in early spring with an upcoming release of iOS 14, iPadOS 14, and tvOS 14, and has already garnered support from privacy advocates around the world.”

Tools you can use to see trackers and block ads

There are several tools you can use to see and block trackers on your child’s device. Here are a few:   

  • Install the uBlock Origin tracker and ad blocker; it’s free, it shows you the trackers, and it blocks ads. We know of schools that have installed uBlock Origin on every student Chromebook to stop ad tracking in schools. Ask your school if they would be willing to install an ad blocker like uBlock Origin on school-issued devices. Go here to download uBlock Origin https://github.com/gorhill/uBlock#ublock-origin or here https://ublockorigin.com/ ; either of these links will ensure you are using Origin. Read more about uBlock Origin here. See an example (below) of the 14 trackers blocked while a student visited her College Board MyAP Classroom account.
  • The Markup’s Blacklight lets you paste website URLs into its analysis program to see what types of ads and trackers are being used. This tool gives a detailed analysis and even flags trackers that evade cookie blockers. https://themarkup.org/blacklight See an example (below) of the different kinds of trackers found on a student’s Google Classroom account.
  • Use a web browser that blocks ads: the Brave web browser blocks ads and reportedly loads pages quicker than Chrome. Firefox also blocks ads and has many privacy and security extensions.

Take our App Survey

In honor of World Data Privacy Day, on January 28, we launched an App Survey for parents, asking what apps your school uses and what privacy protections and transparency notifications are in place.  The response has been incredible and we encourage all parents to share and take this survey; of course your answers will remain confidential. Click here to take the survey and if you happen to install ad blockers, let us know what you find!  

uBlock and AP Classroom trackers

Blacklight and Google Classroom ad trackers

A Privacy Blueprint for Biden

Privacy And Digital Rights For All

The weakening of the Family Educational Rights and Privacy Act (FERPA) and the Covid-19 rush to usher in virtual learning and edtech in place of in-person instruction have created a perfect storm for student data collection and tracking. Students are increasingly subjected to edtech data collection, profiling, and surveillance as a condition of attending a public school. We call on the next administration to protect children and begin implementing these important recommendations within its first 100 days in office.

Leading privacy and civil rights advocates recently called on the next U.S. administration to make protecting digital privacy a top priority. The press release signed by Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Color of Change, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center, Privacy Rights Clearinghouse, Parent Coalition for Student Privacy, Public Citizen, and U.S. PIRG states:

“The Biden administration and the next Congress should make protecting digital privacy a top priority, and 10 leading privacy, civil rights and consumer organizations today released a memo of recommendations for executive actions on Day One, actions during the first 100 days and legislation.

“The United States is facing an unprecedented privacy and data justice crisis,” the blueprint memo reads. “We live in a world of constant data collection where companies track our every movement, monitor our most intimate and personal relationships, and create detailed, granular profiles on us. Those profiles are shared widely and used to predict and influence our future behaviors, including what we buy and how we vote. We urgently need a new approach to privacy and data protection. The time is now.”

“The U.S. urgently needs a comprehensive baseline federal privacy law. The Biden administration and Congress should not delay in setting out strong rights for internet users, meaningful obligations on businesses, and establishing a U.S. Data Protection Agency with strong enforcement powers,” said Caitriona Fitzgerald, policy director, Electronic Privacy Information Center.

“Privacy is a basic human right, and children’s personal information should not be profiled, licensed, sold, commercialized or shared with third parties as a condition of attending a public school. We hope policymakers will move to prohibit the use of student data for marketing purposes and require all public schools and education agencies to adopt strict security and privacy standards,” said Leonie Haimson, co-chair, Parent Coalition for Student Privacy.

“For far too long, companies have deceptively tracked kids and used their sensitive data to exploit their vulnerabilities and target them with marketing. Families are counting on the Biden administration and the next Congress to recognize that children and teens are vulnerable, and to put protections in place which will allow young people to use the internet more safely,” said David Monahan, campaign manager, Campaign for a Commercial-Free Childhood.

The recommendation memo, Privacy and Digital Rights for All, specifically calls for protection of children, teen, and student data, including parent consent before sharing student data:

Action item within the first year: Protect children and teens.

Action 8: Protect Children and Teens from Corporate Surveillance and Exploitative Marketing Practices

Recommendations for First 100 Days

  • Urge the FTC to begin 6(b) studies on ad tech and ed tech companies’ data practices and their impacts on children and teens before undertaking any rulemaking under the Children’s Online Privacy Protection Act (COPPA).
  • Protect students through an executive order that requires the Department of Education (DoE) to:
    o Prohibit the selling or licensing of student data;
    o Issue recommendations on transparency and governance of algorithms used in education; and
    o Minimize data collection on students, ensure parental consent is affirmatively obtained before disclosing student data, and issue rules enabling parents to access and also govern data on their child.

Recommendations for Legislative Action

  • Ensure children and teen privacy is legislatively protected as part of a comprehensive baseline federal privacy bill that:
    o Establishes the special status of children and teens as vulnerable online users; provides strong limits on collection, use, and disclosure of data; and narrowly defines permissible uses;
    o Requires privacy policies specific to children’s data on all sites and platforms used by children; and
    o Prohibits targeted marketing to children and teens under the age of 18 and profiling them for commercial purposes.
  • Strengthen COPPA by raising the covered age to 17 years and under, banning behavioral and targeted ads, banning the use of student data for advertising, and requiring manufacturers and operators of connected devices and software to prominently display a privacy dashboard detailing how information on children and teens is collected, transmitted, retained, used, and protected.

See more recommended principles for the protection of children and teens here.

It’s time for the U.S. to take data privacy seriously. Citizens should have consent and control over the collection and use of their data; “pay-for-privacy” provisions and “take-it-or-leave-it” terms of service should be prohibited. Finally, our most vulnerable, our children, should be protected, not exploited and surveilled as a condition of attending public school.

Coalition tells the FTC: Time is up for TikTok

The Parent Coalition for Student Privacy is one of twenty advocacy, consumer, and privacy groups that filed a May 14, 2020 complaint with the Federal Trade Commission (FTC), asking it to investigate and sanction TikTok, formerly Musical.ly, for continuing to violate COPPA, the Children’s Online Privacy Protection Act. The complaint argues that TikTok continues to collect and store children’s personal information without notice to and consent of parents, in violation of the FTC’s 2019 consent order.

If you are not familiar with TikTok, it is a very popular social media app with 800 million users worldwide, many of them children. TikTok allows users to record and upload videos of themselves dancing and singing, and the app has more downloads than Facebook. As this Manchester Evening News piece points out, the recommended age is 12-plus, but “online safety experts say it has been designed with the young user in mind and has a very addictive appeal.”

Why this complaint is important

Because TikTok is a popular platform for children, parents worry that TikTok is not safe and that it puts kids at risk of sexual predation. For example, this father warned other parents after his 7-year-old daughter was asked to send nude pictures of herself on TikTok. In another instance, a 35-year-old Los Angeles man allegedly targeted girls by posing as a 13-year-old boy on TikTok and engaging in “sexual and vulgar conversations with at least 21 girls, some were as young as 9”. This February 2020 piece in Parents says, “TikTok allows users to contact anyone in the world, and this comes with its own host of hazards”. The piece goes on to point out that “kids can be targeted by predators, it’s easy to encounter inappropriate content”, and “Even if you set your own account to private, you may still be exposed to sexual or violent content posted to the public feed.”

There are many more concerning examples of underage TikTok use cited in the complaint. And as the complaint notes, it is easy for a child to fake their date of birth and sign up for an adult TikTok account.

Data is money. Children’s data is valuable, predictive, and can be used to profile the user. As the complaint states,

“TikTok collects vast amounts of personal information including videos, usage history, the content of messages sent on the platform, and geolocation.  It shares this information with third parties and uses it for targeted advertising.”

Parents want to know how TikTok is using their children’s data.  TikTok, owned by Bytedance, uses Artificial Intelligence (AI) and facial recognition.  Per this 2018 Verge article,

 “A Bytedance representative tells The Verge that TikTok makes use of the company’s AI technologies in various ways, from facial recognition for the filters through to the recommendation engine in the For You feed. “Artificial intelligence powers all of Bytedance’s content platforms,” the spokesperson says. “We build intelligent machines that are capable of understanding and analyzing text, images and videos using natural language processing and computer vision technology. This enables us to serve users with the content that they find most interesting, and empower creators to share moments that matter in everyday life to a global audience.”
TikTok also uses persistent identifiers to track kids, and TikTok algorithms create profiles of children. Per the complaint:

“TikTok uses the device ID and app activity data to run its video-selection algorithm. When a child scrolls away from the video they are watching, TikTok’s algorithm uses artificial intelligence to make sophisticated inferences from the data TikTok collects to present the next video. The algorithm “entirely interprets and decides what the user will watch instead of presenting a list of recommendations to the users like Netflix and YouTube.”

“Using personal information in this manner exceeds the limited exceptions for personalization of content. The COPPA Rule is quite clear that information collected to support internal operations may, under no circumstances, be used “to amass a profile on a specific individual.”

“Yet TikTok does, indeed, amass a profile of each user—including child users—and draws upon that profile to suggest videos of interest to the user. That profile may be based in part on users’ overt behavior, such as liking videos. However, TikTok also appears to amass user profiles based on passive tracking. As reported in The New Yorker, “Although TikTok’s algorithm likely relies in part, as other systems do, on user history and video-engagement patterns, the app seems remarkably attuned to a person’s unarticulated interests.” Another article observed that the algorithm “goes right to the source using AI to map out interests and desires we may not even be able to articulate to ourselves.” The profiles that TikTok amasses on its users are designed to be used not only to curate which user-generated videos appear in each user’s stream, but also to assist with advertising.” [Emphasis added]
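The passive profiling the complaint describes can be sketched in a few lines of Python. This is a toy illustration under invented names (`update_profile`, `pick_next_video`, the topic labels), not TikTok’s actual algorithm: each watch event silently weights an interest profile tied to the device, and the next video is chosen from the strongest inferred interest, with no explicit “likes” required.

```python
from collections import Counter

def update_profile(profile, video_topic, watch_seconds):
    """Passively weight an interest by how long the user lingered on a video."""
    profile[video_topic] += watch_seconds
    return profile

def pick_next_video(profile, catalog):
    """Serve the video whose topic matches the strongest inferred interest."""
    if not profile:
        return catalog[0]  # no history yet: fall back to the first video
    top_topic = profile.most_common(1)[0][0]
    matches = [v for v in catalog if v["topic"] == top_topic]
    return matches[0] if matches else catalog[0]

# One device's watch history, keyed only by its persistent identifier.
profile = Counter()
update_profile(profile, "dance", 45)   # lingered on a dance video
update_profile(profile, "gaming", 5)   # scrolled quickly past a gaming video
update_profile(profile, "dance", 30)

catalog = [
    {"id": 1, "topic": "gaming"},
    {"id": 2, "topic": "dance"},
]
print(pick_next_video(profile, catalog))  # {'id': 2, 'topic': 'dance'}
```

Even this crude sketch shows why the practice amounts to amassing a profile: the system only works because behavioral data accumulates against a single persistent identity over time.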

It’s time for the FTC to use its power to protect children and enforce COPPA. The FTC should investigate TikTok and ensure it is in compliance with COPPA and its consent decree. If TikTok is found in violation, the FTC should take action and sanction TikTok again, with a fine proportionate to the degree of its violations.

We are grateful to the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), the Institute for Public Representation at Georgetown University Law Center, and many others for their work on this complaint.

Here is the Campaign for a Commercial-Free Childhood’s (CCFC) full press release. Additional coverage of the TikTok complaint can be seen in the New York Times, Financial Times, Politico, Morning Tech, and Reuters.