The U.S. Federal Trade Commission and the Justice Department are suing TikTok and its parent company, ByteDance, for violating the Children’s Online Privacy Protection Act (COPPA). The law requires digital platforms to notify parents and obtain their consent before collecting and using personal data from children under the age of 13.
In a press release issued Friday, the FTC’s Bureau of Consumer Protection said that TikTok and ByteDance were “allegedly aware” of the need to comply with COPPA, yet spent “years” knowingly allowing millions of children under 13 on their platform. TikTok did so, the FTC alleges, even after settling with the FTC in 2019 over COPPA violations; as part of that settlement, TikTok agreed to pay $5.7 million and implement measures to prevent kids under 13 from signing up.
“As of 2020, TikTok had a policy of maintaining accounts of children that it knew were under 13 unless the child made an explicit admission of age and other rigid conditions were met,” the FTC wrote in the press release. “TikTok human reviewers allegedly spent an average of only five to seven seconds reviewing each account to make their determination of whether the account belonged to a child.”
TikTok and ByteDance maintained and used underage users’ data, including data for ad targeting, even after employees raised concerns and TikTok reportedly changed its policy so that it no longer required an explicit admission of age, according to the FTC. More damningly, the FTC adds, TikTok continued to allow users to sign up with third-party accounts, like Google and Instagram, without verifying that they were over 13.
The FTC also took issue with Kids Mode, TikTok’s supposedly more COPPA-compliant mobile experience. Kids Mode collected “far more data” than needed, the FTC alleges, including information about users’ in-app activities and identifiers that TikTok used to build profiles (and shared with third parties) to try to prevent attrition.
When parents requested that their children’s accounts be deleted, TikTok made the process difficult, the FTC said, and often failed to comply with those requests.
“TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” FTC chair Lina Khan said in a statement. “The FTC will continue to use the full scope of its authorities to protect children online — especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”
TikTok had this to share with TechCrunch via email: “We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed. We are proud of our efforts to protect children, and we will continue to update and improve the platform. To that end, we offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screen time limits, Family Pairing, and additional privacy protections for minors.”
The FTC and Justice Department are seeking civil penalties of up to $51,744 per violation per day against TikTok and ByteDance, as well as a permanent injunction to prevent future COPPA violations.