A bipartisan group of lawmakers is urging Attorney General Merrick Garland to sue TikTok for violating the Children’s Online Privacy Protection Act (COPPA), a law meant to safeguard children’s personal information on the internet.
Rep. Tim Walberg, a Michigan Republican, and Rep. Kathy Castor, a Florida Democrat, along with Democratic Sen. Ed Markey of Massachusetts and Republican Sen. Bill Cassidy of Louisiana, signed a letter last week calling for the Department of Justice to “act expeditiously” on a case referred by the Federal Trade Commission (FTC) last month. The FTC’s complaint alleges TikTok and its parent company ByteDance are violating COPPA and breaching the terms of a 2019 settlement for previous violations of the same law.
During the Trump administration, the FTC brought a COPPA suit against Musical.ly—a ByteDance-owned app that merged into TikTok midway through the proceeding. The FTC alleged that Musical.ly had violated COPPA by collecting and using personal information from users younger than 13 without notifying or obtaining consent from their parents, and ignoring parental requests to delete their children’s information. The FTC referred the case to the DOJ to seek civil penalties.
In 2019, Musical.ly (which by then had become part of TikTok) paid a settlement of $5.7 million. FTC officials touted it as “the largest civil penalty ever obtained by the Commission in a children’s privacy case,” though it amounted to less than 1 percent of what ByteDance paid to acquire Musical.ly two years earlier. In addition to the fine, Musical.ly agreed to delete the “ill-gotten” data of children under 13 and to follow the law going forward. The company also submitted to compliance reviews, a process enabling FTC regulators to continuously monitor the company’s adherence to COPPA’s restrictions and the conditions of the order.
The settlement took place before TikTok became a household name, barely making headlines. “Back then, when TikTok was Musical.ly, the company didn’t have the salience it does now,” said a former DOJ official who had overseen the settlement. “It wasn’t as big, it wasn’t the app that everybody is on.”
Last month, the FTC alleged that TikTok was violating its 2019 order, which could bring heavier fines—in 2019, the FTC hit Facebook with a whopping $5 billion penalty for violating a 2012 order. An FTC official confirmed that the agency is charging TikTok on these grounds. Though the filings have not yet been made public, the FTC deviated from standard protocol when it announced its referral of the case to the DOJ last month, stating that transparency was in the public interest. “That’s not typical,” the former DOJ official said. “They don’t normally announce a referral and instead just send it to the [DOJ]. One potential explanation is that they view this as extremely serious misconduct. They want it dealt with quickly and seriously.”
Congress passed COPPA in 1998, and the FTC’s COPPA Rule went into effect in 2000. It was meant to give parents control over the online collection of personal information from their children (defined by the law as anyone under age 13). The law applies to operators of digital platforms intended for children, such as gaming websites, as well as any digital platform with “actual knowledge” that children are using it. COPPA requires these operators to inform parents about the type of personal information they collect from their children, how it is used, and when it might be disclosed. They must also obtain parental consent before collecting, using, or disclosing a child’s personal information. Under COPPA, operators must give parents the opportunity to review the personal information collected from their child. They must also honor a parent’s request to delete their child’s information and cease further collection or use.
The FTC is responsible for writing and enforcing granular rules within COPPA’s statutory framework. In the more than two and a half decades since COPPA’s passage, the FTC has modified the COPPA Rule only once—in 2013, to account for increased use of social media and mobile devices. Though the FTC oversees enforcement of COPPA, it cannot sue violators for civil penalties under the law without first referring the case to the Department of Justice.
Since December, the FTC has been pursuing updates to the COPPA rule that would strengthen security requirements for sensitive personal information, limit data retention, require a separate opt-in for targeted advertising, and prevent operators from sending push notifications that nudge kids to stay online.
The Children and Teens’ Online Privacy Protection Act, known as “COPPA 2.0,” has bipartisan support in both houses of Congress. The bill would expand the law to cover those up to age 16, close loopholes allowing platforms to ignore the presence of underage users, ban targeted advertising to children and teens, and make it easier for them or their parents to delete their data.
The FTC’s case against TikTok—and the support it has attracted in Congress—reflects a growing consensus about the need to protect children’s privacy from tech giants. COPPA has become critical in this increasingly combative digital landscape. In a recent House Innovation, Data, and Commerce Subcommittee hearing, FTC Commissioner Andrew N. Ferguson described consumer protection laws (such as COPPA) as “one of the last available avenues to address the many challenges posed by big tech.”
In recent years, the FTC has secured much larger civil penalties under COPPA. Epic Games, the developer of Fortnite, paid $275 million for COPPA violations in a 2022 settlement. In 2023, Amazon paid $25 million to settle a COPPA suit involving Alexa devices, and Microsoft paid a $20 million fine for COPPA violations pertaining to children’s Xbox accounts. With enforcement of the statute growing more vigorous, and mounting public scrutiny on TikTok, it seems unlikely ByteDance will escape with another slap on the wrist.