
Florida’s New Social Media Law, Explained

The measure targeting minors is sure to face legal challenges before going into effect.


On March 25, Florida Gov. Ron DeSantis signed Florida’s H.B. 3, titled “Online Protection for Minors,” placing significant restrictions on minors’ use of social media. The law prohibits children under the age of 14 from having social media accounts and allows 14- and 15-year-olds to have accounts only with parental permission.

Proponents of the bill argue it will prevent harm to minors, while opponents say it’s overreaching and unconstitutionally infringes on minors’ and adults’ free speech rights. The legal challenges that are sure to follow will determine whether Florida has cracked the code where other states have failed.

Who is subject to the law?

The law, which takes effect January 1, 2025, applies to Florida residents using a social media platform while in the state. It covers companies doing business in Florida whose platforms meet all of the following criteria: at least 10 percent of users under age 16 average two or more hours a day on the platform; the platform allows users to upload content or view other users’ content; it analyzes user data to select content for users; and it has at least one “addictive feature,” such as infinite scrolling, push notifications, metrics (such as likes and reposts), auto-play videos, or livestreaming.
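Read as a rule, the coverage test is a conjunction: every criterion must hold. The sketch below restates it as a hypothetical Python predicate; the thresholds and feature names follow the summary above, but the data model and function name are invented for illustration, not drawn from the statute.

```python
# Hypothetical restatement of H.B. 3's platform-coverage criteria as a
# boolean predicate. Thresholds and feature names follow the article's
# summary; the data model itself is invented for illustration.

ADDICTIVE_FEATURES = {
    "infinite_scrolling",
    "push_notifications",
    "engagement_metrics",  # e.g., likes and reposts
    "auto_play_video",
    "livestreaming",
}

def is_covered_platform(
    does_business_in_florida: bool,
    share_of_under_16_users_averaging_2h: float,  # fraction, 0.0 to 1.0
    allows_user_content: bool,          # users can upload or view others' content
    selects_content_via_user_data: bool,
    features: set[str],
) -> bool:
    """Return True only if every coverage criterion is met."""
    return (
        does_business_in_florida
        and share_of_under_16_users_averaging_2h >= 0.10
        and allows_user_content
        and selects_content_via_user_data
        and bool(features & ADDICTIVE_FEATURES)  # at least one addictive feature
    )
```

A platform missing any one condition, say, one with no addictive features, would fall outside the law’s scope.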

What does the law do?

The law requires companies to delete the accounts, and all associated data, of users they know or believe to be younger than 14 (individuals have 90 days to dispute the designation). Companies may create accounts for 14- and 15-year-olds only with a parent or guardian’s permission. The law doesn’t specify how parents are to give that consent, but it authorizes the Florida Department of Legal Affairs to adopt implementing rules. Parents and guardians must also be allowed to request deletion of accounts for children 15 and under.

The law also requires platforms that knowingly or intentionally publish or distribute substantial amounts (more than one-third of the platform’s content) of “material harmful to minors” to verify that users are at least 18 years old. (The law limits “material harmful to minors” to sexually explicit material.)

Companies must offer both standard and anonymous methods of age verification. A standard method needs to be “commercially reasonable,” which the law doesn’t define. (Another Florida statute that may influence the definition here describes it as “the usual manner on any recognized market” or “otherwise in conformity with reasonable commercial practices.”) Anonymous methods are not detailed, but the law specifies that these methods must not retain users’ personally identifying information after verification is complete. Nor can the companies share the information or use it for any other purpose.
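To make that data-handling constraint concrete, here is a minimal, hypothetical sketch of an anonymous age check: the platform learns only a yes-or-no answer, and the identifying input is never stored, shared, or reused. Only the retention constraint comes from the law; everything else here is illustrative.

```python
from datetime import date

def verify_age_anonymously(date_of_birth: date) -> bool:
    """Return whether the user is at least 18; retain nothing else.

    Per the law's constraint, personally identifying information may not
    be kept, shared, or reused after verification. Here the only value
    that leaves the function is the boolean result.
    """
    today = date.today()
    # Subtract one year if the birthday hasn't occurred yet this year.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= 18
```

In practice, a third-party verification service, rather than the platform itself, would typically handle the identifying input, which is what keeps the check anonymous from the platform’s perspective.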

Consumers and the Florida Department of Legal Affairs can sue companies for knowingly or recklessly violating the law. The state can recover $50,000 per violation, and individuals can recover up to $10,000.

What’s the constitutional standard for laws like this?

Because the social media restrictions limit speech, they have to pass muster under the First Amendment. In First Amendment law, different standards apply to commercial and non-commercial speech. For example, the First Amendment allows a state to prohibit false advertising, but a state couldn’t prohibit lying about military honors. A state can’t force students to say the Pledge of Allegiance, but it can require doctors to provide certain disclosures before a medical procedure. 

For non-commercial speech, courts first look at whether the restriction in question is content-based. Content-based restrictions favor or disfavor speech based on its message, ideas, subject matter, or content. A regulation on wiretapping is not content-based, but regulating the contents of phone conversations would be. In some cases, speaker-based restrictions are also content-based.

Content-based restrictions must clear a higher constitutional bar, the “strict scrutiny” standard: The state must identify a compelling government interest in the issue, and the resulting restriction must be narrowly tailored, using the “least restrictive means,” to achieve that interest.

If the restrictions are not content-based, courts apply “intermediate scrutiny,” which requires that a restriction on speech further an important government interest and that the incidental restriction on speech be no greater than necessary to further that interest. 

The First Amendment provides less protection for commercial speech, but when it’s intertwined with non-commercial speech, courts apply the standard for non-commercial speech.
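Condensed to a skeleton, the tiering described above amounts to a short decision procedure. The sketch below simplifies the article’s summary into a Python function; it is a mnemonic, not a statement of doctrine.

```python
def applicable_standard(is_commercial: bool, is_content_based: bool) -> str:
    """Map the article's categories to the First Amendment standard applied."""
    if is_commercial:
        # Less protective standard; but if the speech is intertwined with
        # non-commercial speech, the non-commercial standard applies instead.
        return "commercial-speech standard"
    if is_content_based:
        return "strict scrutiny (compelling interest, least restrictive means)"
    return "intermediate scrutiny (important interest, no greater than necessary)"
```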

While minors’ free-speech rights can be limited in ways adults’ rights can’t—in educational environments, for example—social media use is not one of those areas.

The standard for restrictions on sexually explicit material depends on whether the restrictions target “indecent” or “obscene” material. The First Amendment does not protect obscenity, which the Supreme Court has defined as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual conduct in a patently offensive way, and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.” Legislatures can regulate obscene material as long as they have a rational basis for doing so. But the First Amendment protects sexually explicit material that, while indecent, doesn’t qualify as obscenity, and laws restricting such material are content-based and must satisfy the strict scrutiny test. Some material that is obscene for minors may be merely indecent for adults, so a law that permissibly restricts minors’ access may impermissibly restrict adults’ access.

Will Florida’s law survive legal challenges?

NetChoice, an industry association whose members include Google, Meta, X, and TikTok, has challenged similar laws in other states and urged DeSantis to veto the bill after it passed. NetChoice has won preliminary injunctions under both strict and intermediate scrutiny against laws in Arkansas, Ohio, and California, and it challenged social media restrictions in Utah that the legislature has since repealed and replaced. Among the reasons courts have cited for blocking these restrictions are vagueness, criminal penalties, and burdensome compliance requirements.

Florida has tried to preempt many of those issues. Its law defines more carefully and specifically which entities it regulates, and it imposes no criminal penalties. The legislature also added the anonymous age verification option after DeSantis vetoed an earlier version of the bill that would have required Florida residents to submit some form of ID to create social media accounts.

The provisions restricting access to sexually explicit content, meanwhile, stand on firmer legal ground. Several states have passed laws requiring adult content providers to verify users’ ages before displaying explicit content. The Free Speech Coalition, a trade organization for the adult entertainment industry, has challenged age verification laws in Louisiana, Texas, and Utah, with only mixed success. The 5th Circuit Court of Appeals recently allowed Texas’ age verification law to take effect, applying the rational-basis test for obscenity. But the dissenting judge argued that the law reached some protected speech and should be evaluated under the strict scrutiny framework. That distinction will be crucial to legal challenges to H.B. 3.

What’s next?

Florida has had the benefit of seeing where courts have found fault with other states’ social media regulations and has adjusted its own bill to head off challenges. It still faces an uphill battle, though: Courts have so far consistently held that similar laws violate the First Amendment under both intermediate and strict scrutiny. The age verification requirements for websites with adult content, by contrast, are much more likely to survive legal challenge.

Nick Hafen is the head of legal technology education at BYU Law.
