The State, the Screen, and the Child: Rethinking Protection, Autonomy, and Algorithmic Control

Internationally, governments have been cracking down on social media companies by targeting the effects of their algorithms on children and youth. Australia recently enacted a ban barring Australians under 16 from several social media platforms, including Facebook, Instagram, YouTube, Snapchat, and X. Anika Wells, Australia’s Communications Minister, has said that in order for these platforms to conduct business in Australia, they must “take reasonable steps to deactivate accounts for users under 16, prevent children from registering new accounts, check ages, and also prevent workarounds to bypass the restrictions.” [1] Both Australia and the United Kingdom, which recently introduced similar age-restriction legislation, now require these companies to implement ID checks and facial age estimation technology. [2]

The United States has followed suit with its own policies in the name of protecting the nation’s youth online. In June 2025, the Supreme Court issued a decision in Free Speech Coalition v. Paxton upholding a Texas law that requires adult websites to verify users’ ages before allowing access. [3] The case addressed pornographic content in particular, but the Texas law, H.B. 1181, “specifically targets sites where more than one-third is adult content.” [4] The trouble lies in the definition of adult content: the law draws the obvious line at pornographic material, but it also reaches content deemed “obscene.”

The legal definition of “obscene” comes from Miller v. California, which asks:

 “(a) whether ‘the average person, applying contemporary community standards’ would find that the work, taken as a whole, appeals to the prurient interest; (b) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and (c) whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.” [5]

Miller's definition of obscenity can be, and has been, argued over and stretched to determine what qualifies as “obscene and adult content.” This ambiguity creates problems under the Texas law for social media companies that serve multiple age demographics and allow content creation by users of all ages, such as TikTok, Instagram, and YouTube.

The Court ruled that the First Amendment and privacy concerns raised against the age-verification requirement were not enough to trigger the strict scrutiny standard. Under the intermediate standard the Court applied instead, the government need only show that the law serves an “important interest and is substantially related to achieving that goal,” rather than that “the law serves [a] compelling interest using the least restrictive means possible.” [6]

The ruling marked a significant departure from Reno v. ACLU, in which the Court struck down parts of the Communications Decency Act (CDA) as overly broad and vague, emphasizing the internet as a “uniquely open and democratic medium.” [7] In Reno, the Court reasoned that age-verification technology was “impractical and would chill adult access,” but with the maturation of such technologies, the Court in Free Speech Coalition acknowledged that pornography has become far more easily accessible online, heightening the state’s interest in protecting youth from that content. The decision has cleared the way for state legislatures across the nation to enact other age-verification and platform-design requirements. [8]

Free Speech Coalition is just one of many legal actions targeting big social media companies for their effects on children. A bipartisan coalition of about 42 state attorneys general has chosen to sue Meta Platforms, Inc. (Meta), alleging that it “designed and deployed features on Facebook and Instagram that encouraged addictive behaviors it knew to be harmful to its young users’ mental and physical health.” [9] Thirty-three of those attorneys general filed jointly against Meta in the US District Court for the Northern District of California. They claim the company has violated the 1998 Children’s Online Privacy Protection Act (COPPA), enforced through a Federal Trade Commission (FTC) rule that imposes requirements “on operators of websites or online services directed to children under 13 years of age, and on operators that have actual knowledge that they are collecting personal information online” from a child under 13, limiting the information those operators may collect from children for marketing and advertising purposes. [10] Nine other states have chosen to file individually in their own state courts, tailoring their complaints to their respective state laws and producing some unprecedented legal strategies.

Massachusetts, in particular, is suing Meta on a “public nuisance” theory, “a strategy that hundreds of school districts across the US have been mounting against the social media companies.” [11] Under Massachusetts common law, a public nuisance arises when conduct unreasonably interferes with a right common to the general public, “such as interference with public health and public safety and public peace, and public comfort or convenience.” [12] The Commonwealth argues Meta has “knowingly and unconscionably manipulated, exploited, and preyed upon vulnerable youth’s developing brains by purposely designing and employing platform tools and features that utilized addictive, habit-forming dopamine response patterns ... to prolong and maximize their time spent on its platforms ... causing serious psychological and physical harm.” [13] The complaint states that Meta has caused several injuries to Massachusetts youth, including “a crisis of youth addiction in approximately 29,000 Massachusetts youth ages 13-17 who use the platform for two hours or more per day.” [14] It also blames Meta for the health-care costs of the mental health services these teens may need and for disruption to public education, citing “loss of productivity, disruption, and poor school performance related to excessive and addictive social media use.” [15]

The Commonwealth claims that Meta’s own extensive research identified the harms its algorithms pose to youth, but that the company pressed on because of the profit and revenue it earns from advertisers and from user activity on the app. The suit asks the Court to determine whether Meta engaged in “unfair and deceptive acts and practices” in violation of Massachusetts General Laws, and to order the company to pay the Commonwealth compensatory damages for the harm caused by the nuisance. [16]

Previously, many social media companies were able to fend off lawsuits by invoking Section 230 of the Communications Act of 1934. Enacted as part of the Communications Decency Act of 1996, Section 230 provides limited federal immunity to providers and users of interactive computer services. The statute essentially provides that website operators bear no liability, and that no cause of action may be brought against them under any state or local law, for third-party content posted on their websites. This immunity applies regardless of whether the service provider knows of the content in question; even if the provider edits the material, there is no liability unless the edits render it defamatory. [17]

However, Massachusetts Superior Court Justice Peter B. Krupp denied Meta Section 230 immunity from the state’s lawsuit and allowed the public nuisance complaint to proceed. [18] Meta’s motions to dismiss the lawsuits brought by the bipartisan coalition are meeting similar fates. The case has yet to be decided, but the larger questions remain: whether these new restrictions raise serious privacy concerns through ID verification, whether they violate First Amendment protections for users and content creators on these apps, and, as some critics have argued, whether they reflect ever-growing government control over the Internet under the guise of children’s safety.

At its core, the litigation against these companies reflects a question that grows more pressing with the rise of algorithms: how will the government balance its duty to protect children with upholding the free expression of everyone else? We are taught that the state owes duties to its citizens: to protect them and to support them, especially children, one of the most protected and closely watched groups in the United States. The lawsuits against Meta and other social media companies share the same underlying premise, that these companies “prey on our young people and have chosen to profit by knowingly targeting and exploiting their vulnerabilities.” [19] On its face, such state action appears aligned with the traditional governmental role of safeguarding the collective welfare of children against the dangers of algorithms. But do we believe that duty outweighs the ability to create and share? These apps have allowed movements and common experiences to take root, creating solidarity and sparking political change. These lawsuits will ultimately help us define and understand whether we prioritize protection or expression in the age of algorithms.


Sources

  1. Taylor, Josh. “How Will Australia’s Under-16s Social Media Ban Be Enforced, and Which Platforms Will Be Exempt?” The Guardian, July 31, 2025. https://www.theguardian.com/technology/2025/aug/01/how-australia-under-16s-social-media-ban-enforced-tiktok-instagram-facebook-exempt-platforms.

  2. Ibid.

  3. Free Speech Coalition, Inc. v. Paxton, No. 23-1122, syllabus (U.S. June 27, 2025).

  4. Texas H.B. 1181, 88th Leg., House Comm. Rep. version (2023), https://capitol.texas.gov/tlodocs/88R/billtext/html/HB01181H.htm.

  5. LII / Legal Information Institute. “Obscenity,” 2023. https://www.law.cornell.edu/wex/obscenity.

  6. LII / Legal Information Institute. “Strict Scrutiny,” 2015. https://www.law.cornell.edu/wex/strict_scrutiny.

  7. Arciniega, Jessica, Morgan Sexton, and Amelia Vance. “Supreme Court Upholds Age Verification: A Game-Changer for Child Online Safety Laws.” Public Interest Privacy, July 1, 2025. https://publicinterestprivacy.org/paxton-age-verification/.

  8. Ibid. 

  9. Commonwealth of Massachusetts. Complaint and Jury Demand: Commonwealth of Massachusetts v. Meta Platforms, Inc. and Instagram, LLC. Suffolk Superior Court, Civil Action No. 2384CV. October 24, 2023. 

  10. Children’s Online Privacy Protection Rule, 16 C.F.R. § 312 (2025), https://www.ftc.gov/legal-library/browse/rules/childrens-online-privacy-protection-rule-coppa.

  11. Miller, Gabby. “Social Media Lawsuits by State Attorneys General Surmount Section 230, Other Challenges.” Tech Policy Press, October 24, 2024. https://www.techpolicy.press/social-media-lawsuits-by-state-attorneys-general-surmount-section-230-other-challenges/.

  12. Commonwealth v. Meta Platforms (2023).

  13. Ibid.

  14. Ibid.

  15. Ibid.

  16. Ibid.

  17. Congress.gov. “Section 230: An Overview,” 2025. https://www.congress.gov/crs-product/R46751.

  18. Miller, “Social Media Lawsuits by State Attorneys General.”

  19. Mass.gov. “AG Campbell Files Lawsuit against Meta, Instagram for Unfair and Deceptive Practices That Harm Young People,” 2023. https://www.mass.gov/news/ag-campbell-files-lawsuit-against-meta-instagram-for-unfair-and-deceptive-practices-that-harm-young-people.
