Privacy laws regulating the collection and processing of children’s personal information are an increasingly compliance-heavy area within an already dynamic and complex privacy law landscape. The laws range from the federal Children’s Online Privacy Protection Act (COPPA) in the United States to state laws with provisions regulating children’s data, as well as international frameworks that apply to children’s personal information. The considerations are varied. Much of the analysis in determining which privacy laws apply involves identifying the types of processing at issue and the jurisdictions implicated by the residency of the children in question. Children’s compliance obligations apply broadly whenever children’s data is involved, whether in education technology (EdTech), targeted advertising, or online games and apps.
Several Key Children’s Privacy Laws
While the number of privacy laws regulating children’s personal information is ever-expanding, several key laws come into play in many scenarios, which we summarize below:
Children’s Online Privacy Protection Act (COPPA)
The Children’s Online Privacy Protection Act (COPPA) is a U.S. federal law enacted in 1998 to protect the privacy of children under 13 years old. The Federal Trade Commission (FTC) enforces COPPA, which requires websites and online services to obtain verifiable parental consent before collecting, using, or disclosing personal information from children, among other compliance obligations. The FTC proposed amendments to the COPPA Rule that would require separate opt-in consent for targeted advertising, prohibit conditioning a child’s participation in an activity on the collection of more personal information than is reasonably necessary, limit data retention, and strengthen data security requirements. In January 2025, the FTC finalized the updated COPPA Rule.
New York Child Data Protection Act
The New York Child Data Protection Act mandates that operators of online services collecting personal data from minors delete such data within 30 days unless they comply with COPPA or obtain informed consent. New York’s companion SAFE for Kids Act also restricts the provision of addictive feeds to minors and prohibits sending minors overnight notifications without parental consent.
California’s Consumer Privacy Act (CCPA)
California, a first-mover among the states on comprehensive privacy laws, has one of the most mature and robust privacy regimes, including compliance obligations for processing children’s data. For example, the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), provides that if a business knows an individual is under the age of 16, it must obtain “affirmative authorization” (“opt-in”) before selling the child’s personal information or “sharing” it for targeted advertising. Further, for children under the age of 13, such “opt-in” consent must be obtained from the child’s parent or guardian.
Colorado Privacy Act Amendments
Effective October 1, 2025, the Colorado Privacy Act includes enhanced protections for minors’ data, prohibiting its use for targeted advertising, selling, or profiling without consent. It also mandates data protection assessments for services posing heightened risks to minors.
General Data Protection Regulation (GDPR)
The GDPR, enforced in the European Union (with a similar law in the United Kingdom post-Brexit), includes specific provisions for the protection of children’s data. Some of the largest GDPR fines have involved violations related to children’s privacy, highlighting the importance of robust compliance measures for businesses operating in the EU.
California Age-Appropriate Design Code Act (CAADCA)
The CAADCA, which was slated to take effect July 1, 2024, imposes strict obligations on businesses providing online services likely to be accessed by children under 18. It requires companies to implement privacy-by-design principles and imposes severe penalties for non-compliance.
Sectors of Particular Risk
While any processing of children’s data requires additional compliance considerations, several sectors are increasingly in regulators’ crosshairs and therefore demand even greater rigor, including the following:
Education Technology (EdTech)
The FTC has increased its enforcement of COPPA against EdTech providers. Companies must ensure that they do not collect more personal data than necessary and must obtain verifiable parental consent. Recent actions against Edmodo highlight the importance of compliance in this sector.
Targeted Advertising
Businesses involved in targeted advertising must navigate stringent consent requirements and prohibitions on using children’s data for advertising purposes. The FTC’s settlement with OpenX, along with the updated COPPA Rule and state laws such as California’s CCPA and the Colorado Privacy Act, emphasizes the need for separate opt-in consent for targeted advertising to children.
Online Games and Apps
Online games and apps frequently attract young users, making compliance with children’s privacy laws critical. Recent enforcement actions against companies, such as the FTC’s record-breaking $520 million settlement with Epic Games, underscore the severe consequences of non-compliance, including substantial fines and operational restrictions.
Children’s Privacy Regulatory Enforcement and Private Actions
As data processing involving children becomes increasingly ubiquitous, regulators ranging from the federal government and state enforcers to international authorities are ramping up enforcement of children’s privacy laws. We overview some of the most notable enforcement actions below:
FTC Enforcement
The FTC has been active in enforcing children’s privacy laws, as seen in cases against Edmodo and NGL Labs. These actions often result in significant settlements and operational restrictions, emphasizing the importance of compliance.
GDPR Fines
GDPR enforcement has led to substantial fines for social media platforms, such as the €405 million fine the Irish Data Protection Commission levied against Instagram and its €345 million fine against TikTok for allegedly mishandling children’s data. These fines highlight the rigorous standards and severe penalties associated with GDPR compliance.
Department of Justice and State Actions
The FTC, via a referral to the U.S. Department of Justice, has sued TikTok for illegally collecting children’s data and failing to comply with COPPA. This lawsuit exemplifies the potential legal ramifications for companies that violate children’s privacy laws. State attorneys general are also actively pursuing companies over how they use children’s data, as illustrated by the case Arkansas brought against Meta and TikTok.
Helping Clients Navigate COPPA and Other Children’s Privacy Compliance
At RICHT, we are privacy-focused technology lawyers helping clients stay ahead of the ever-changing compliance curve, including when it comes to data processing involving the personal information of children. Whether it is COPPA, GDPR, or state laws such as the CCPA, we help guide clients to success while accounting for and mitigating risk.
Learn How We Can Help You Comply With Children’s Privacy Laws
Children’s Privacy Law Resources
- Attorney General Bonta Joins States in Securing $5.1 Million in Settlements from Education Software Company for Failing to Protect Students’ Data: California Attorney General Rob Bonta, along with the Connecticut and New York AGs, announced a $5.1 million settlement with Illuminate Education, Inc. following a 2021 data breach exposing sensitive personal and medical information of millions of students, including over 434,000 California students. The investigation revealed Illuminate’s failure to secure data through proper credential management, monitoring, and backup protection, alongside deceptive privacy claims. As part of the settlement, Illuminate will enhance access controls, real-time monitoring, safeguards for backups, breach notification, and data retention reviews with school districts. This marks the California DOJ’s first enforcement action under California’s K-12 Pupil Online Personal Information Protection Act (KOPIPA), underscoring Attorney General Bonta’s commitment to safeguarding children’s data.
Read More →
- Florida AG Sues Roku Over Alleged Privacy Violations Involving Children’s and Sensitive Data
Florida Attorney General James Uthmeier filed a lawsuit claiming Roku illegally processes data from children without parental consent and collects sensitive data from adults and teens without proper authorization. The complaint alleges Roku, which is in nearly half of U.S. households, profits by collecting and selling user data, including precise geolocation and viewing habits, to third parties, some of whom reidentify users. The lawsuit cites violations of Florida’s 2023 Digital Bill of Rights, requiring affirmative consent for minors’ data and explicit permission for processing sensitive information. Roku is also accused of failing to identify which users are children despite having features like kid-specific screensavers. Roku disputes the claims and intends to fight the suit. Similar litigation is ongoing in Michigan.
Read More →
- OpenAI Adds Parental Controls to ChatGPT After Safety Concerns
OpenAI has introduced new parental controls for ChatGPT aimed at creating a safer, age-appropriate experience for teens. The update follows growing scrutiny over the chatbot’s role in recent youth safety incidents and a federal inquiry into AI’s potential harms to minors. Parents and teens can now link accounts, allowing guardians to oversee how ChatGPT operates for younger users. Linked teen accounts automatically receive extra content protections that filter out graphic, sexual, or violent material, as well as viral challenges and unrealistic beauty content. Parents can further fine-tune settings—limiting usage hours, disabling memory, turning off image and voice features, and opting out of data training. OpenAI also introduced a notification system that alerts parents if the system detects signs of self-harm, with a specialist team reviewing potential emergencies. While the company warns no system is foolproof, it emphasized that acting early is better than missing signs of danger.
Read More →
- Justice Department Files Complaint Against Social Media Company Iconic Hearts Holdings Inc.
The U.S. Department of Justice, working with the Federal Trade Commission, has filed a complaint against Iconic Hearts Holdings Inc., alleging violations of children’s privacy laws by the social media platform. The complaint seeks to address improper collection and handling of minors’ personal data in violation of federal privacy regulations designed to protect children online. This action reflects renewed government scrutiny of tech companies’ compliance with children’s privacy laws and underscores regulators’ growing enforcement efforts to safeguard young users’ digital rights.
Read More →
- Analyzing the 2025 Children’s Privacy Laws and Regulations:
Children’s privacy remained a key focus in 2025, with multiple states passing new laws or amendments regulating the processing of minors’ personal data, adding to an increasingly complex patchwork. States including California, Colorado, Connecticut, Montana, Oregon, Arkansas, Nebraska, Vermont, Louisiana, Texas, and Utah introduced measures addressing age-appropriate design, parental consent, restrictions on targeted advertising, and data minimization for children and teens under various age thresholds. Additionally, new app store accountability laws require age verification, parental consent, and safety protections for minors using digital services. These evolving laws emphasize heightened protections for minors across sectors, requiring businesses to carefully navigate differing state obligations and implement robust compliance frameworks.
Read More →
- Disney to Pay $10 Million to Settle FTC Allegations the Company Enabled Unlawful Collection of Children’s Personal Data:
The Federal Trade Commission announced a $10 million settlement with Disney over allegations that the company failed to properly label children’s videos on YouTube, leading to unlawful collection of personal data from children under 13 without parental consent. The complaint highlighted Disney’s misclassification of videos as not “Made for Kids,” allowing targeted advertising and exposure to inappropriate features. Under the settlement, Disney must pay the penalty, comply with the Children’s Online Privacy Protection Rule (COPPA), and implement a video-review program unless age assurance technologies are adopted by YouTube.
Read More →
- Google to Pay $30 Million to Settle YouTube Children’s Privacy Lawsuit
Google has agreed to a $30 million settlement to resolve a class action alleging that YouTube collected personal data from millions of children under 13 without parental consent to target ads. The lawsuit covers U.S. children who used YouTube from July 2013 to April 2020, with an estimated 35 to 45 million eligible for compensation. Despite the settlement, Google denies any wrongdoing, and the deal requires court approval before payouts.
Read More →
- Louisiana Sues Roblox Over Child Safety Failures:
Louisiana Attorney General Liz Murrill filed a lawsuit against Roblox, accusing the platform of enabling child predators and failing to protect its youngest users. The suit claims Roblox knowingly allowed systemic sexual exploitation by letting users create accounts without proper age verification, enabling predators to pose as children. It highlights disturbing content and cases such as a suspect arrested for possessing child abuse material while active on Roblox using voice-altering technology. The state alleges Roblox prioritized profits over child safety. In response, Roblox announced new, stricter content rules, enhanced age verification requiring ID for adult-only areas, and AI-driven real-time content monitoring. The lawsuit seeks injunctions, fines, and reforms to the platform’s safety practices. Read More →
- Are New Global Age Verification Requirements Creating a Children’s Online Safety Legal Patchwork?
A global surge in age verification laws aims to protect children online but also creates a complex patchwork of regulations. Australia leads with strict rules banning under-16s from social media, while the EU pushes for privacy-preserving, auditable age checks under the Digital Services Act. The UK adopts a middle-ground approach with varied verification methods under the Online Safety Act, emphasizing privacy and proportionality. Canada takes a cautious, consultative stance, focusing on risk-based, privacy-respecting solutions. The U.S. is more fragmented, grappling with First Amendment concerns and state-level laws that vary widely. Amid this landscape, experts call for international standards and technology innovations like zero-knowledge proofs to harmonize compliance and protect children effectively. Read More →
- COPPA Rule Enters into Force: The updated Children’s Online Privacy Protection (COPPA) Rule, effective June 23, 2025, introduces new definitions for ‘mixed audience website or online service,’ expands definitions for ‘online contact information’ and ‘personal information,’ and imposes limits on data retention. Read More
- While Federal Legislation Flounders, State Privacy Laws for Children and Teens Gain Momentum: In the absence of more robust federal requirements, states are stepping in to regulate not only the processing of all minors’ data, but also online platforms used by teens and children. Read More
- Michigan AG Dana Nessel files lawsuit against Roku for allegedly violating children’s data privacy laws: Michigan Attorney General Dana Nessel has filed a lawsuit against Roku, Inc., claiming the company violates the Children’s Online Privacy Protection Act (COPPA) and the Michigan Consumer Protection Act. Read More
- Updated COPPA Rule (Finally) Finalized Today: Today the updated Children’s Online Privacy Protection (COPPA) Rule was published in the Federal Register, finalizing the much-needed modernization to the COPPA Rule. After a nearly 6-year rulemaking process, the Federal Trade Commission unanimously approved the updated COPPA Rule in January 2025, but it did not become effective until today’s publication. Finalizing the COPPA Rule improves the Commission’s ability to protect kids online. Read More
- Top 5 impacts of the new COPPA Rule
- Snap Shares Drop as FTC Refers MyAI Chatbot Complaint to the DOJ
- FTC Finalizes Children’s Privacy Rule Minimizing Data Collection
- Texas: AG sues TikTok for deceptive promotion under Deceptive Trade Practices Act
- Colorado Finalizes Privacy Act Rules: Key Updates for Businesses
- Attorney General Ken Paxton Launches Investigations into Character.AI, Reddit, Instagram, Discord, and Other Companies over Children’s Privacy and Safety Practices as Texas Leads the Nation in Data Privacy Enforcement
- Free Speech Battles and Age-Appropriate Balance: Maryland and Connecticut Try Again for Youth Safety Rules
- Children’s State Privacy Law Update and Tracker
- Guide to Navigating COPPA
- Complying with COPPA: Frequently Asked Questions
- How Will Contextual Advertising Fare When The FTC Revises Its COPPA Rule?
- Implementing Kids’ Privacy Protections Around The World: The PerfectPetPal Case Study
- Was Tilting Point Media’s Children’s Privacy “Nautical Nonsense”?
- Instagram Rolls Out Teen Accounts With Privacy, Parental Controls As Scrutiny Mounts
- Texas Attorney General Sues TikTok Over Alleged Sale of Minors’ PII