Privacy laws regulating the collection and processing of children’s personal information are an increasingly compliance-heavy area within an already dynamic and complex privacy law landscape. The laws range from the federal Children’s Online Privacy Protection Act (COPPA) in the United States to state-specific laws with components regulating children’s data, as well as international frameworks that apply to children’s personal information. The considerations are varied: much of the analysis in determining which privacy laws apply involves identifying the types of processing at issue and the jurisdictions implicated by the residency of the children in question. Whether in education technology (EdTech), targeted advertising, or online games and apps, children’s compliance obligations apply broadly wherever children’s data is involved.
Several Key Children’s Privacy Laws
While the number of privacy laws regulating children’s personal information is ever-expanding, several key laws come into play in many scenarios. We provide an overview of them below:
Children’s Online Privacy Protection Act (COPPA)
The Children’s Online Privacy Protection Act (COPPA) is a U.S. federal law enacted in 1998 to protect the privacy of children under 13 years old. The Federal Trade Commission (FTC) enforces COPPA, which requires websites and online services to obtain verifiable parental consent before collecting, using, or disclosing personal information from children, among other compliance obligations. The FTC proposed amendments to the COPPA Rule that expand its scope by requiring separate opt-in consent for targeted advertising, prohibiting operators from conditioning a child’s participation in an activity on the collection of more personal information than is reasonably necessary, limiting data retention, and strengthening data security requirements. The FTC approved the updated COPPA Rule in January 2025, and it took effect on June 23, 2025.
New York Child Data Protection Act
The New York Child Data Protection Act requires operators of online services that collect personal data from minors to delete such data within 30 days unless they comply with COPPA or obtain informed consent. Companion New York legislation also restricts the delivery of addictive feeds to minors and prohibits sending them overnight notifications without parental consent.
California’s Consumer Privacy Act (CCPA)
California, a first-mover on the state level when it comes to comprehensive privacy laws, has one of the most mature and robust privacy laws, including compliance obligations for processing children’s data. For example, the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act (CPRA), provides that if a business knows a consumer is under the age of 16, it must obtain “affirmative authorization” (“opt-in” consent) before selling the child’s personal information or “sharing” it for targeted advertising. For children under the age of 13, that opt-in consent must be obtained from the child’s parent or guardian.
Colorado Privacy Act Amendments
Effective October 1, 2025, the Colorado Privacy Act includes enhanced protections for minors’ data, prohibiting its use for targeted advertising, selling, or profiling without consent. It also mandates data protection assessments for services posing heightened risks to minors.
General Data Protection Regulation (GDPR)
The GDPR, in force across the European Union (with a similar law in the United Kingdom post-Brexit), includes specific provisions for the protection of children’s data. A significant portion of GDPR fines involves violations related to children’s privacy, highlighting the importance of robust compliance measures for businesses operating in the EU.
California Age-Appropriate Design Code Act (CAADCA)
The CAADCA, which took effect July 1, 2024, imposes strict obligations on businesses providing online services likely to be accessed by children under 18. It requires companies to implement privacy-by-design principles and imposes severe penalties for non-compliance, though its enforcement has been partially enjoined amid ongoing First Amendment litigation (NetChoice v. Bonta).
Sectors of Particular Risk
While any processing of children’s data requires additional compliance considerations, several sectors are increasingly in the crosshairs of regulators and, therefore, need even more stringency, including the following:
Education Technology (EdTech)
The FTC has increased its enforcement of COPPA against EdTech providers. Companies must ensure that they do not collect more personal data than necessary and must obtain verifiable parental consent. Recent actions against Edmodo highlight the importance of compliance in this sector.
Targeted Advertising
Businesses involved in targeted advertising must navigate stringent consent requirements and prohibitions on using children’s data for advertising purposes. The FTC’s settlement with OpenX, along with proposed COPPA amendments and state laws such as California’s CCPA and Colorado’s Privacy Act, emphasizes the need for separate opt-in consent for targeted advertising to children.
Online Games and Apps
Online games and apps frequently attract young users, making compliance with children’s privacy laws critical. Recent enforcement actions, such as the FTC’s record-breaking $520 million settlement with Epic Games, underscore the severe consequences of non-compliance, including substantial fines and operational restrictions.
Children’s Privacy Regulatory Enforcement and Private Actions
As data processing involving children becomes increasingly ubiquitous, regulators at the federal, state, and international levels are ramping up enforcement of children’s privacy laws. We overview some of the most notable enforcement actions below:
FTC Enforcement
The FTC has been active in enforcing children’s privacy laws, as seen in cases against Edmodo and NGL Labs. These actions often result in significant settlements and operational restrictions, emphasizing the importance of compliance.
GDPR Fines
GDPR enforcement has led to substantial fines for social media platforms, such as the €405 million fine the Irish Data Protection Commission levied against Instagram and its €345 million fine against TikTok for allegedly mishandling children’s data. These fines highlight the rigorous standards and severe penalties associated with GDPR compliance.
Department of Justice and State Actions
The FTC, via a referral to the U.S. Department of Justice, has sued TikTok for illegally collecting children’s data and failing to comply with COPPA. This lawsuit exemplifies the potential legal ramifications for companies that violate children’s privacy laws. State attorneys general are also actively enforcing against companies’ misuse of children’s data, as illustrated by the case Arkansas brought against Meta and TikTok.
Helping Clients Navigate COPPA and Other Children’s Privacy Compliance
At RICHT, we are privacy-focused technology lawyers helping clients stay ahead of the ever-changing compliance curve, including when it comes to data processing involving the personal information of children. Whether it is COPPA, GDPR, or state laws such as the CCPA, we help guide clients to success while accounting for and mitigating risk.
Learn How We Can Help You Comply With Children’s Privacy Laws
Featured Video: Children’s Privacy in 2026: Navigating New COPPA Rules and US State Laws
Children’s Privacy Law News & Developments
- Michigan Court Narrows Roku Suit: A federal court dismissed most claims in a privacy lawsuit against Roku while allowing Children’s Online Privacy Protection Act allegations to proceed. The ruling emphasizes the difficulty of maintaining broad privacy claims under state law when specific federal statutes apply. OUR TAKEAWAY: Companies must prioritize strict COPPA compliance and precise data handling disclosures, as courts are increasingly focused on specific statutory protections for minors over generalized privacy grievances. Read More →
- Bondu AI Toy Data Breach: A security vulnerability in Bondu’s web portal allowed anyone with a Gmail account to access over 50,000 private chat logs between children and their AI toys. This flaw exposed children’s information, prompting urgent calls for stricter security standards in the smart toy industry. OUR TAKEAWAY: Companies must prioritize “security-by-design” and implement robust authentication protocols to prevent the catastrophic exposure of children’s private data to unauthorized users. Read More →
- Children’s Privacy in 2026: Global regulations are shifting toward strict age-based bans and substantive safety standards rather than simple parental consent models. This transition reflects a broader international effort to fundamentally redesign how digital platforms interact with young users. OUR TAKEAWAY: Organizations must pivot from passive notice-and-consent frameworks toward proactive, risk-based engineering to satisfy increasingly aggressive cross-jurisdictional enforcement trends. Read More →
- South Carolina’s New Online Safety Law: South Carolina recently enacted the Age-Appropriate Code Design Act, imposing immediate, expansive obligations on online services likely accessed by minors. The law mandates strict design standards, data minimization, and annual third-party audits to mitigate potential harms. OUR TAKEAWAY: Companies must immediately audit their platform’s design features and data practices to ensure compliance and avoid significant exposure to treble damages and personal officer liability. Read More →
- FTC COPPA Age Verification Policy: The FTC issued a new policy statement encouraging the use of robust age verification technologies to better protect children online. This guidance clarifies that implementing such tools can help companies demonstrate compliance and mitigate legal risks under COPPA. OUR TAKEAWAY: Companies must prioritize high-assurance age estimation methods over simple self-reporting to align with the FTC’s heightened enforcement focus on child privacy. Read More →
- FPF Updates Age Assurance Infographic: The Future of Privacy Forum’s updated resource introduces “inference” as a fourth technology category while downplaying standalone age declarations. It highlights emerging standards and risk mitigations like zero-knowledge proofs to balance user privacy with regulatory compliance. OUR TAKEAWAY: Compliance teams should adopt a risk-based, layered “waterfall” approach to age assurance that prioritizes data minimization and emerging cryptographic standards over easily bypassed self-reporting methods. Read More →
- FTC’s 2026 Age Verification Goals: The Federal Trade Commission is pushing for sophisticated age verification standards to enhance online safety for minors. This initiative prioritizes privacy-preserving technologies that minimize data collection while ensuring robust verification across digital platforms. OUR TAKEAWAY: Companies must transition toward privacy-by-design authentication methods to meet heightened regulatory expectations for protecting young users without compromising general consumer data security. Read More →
- South Carolina Enacts Age-Appropriate Design Code Act: South Carolina has officially joined the growing list of states regulating youth online safety by enacting legislation that requires platforms to prioritize the privacy and safety of minor users through mandatory impact assessments and default high-privacy settings. OUR TAKEAWAY: Legal teams of platforms within scope of the law must immediately update their privacy-by-design frameworks to include specific “best interest of the child” documentation, as this law creates a stringent compliance floor for digital services accessible to South Carolina residents under the age of 18. Read More →
- Imgur Owner MediaLab Fined Over Children’s Privacy Failures: The UK Information Commissioner’s Office (ICO) has issued a £247,590 fine to MediaLab for failing to implement age-assurance measures or obtain parental consent, which allowed children under 13 to access the Imgur platform and potentially encounter harmful content. OUR TAKEAWAY: Organizations operating platforms likely to be accessed by minors must conduct formal Data Protection Impact Assessments (DPIAs) and implement robust age-verification technical controls to meet the strict “best interests of the child” standard mandated by the UK Children’s Code. Read More →
- FTC shares insight into its children’s privacy priorities: The Federal Trade Commission (FTC) is prioritizing enforcement of the amended COPPA Rule, which mandates separate parental consent for third-party data disclosures and limits data retention, effective April 22, 2026. Additionally, the agency will enforce the “TAKE IT DOWN Act” civil provisions regarding nonconsensual intimate deepfakes and is actively scrutinizing age verification technologies through upcoming workshops and industry actions. OUR TAKEAWAY: Organizations must urgently update their consent flows to unbundle third-party sharing from general service terms and implement robust age verification mechanisms to align with the FTC’s aggressive enforcement stance on minor safety and deepfake removal. Read More →
- ChatGPT to predict users’ ages to protect teen users: OpenAI has deployed an age prediction model for ChatGPT that analyzes behavioral signals—such as usage patterns, activity times, and account history—to identify users under 18 and automatically apply stricter safety content filters. The system, which aims to mitigate legal and regulatory risks surrounding minor safety, defaults to “teen mode” when uncertain but allows adults to restore full access via third-party identity verification. OUR TAKEAWAY: Compliance officers must ensure that any behavioral profiling for safety purposes is strictly ring-fenced from commercial data usage and clearly disclosed to avoid “surveillance” allegations under privacy laws like the GDPR and COPPA. Read More →
- App Store Age-Assurance Laws for Developers: New state age-assurance regimes (e.g., in Texas, Utah, Louisiana, and California) require app stores to verify user age at account creation, place users into age bands (typically under 13, 13–15, 16–17, and 18+), and obtain parental consent before minors can download apps or make in‑app purchases. Developers must consume age and consent “signals” from the app stores, treat those signals as actual knowledge of a user’s age, and then configure age-appropriate experiences—adjusting features, content, default settings, and data practices for children and teens, and honoring parental controls and revocations of consent across all platforms where the app is available. These laws also restrict how developers can use the age/consent data they receive (often limiting it to legal compliance and safety purposes) and may require deletion after age checks, so apps need updated data flows, retention rules, and COPPA/state youth-privacy compliance tied directly to app-store signals rather than self-declared ages. Read More →
- Global Privacy Regulators Launch Enforcement Sweep Focused on Children’s Data Protection: The Global Privacy Enforcement Network (GPEN), comprising over 30 national data protection authorities, including the FTC, California Attorney General, UK ICO, CNIL, and Irish DPC, has initiated its annual privacy sweep targeting digital services used by children. This year’s focus is on assessing compliance with data collection limitations, transparency in privacy practices, and age verification mechanisms on websites and mobile apps catering to minors. The sweep coincides with increased legislative efforts in the U.S. to protect children’s online privacy and safety, and marks the tenth anniversary of a similar 2015 enforcement effort. Companies servicing children should review their data practices to ensure adherence to applicable regulations as regulatory scrutiny intensifies. Read More →
- Attorney General Bonta Joins States in Securing $5.1 Million in Settlements from Education Software Company for Failing to Protect Students’ Data: California Attorney General Rob Bonta, along with the Connecticut and New York AGs, announced a $5.1 million settlement with Illuminate Education, Inc. following a 2021 data breach exposing sensitive personal and medical information of millions of students, including over 434,000 California students. The investigation revealed Illuminate’s failure to secure data through proper credential management, monitoring, and backup protection, alongside deceptive privacy claims. As part of the settlement, Illuminate will enhance access controls, real-time monitoring, safeguards for backups, breach notification, and data retention reviews with school districts. This marks the California DOJ’s first enforcement under California’s K-12 Pupil Online Personal Information Protection Act (KOPIPA), underscoring Attorney General Bonta’s commitment to safeguarding children’s data. Read More →
- Florida AG Sues Roku Over Alleged Privacy Violations Involving Children’s and Sensitive Data: Florida Attorney General James Uthmeier filed a lawsuit claiming Roku illegally processes data from children without parental consent and collects sensitive data from adults and teens without proper authorization. The complaint alleges Roku, which is in nearly half of U.S. households, profits by collecting and selling user data, including precise geolocation and viewing habits, to third parties, some of whom reidentify users. The lawsuit cites violations of Florida’s 2023 Digital Bill of Rights, requiring affirmative consent for minors’ data and explicit permission for processing sensitive information. Roku is also accused of failing to identify which users are children despite having features like kid-specific screensavers. Roku disputes the claims and intends to fight the suit. Similar litigation is ongoing in Michigan. Read More →
- OpenAI Adds Parental Controls to ChatGPT After Safety Concerns: OpenAI has introduced new parental controls for ChatGPT aimed at creating a safer, age-appropriate experience for teens. The update follows growing scrutiny over the chatbot’s role in recent youth safety incidents and a federal inquiry into AI’s potential harms to minors. Parents and teens can now link accounts, allowing guardians to oversee how ChatGPT operates for younger users. Linked teen accounts automatically receive extra content protections that filter out graphic, sexual, or violent material, as well as viral challenges and unrealistic beauty content. Parents can further fine-tune settings, limiting usage hours, disabling memory, turning off image and voice features, and opting out of data training. OpenAI also introduced a notification system that alerts parents if the system detects signs of self-harm, with a specialist team reviewing potential emergencies. While the company warns no system is foolproof, it emphasized that acting early is better than missing signs of danger. Read More →
- Justice Department Files Complaint Against Social Media Company Iconic Hearts Holdings Inc.: The U.S. Department of Justice, working with the Federal Trade Commission, has filed a complaint against Iconic Hearts Holdings Inc., alleging violations of children’s privacy laws by the social media platform. The complaint seeks to address improper collection and handling of minors’ personal data in violation of federal privacy regulations designed to protect children online. This action reflects renewed government scrutiny of tech companies’ compliance with children’s privacy laws and underscores regulators’ growing enforcement efforts to safeguard young users’ digital rights. Read More →
- Analyzing the 2025 Children’s Privacy Laws and Regulations: Children’s privacy remained a key focus in 2025, with multiple states passing new laws or amendments regulating the processing of minors’ personal data, adding to an increasingly complex patchwork. States including California, Colorado, Connecticut, Montana, Oregon, Arkansas, Nebraska, Vermont, Louisiana, Texas, and Utah introduced measures addressing age-appropriate design, parental consent, restrictions on targeted advertising, and data minimization for children and teens under various age thresholds. Additionally, new app store accountability laws require age verification, parental consent, and safety protections for minors using digital services. These evolving laws emphasize heightened protections for minors across sectors, requiring businesses to carefully navigate differing state obligations and implement robust compliance frameworks. Read More →
- Disney to Pay $10 Million to Settle FTC Allegations the Company Enabled Unlawful Collection of Children’s Personal Data: The Federal Trade Commission announced a $10 million settlement with Disney over allegations that the company failed to properly label children’s videos on YouTube, leading to unlawful collection of personal data from children under 13 without parental consent. The complaint highlighted Disney’s misclassification of videos as not “Made for Kids,” allowing targeted advertising and exposure to inappropriate features. Under the settlement, Disney must pay the penalty, comply with the Children’s Online Privacy Protection Rule (COPPA Rule), and implement a video-review program unless age assurance technologies are adopted by YouTube. Read More →
- Google to Pay $30 Million to Settle YouTube Children’s Privacy Lawsuit: Google has agreed to a $30 million settlement to resolve a class action alleging that YouTube collected personal data from millions of children under 13 without parental consent to target ads. The lawsuit covers U.S. children who used YouTube from July 2013 to April 2020, with an estimated 35 to 45 million eligible for compensation. Despite the settlement, Google denies any wrongdoing, and the deal requires court approval before payouts. Read More →
- Louisiana Sues Roblox Over Child Safety Failures: Louisiana Attorney General Liz Murrill filed a lawsuit against Roblox, accusing the platform of enabling child predators and failing to protect its youngest users. The suit claims Roblox knowingly allowed systemic sexual exploitation by letting users create accounts without proper age verification, enabling predators to pose as children. It highlights disturbing content and cases like a suspect arrested for possessing child abuse material while active on Roblox using voice-altering technology. The state alleges Roblox prioritized profits over child safety. In response, Roblox announced new, stricter content rules, enhanced age verification requiring ID for adult-only areas, and AI-driven real-time content monitoring. The lawsuit seeks injunctions, fines, and reforms to the platform’s safety practices. Read More →
- Are New Global Age Verification Requirements Creating a Children’s Online Safety Legal Patchwork?: A global surge in age verification laws aims to protect children online, but also creates a complex patchwork of regulations. Australia leads with strict rules banning under-16s from social media, while the EU pushes for privacy-preserving, auditable age checks under the Digital Services Act. The UK adopts a middle-ground approach with varied verification methods under the Online Safety Act, emphasizing privacy and proportionality. Canada takes a cautious, consultative stance, focusing on risk-based, privacy-respecting solutions. The U.S. is more fragmented, grappling with First Amendment concerns and state-level laws that vary widely. Amid this landscape, experts call for international standards and technology innovations like zero-knowledge proofs to harmonize compliance and protect children effectively. Read More →
- COPPA Rules enter into force: The updated Children’s Online Privacy Protection (COPPA) Rules, effective June 23, 2025, introduce new definitions for ‘mixed audience website or online service,’ expand definitions for ‘online contact information’ and ‘personal information,’ and impose limits on data retention. Read More
- While Federal Legislation Flounders, State Privacy Laws for Children and Teens Gain Momentum: In the absence of more robust federal requirements, states are stepping in to regulate not only the processing of all minors’ data, but also online platforms used by teens and children. Read More
- Michigan AG Dana Nessel files lawsuit against Roku for allegedly violating children’s data privacy laws: Michigan Attorney General Dana Nessel has filed a lawsuit against Roku, Inc., claiming the company violates the Children’s Online Privacy Protection Act (COPPA) and the Michigan Consumer Protection Act. Read More
- Updated COPPA Rule (Finally) Finalized Today: Today the updated Children’s Online Privacy Protection (COPPA) Rule was published in the Federal Register, finalizing the much-needed modernization to the COPPA Rule. After a nearly 6-year rulemaking process, the Federal Trade Commission unanimously approved the updated COPPA Rule in January 2025, but it did not become effective until today’s publication. Finalizing the COPPA Rule improves the Commission’s ability to protect kids online. Read More
- Top 5 impacts of the new COPPA Rule
- Snap Shares Drop as FTC Refers MyAI Chatbot Complaint to the DOJ
- FTC Finalizes Children’s Privacy Rule Minimizing Data Collection
- Texas: AG sues TikTok for deceptive promotion under Deceptive Trade Practices Act
- Colorado Finalizes Privacy Act Rules: Key Updates for Businesses
- Attorney General Ken Paxton Launches Investigations into Character.AI, Reddit, Instagram, Discord, and Other Companies over Children’s Privacy and Safety Practices as Texas Leads the Nation in Data Privacy Enforcement
- Free Speech Battles and Age-Appropriate Balance: Maryland and Connecticut Try Again for Youth Safety Rules
- Children’s State Privacy Law Update and Tracker
- Guide to Navigating COPPA
- Complying with COPPA: Frequently Asked Questions
- How Will Contextual Advertising Fare When The FTC Revises Its COPPA Rule?
- Implementing Kids’ Privacy Protections Around The World: The PerfectPetPal Case Study
- Was Tilting Point Media’s Children’s Privacy “Nautical Nonsense”?
- Instagram Rolls Out Teen Accounts With Privacy, Parental Controls As Scrutiny Mounts
- Texas Attorney General Sues TikTok Over Alleged Sale of Minors’ PII