Cracking the Code(s): What Age Appropriate Design Means for Parents and Families

Introduction

States have taken up the gauntlet of children’s online safety and privacy as federal options stall. Parents and policymakers are concerned about addictive design features, about ensuring that children are in age-appropriate online spaces, and about protecting minors’ data. One popular approach to addressing these concerns has been the Age Appropriate Design Code (AADC).

Since its initial passage in the UK in 2020, the Age Appropriate Design Code has spread to the United States, with California passing the first state variation in 2022 and Maryland following in 2024. Now, Nebraska and Vermont join the short list of states taking this approach to kids’ online safety and privacy. Despite momentum behind the Age Appropriate Design Code, many states are moving toward narrower approaches to online regulation as free speech concerns and legal challenges stall broader online safety bills.

Meanwhile, children’s online experiences hang in the balance. In the constant tension between privacy and safety, tailored policy solutions that address specific harms are crucial. Young people should have access to safe experiences online even as legislatures and courts debate the future of the Kids Codes.

The State Breakdown 

While California boldly attempted to apply the Age Appropriate Design Code in a U.S. context, other states took a wait-and-see approach in hopes of bypassing First Amendment challenges. Other legal analyses explain the granular differences between the laws; this brief, however, focuses primarily on how those differences will impact parents and families.

  • What will your child’s online experience look like? 
  • Will new age assurance requirements impact adults’ access and privacy? 
  • Will any of these laws go into effect given recent legal precedent? 

These are all important questions that parents and families want answered as legislation at the state level moves at a rapid pace. 

California’s Challenges

The earliest success was also the first failure. While California was the first state to pass an AADC, it was also the first to have the law halted by legal challenges.

Duty of Care

One of the reasons California’s law has been so contentious is that it followed the UK in requiring tech companies to act in the “best interest of children.” Referred to as the “duty of care,” this is the most notable sticking point in the law, because how this term is defined in each jurisdiction affects the law’s constitutionality. The UK has had a clearly established framework for children’s rights since ratifying the UN Convention on the Rights of the Child; the US has not ratified the convention, and precedent on what those rights are for minors is ambiguous. The US has long established that children have First Amendment protections, and FOSI believes those rights extend to accessing information online. But how these rights apply in a US and digital context remains to be seen.

Data Protection Impact Assessments

Data Protection Impact Assessments (DPIAs), among other obligations, require companies to assess their platforms for the risks they pose to minors, specifically “whether the design of the product, service, or feature could harm children including by exposing children to harmful, or potentially harmful content.” The most recent court ruling found that California’s law was overbroad in key definitions and requirements and would “force” companies to overmoderate legally protected speech. Part of the justification for ruling that the law is likely unconstitutional hinges on the state’s inability to show how the DPIAs apply to the design of a website rather than its content.

Age Assurance

California’s law required platforms either to estimate the age of child users or to treat all users as minors – impacting adults’ online experiences. At the time the law was passed and subsequently litigated, age assurance requirements were relatively rare for platforms. Now, more laws require these solutions and the technology has caught up to the moment. It is possible that age assurance will be much less of an issue for future design codes and for kids’ online safety laws more generally.

Impact

The California law is currently enjoined. In an effort to keep the law alive, California Attorney General Rob Bonta has appealed the decision. While the law will likely not go into effect, California’s early adoption set the stage for the states that followed.

Maryland Moves in Another Direction

Maryland made waves as the second state to pass a Kids Code, deliberately defining key terms and excluding express age assurance requirements. 

What platforms does the law cover?

The Maryland AADC is similar to California’s in this regard. A platform is covered if it meets any of the following:

  1. Has over $25M in annual revenue
  2. Processes or sells the data of more than 50,000 users
  3. Acquires at least half of its annual revenue from selling data.

It covers all platforms “reasonably likely to be accessed by minors.” This means that many of the large, popular platforms you or your child use may be included. 

Age Assurance

Maryland did not expressly require that platforms conduct age assurance. Rather, the law requires platforms to apply increased privacy protections for minors if the platform meets any of these six criteria:

  1. The online product is directed to children as outlined by COPPA
  2. The platform has determined through “reliable evidence” that the product is “routinely accessed by a significant number of children”
  3. The platform is similar to other platforms that satisfy criterion 2
  4. The platform advertises to children
  5. The platform’s internal research demonstrates that the platform is accessed by children
  6. The platform knows or should have known the user is a child.

While the law does not expressly require age assurance, critics argue that companies may still be compelled to conduct age assurance or else they would have to moderate otherwise protected speech. 

Duty of Care

Maryland defines the “best interest of a child” to mean that a platform must be designed in a way that does not:

  • Benefit the company to the detriment of the child user
  • Result in reasonably foreseeable and material physical or financial harm 
  • Result in severe and reasonably foreseeable psychological or emotional harm
  • Intrude on a child’s privacy 
  • Discriminate against the child on the basis of race, color, religion, national origin, disability, gender identity, or sexual orientation. 

The breadth of this definition can be seen as key to the law’s survival. By defining the duty of care, Maryland attempted to sidestep the legal scrutiny that California faced and to keep the law from being used to justify censoring content that kids are legally allowed to see.

Additional Tools Required

The Maryland AADC requires that platforms:

  • Set all minors’ accounts to the highest privacy settings by default and 
  • Create tools that make it easy for children and their guardians to “exercise their privacy rights and report concerns.” 

Easy-to-use tools that give users autonomy over their online experiences would be a helpful addition for parents and families.

Impact

The Maryland Kids Code is technically in effect; however, NetChoice filed a complaint against the law earlier this year, arguing that it forces platforms to censor online speech, which leaves the law’s future uncertain.

Next Steps in Nebraska and Vermont

Representing the newest iterations of AADCs in the states, Nebraska and Vermont take strikingly different paths to their design codes.

What platforms do they cover?

Where Nebraska and Vermont overlap is in their definitions of covered platforms. Both states’ laws apply only to platforms that derive more than 50% of their revenue from selling personal information. In contrast to Maryland’s and California’s versions, this is a narrow group of platforms and may not cover large platforms that are subscription- or advertising-based.

Age Assurance

Nebraska’s law does not explicitly require an age assurance mechanism and instead applies an “actual knowledge” standard. This means that platforms must rely on a user’s self-declared age or on information they already have about the user from “marketing, advertising, or product development.”

Vermont, on the other hand, uses a standard similar to Maryland’s and covers platforms “reasonably likely to be accessed by minors.” The criteria Vermont uses to determine this are:

  1. The online platform is an online service directed to minors, as defined by COPPA
  2. “Competent and reliable evidence” or internal research shows that at least 2% of the audience is between two and seventeen years old
  3. The online platform knows or should have known that at least 2% of its audience is two to seventeen years old. The company cannot collect more data to make this determination.

In both states, questions remain about the scope of the knowledge standard and how it will apply to the content shown to users and to the user experience.

Duty of Care

Nebraska does not have a duty of care requirement, choosing to eliminate this roadblock altogether and instead focus on increased privacy and safety tools for minors. 

Similar to Maryland, Vermont defines harm to a minor. The duty of care means “the use of the personal data of a covered minor and the design of an online service, product, or feature will not result in

  1. Emotional distress
  2. Compulsive use of the online service, product, or feature
  3. Discrimination against a covered minor based upon race, ethnicity, sex, disability, sexual orientation, gender identity, gender expression, religion, or national origin.”

Additional Tools Required

One of the most impactful provisions for parents and families is the requirement to provide user controls. Nebraska outlines a number of tools that give minors the option to:

  • Block or limit unwanted contact from other users
  • Prevent other users from accessing their personal data
  • Control all “unnecessary” design features 
  • Opt into a chronological feed as opposed to a personalized feed
  • Opt out of in-app purchases
  • Limit screen time.

Vermont requires all of a platform’s privacy and safety settings to be set to the highest level by default. Among other things, Vermont’s default settings:

  • Prevent a minor’s account and their content from being shown to an adult’s account unless specified by the minor
  • Prohibit an adult from liking or commenting on a minor’s account unless specified by the minor
  • Prohibit direct messaging between minors and adults
  • Prohibit the platform from displaying other accounts that the user is connected to.

Both Vermont and Nebraska require platforms to, by default, prohibit overnight notifications. These are positive requirements that enable safety by design for young users while still giving autonomy to older teens. 

A Potential Path Forward

While the fate of the Age Appropriate Design Code in the US is uncertain, the fate of kids’ and caregivers’ online experiences should not hang in the balance. A number of bills have been introduced and laws passed that take a more targeted approach to address specific harms, such as companion chatbot legislation or bills addressing algorithms. These approaches may be less vulnerable to First Amendment challenges.  

Recommendations 

While we have yet to crack the age appropriate design code, it is imperative that young people are centered, parents are informed, industry continues to innovate safely, and policymakers integrate research into their proposals. Below are ways that each of these stakeholders can support young people online.

  • Parents
    • Utilize parental controls.
    • Talk with your children early and often about their online experiences. 
    • Stay up-to-date on available technology. Resources like FOSI’s Platforms Explained series dive deep into the platforms popular among young people in easy-to-understand terms.
  • Legislators
    • Integrate research into policy proposals.
    • Consider targeted approaches that address specific harms while protecting lawful speech.
  • Industry
    • Build safety into platforms and prioritize the user experience with easy-to-use tools. 
    • Enhance practical transparency measures, such as labels that tell users when they are interacting with AI.
    • Establish your platform’s age assurance protocols so that they offer flexibility for users, are proportional to the risk posed to minors, and preserve user privacy.

With collaboration, research, and safety guardrails, a safer internet for young people is possible.

Marissa Edmund

Marissa Edmund is the Policy Specialist at the Family Online Safety Institute (FOSI), working with the policy team to monitor emerging issues in tech policy and contribute her policy and research expertise to ensure the online world is safer for children and families. Prior to joining FOSI, Marissa spent two years at the Center for American Progress as the Sr. Policy Analyst for Gun Violence Prevention focusing heavily on gender-based and domestic violence. Before that, Marissa was the Policy Coordinator at the National Network to End Domestic Violence where she was able to actualize her passion for family safety by conducting research, engaging membership, and educating lawmakers on ways to support survivors of domestic violence. Marissa holds a B.A. in Political Science from the University of Maryland Baltimore County (UMBC).