
Introduction
States have taken up the gauntlet of children’s online safety and privacy as federal options stall. Parents and policymakers are concerned about addictive design features, ensuring that their children are in age-appropriate online spaces, and protecting minors’ data. One popular approach to addressing these concerns has been the Age Appropriate Design Code (AADC).
Since the UK first passed its Age Appropriate Design Code in 2020, similar codes have sprouted across the United States, with California passing the first variation in 2022 and Maryland following in 2024. Now, Nebraska and Vermont have joined the small group of states taking this approach to kids’ online safety and privacy. Despite the momentum behind the Age Appropriate Design Code, many states are moving toward a narrower approach to online regulation as free speech concerns and legal challenges stall broader online safety bills.
Meanwhile, children’s online experiences hang in the balance. In the constant battle between privacy and safety, tailored policy solutions that address specific harms are crucial. Young people should have access to safe experiences online even as legislatures and courts debate the future of the Kids Codes.
The State Breakdown
While California boldly attempted to apply the Age Appropriate Design Code in a U.S. context, other states took a wait-and-see approach in hopes of bypassing First Amendment challenges. Other legal analyses explain the granular differences between each law; this brief, however, focuses primarily on how these differences will impact parents and families.
How each law defines covered platforms, handles age assurance, and frames a duty of care are all questions that parents and families want answered as legislation at the state level moves at a rapid pace.
California’s Challenges
The earliest success but the first failure. While California was the first state to pass the AADC, it was also the first to have the law halted by legal challenges.
Duty of Care
One of the reasons California’s law has been so contentious is that it followed the UK in requiring tech companies to act in the “best interest of children.” Referred to as the “duty of care,” this is the most notable sticking point in the law, as jurisdictional definitions of the term affect the law’s constitutionality. The UK has had a clearly established definition of children’s rights since ratifying the UN Convention on the Rights of the Child; the US has not, and precedent on what those rights are for minors is ambiguous. The US has long established that children have First Amendment protections, and FOSI believes that those rights extend to accessing information online. But how these rights apply in a US and digital context remains to be seen.
Data Protection Impact Assessments
Data Protection Impact Assessments (DPIAs), among other obligations, require companies to assess their platforms for the risks they pose to minors, specifically “whether the design of the product, service, or feature could harm children including by exposing children to harmful, or potentially harmful content.” The most recent court ruling found that California’s law was overbroad in key definitions and requirements and would “force” companies to overmoderate legally protected speech. Part of the justification for finding the law likely unconstitutional hinges on the state’s inability to show how the DPIAs apply to the design of the website rather than its content.
Age Assurance
California’s law required platforms to either estimate the age of child users or treat all users as minors – impacting adults’ online experiences. At the time the law was passed and subsequently litigated, age assurance requirements were relatively rare for platforms. Now, more laws require these solutions, and the technology has caught up to the moment. It is possible that age assurance will be much less of an issue for future design codes and kids’ online safety laws more generally.
Impact
The California law is currently enjoined. In an effort to keep the law alive, California Attorney General Rob Bonta has appealed this decision. While the law will likely not go into effect, California’s early adoption of the law set the stage for the states that followed.
Maryland Moves in Another Direction
Maryland made waves as the second state to pass a Kids Code, deliberately defining key terms and excluding express age assurance requirements.
What platforms does the law cover?
The Maryland AADC is similar to California’s in this regard: it covers all platforms “reasonably likely to be accessed by minors.” This means that many of the large, popular platforms you or your child use may be included.
Age Assurance
Maryland did not expressly require that platforms conduct age assurance. Rather, the law requires platforms to apply increased privacy protections for minors if the platform meets any of six statutory criteria for determining that minors are likely to access it.
While the law does not expressly require age assurance, critics argue that companies may still be compelled to conduct it or else moderate otherwise protected speech.
Duty of Care
Maryland defines the “best interest of a child” to mean that a platform must be designed in a way that does not result in a set of specifically enumerated harms to minors.
The breadth of this definition may prove key to the law’s survival. By explicitly defining the duty of care, Maryland attempted to avoid the legal scrutiny California faced and to preempt individuals from using the law to censor content that kids are legally allowed to see.
Additional Tools Required
The Maryland AADC also requires platforms to provide easy-to-use tools that give users autonomy in their online experiences, a helpful addition for parents and families.
Impact
The Maryland Kids’ Code is technically in effect; however, NetChoice filed a complaint against the law earlier this year, arguing that it forces platforms to censor online speech, which leaves the law’s future uncertain.
Next Steps in Nebraska and Vermont
Representing the newest iterations of the AADC in the states, Nebraska and Vermont take strikingly different paths to the design code.
What platforms do they cover?
Where Nebraska and Vermont overlap is in their definitions of covered platforms. Both states’ laws apply only to platforms that derive more than 50% of their revenue from selling personal information. In contrast to Maryland’s and California’s versions, this is a narrow group of platforms and may not include large platforms that are subscription- or advertising-based.
Age Assurance
Nebraska’s law does not explicitly require an age assurance mechanism and instead uses an “actual knowledge” standard. This means that platforms must rely either on a user’s self-declared age or on information they already have about the user based on “marketing, advertising, or product development.”
Vermont, on the other hand, uses a standard similar to Maryland’s and covers platforms “reasonably likely to be accessed by minors,” as determined by four statutory criteria.
In both states, questions remain about the scope of the knowledge standard and how it will apply to the content shown to users and to the overall user experience.
Duty of Care
Nebraska does not include a duty of care requirement, eliminating this roadblock altogether and instead focusing on increased privacy and safety tools for minors.
Similar to Maryland, Vermont defines harm to a minor. Under its duty of care, “the use of the personal data of a covered minor and the design of an online service, product, or feature” must not result in a set of enumerated harms.
Additional Tools Required
One of the most impactful provisions for parents and families is the requirement to provide user controls. Nebraska outlines a number of tools that give minors options to manage their online experience.
Vermont requires all privacy and safety settings to be set to the highest level by default, and the law specifies several of these defaults.
Both Vermont and Nebraska require platforms to, by default, prohibit overnight notifications. These are positive requirements that enable safety by design for young users while still giving autonomy to older teens.
A Potential Path Forward
While the fate of the Age Appropriate Design Code in the US is uncertain, kids’ and caregivers’ online experiences should not hang in the balance. A number of bills have been introduced and laws passed that take a more targeted approach to specific harms, such as companion chatbot legislation or bills addressing algorithms. These approaches may be less vulnerable to First Amendment challenges.
Recommendations
While we have yet to crack the age appropriate design code, it is imperative that young people are centered, parents are informed, industry continues to innovate safely, and policymakers integrate research into their proposals. Below are ways that each of these stakeholders can support young people online.
With collaboration, research, and safety guardrails, a safer internet for young people is possible.