California lawmakers passed first-of-its-kind legislation on Monday designed to improve online safety and privacy protections for children.
The bill, the California Age-Appropriate Design Code Act, will require firms such as TikTok, Instagram and YouTube to install guardrails for users under the age of 18, including defaulting to higher privacy settings for minors and refraining from collecting location data for those users.
It also requires companies to analyze their algorithms and products to determine how they may affect young users, assessing whether they are designed to be addictive or could cause additional harm to children.
Children’s safety advocates have applauded the bill, which passed in a vote of 33 to 0, saying similar federal legislation is needed to protect young users. The bill is “a huge step forward toward creating the internet that children and families deserve”, said Josh Golin, executive director at advocacy group Fairplay.
“For far too long, tech companies have treated their egregious privacy and safety issues as a PR problem to be addressed only through vague promises, obfuscations, and delays,” he said. “Now, tech platforms will be required to prioritize young Californians’ interests and wellbeing ahead of reckless growth and shareholder dividends.”
Meanwhile, some privacy advocates have raised concerns about the bill’s sweeping scope, noting it may require all users to verify their age and could restrict anonymous browsing online.
“The bill will dramatically degrade the internet experience for everyone and will empower a new censorship-focused regulator who has no interest or expertise in balancing complex and competing interests,” wrote Eric Goldman, a law professor and critic of the bill.
California is the first US state to pass legislation requiring audits of apps “likely to be accessed” by users under the age of 18. It comes after the state failed to pass a separate children’s online safety bill earlier in August. That bill, AB 2408, would have allowed companies to be sued for designing features that keep young users addicted to their apps.
The bill now faces a final vote before being sent to the governor, Gavin Newsom, to be signed into law. If enacted, it will take effect in 2024, and companies found in violation of the children’s protections could face fines of up to $7,500 per affected child.
The bill is modeled on similar legislation, the Age Appropriate Design Code, which went into effect in the UK in 2021. It comes as social media firms face growing scrutiny over their public health impact, particularly on their youngest and most vulnerable users.
In 2021, Meta whistleblower Frances Haugen revealed internal research at the Instagram parent company showing the app’s severe mental health impacts on teen users. Those revelations intensified calls from advocacy groups to strengthen protections for young users. Many have urged that the federal children’s online privacy law, known as Coppa, be updated to better protect children.
The passage of the California Age-Appropriate Design Code Act is “a monumental step toward protecting California kids online”, said Jim Steyer, founder and CEO of children’s online safety organization Common Sense Media, though he added that more action is needed.
The act “is only part of the change we need to better protect young people from the manipulative and dangerous practices online platforms employ today”, he said. “California legislators, and lawmakers around the country, need to follow up on this important development by enacting additional online privacy and platform accountability measures.”