It’s for the children, they say.

A new California law promises to protect minors from harms posed by online platforms like Instagram, YouTube and TikTok. Instead, though, it threatens to increase censorship of controversial and politically sensitive speech, while slamming start-ups with immense costs and compromising the privacy of those it's meant to protect.

Set to take effect in 2024, the California Age-Appropriate Design Code Act doesn't specify the tangible harms it's meant to shield minors from. Nor does it empower parents with oversight of what their kids see online. Instead, it will use the threat of exorbitant fines to force big and small firms alike to identify and "mitigate harmful or potentially harmful" speech to minors, while requiring them to retool their algorithms to "prioritize" content that's in minors' "best interests" and supports their "well-being."

The inherently subjective nature of these terms means that companies will be forced to censor content based on what Big Brother or Big Bureaucracy thinks or says is harmful, while promoting content and speech the government approves of. Companies also face lawsuits if the attorney general isn't happy with how they enforce their own moderation standards. This could easily be weaponized by partisan AGs from either party to score political points by signaling the kinds of content they deem inappropriate for minors. In this respect, the law could encourage the kind of collusion between tech giants and the government to suppress or promote viewpoints or agendas that violates the First Amendment.

While the law's intention of protecting minors from age-inappropriate content is commendable, it has a critical flaw. It classifies everyone under 18 as a child, even minors who are nearly old enough to vote, be drafted or serve on juries. This overbroad definition and the threat of billions in fines mean that, regardless of what politicians or regulators choose to act on, companies are still likely to err on the side of censorship when it comes to age-appropriate content. That will likely mean shielding minors from important resources, including research on controversial subjects they might need for school or college projects.

It’s also hard to see how several of the bill’s features, including a ban on enabling auto-play for all videos shown to minors, have anything to do with protecting kids rather than merely undermining the functionality of online entertainment platforms.

But perhaps the Act's worst features are those around privacy. For one thing, it requires extensive paperwork, including privacy impact assessments and subjective "harm" assessments for new website features and how they could affect minors. This will mean increased costs for start-ups and delays in bringing new innovations to market for all users.

The law also imposes stricter identity and age verification requirements for minors, which would likely involve collecting and storing sensitive identity information and documentation. Cyberattacks have compromised the servers of even the world's top tech giants and governments, exposing millions of users' sensitive personal data to hackers. Forcing businesses, regardless of size and resources, to collect and store such data is a massive privacy risk for the very people the law claims to protect. These businesses, which vary widely in their data protection standards and capabilities, would become lucrative targets for hackers.

Recent news, from Balenciaga's advertising campaigns apparently showing children with teddy bears in bondage gear to internal studies linking Instagram use to self-harm and self-image issues among teens, rightly raises concerns about protecting minors online.

But targeted laws addressing these concrete problems and harms, paired with better education to help minors navigate the online world, would serve them far better than radical legislation that infantilizes teenagers, suppresses speech, compromises privacy, and risks making the internet less functional for everyone.

Originally published here
