California bills to protect children show promise
California lawmakers are considering a pair of bills to regulate children’s use of online services. One is very promising, but the other could have unfortunate and unintended consequences.
Although I have some concerns about its details, I’m generally impressed with the California Age-Appropriate Design Code Act (AB 2273), but I have my doubts about the Social Media Platform Duty to Children Act (AB 2408). Both bills are co-sponsored by Assemblymembers Jordan Cunningham (R-San Luis Obispo) and Buffy Wicks (D-Oakland). You can listen to my interview with Assemblymember Wicks at connectsafely.org/wicks.
The California Age-Appropriate Design Code Act would protect minors under the age of 18, unlike the current federal law, the Children’s Online Privacy Protection Act (COPPA), which applies only to children under 13. And unlike COPPA, whose restrictions on collecting personally identifiable information effectively forced social media companies to ban children under 13, this bill does not eliminate a company’s right to collect such information outright. Instead, the California bill cleverly requires companies to apply “a high level of privacy protection” to teenagers by default, which some companies are already doing. It also requires companies to post their privacy information and terms of service prominently, in language “appropriate for children” who are likely to use the service. If it becomes law, the bill would also prohibit companies from using any information from people under 18 for purposes other than providing their services. This makes much more sense than COPPA’s outright ban on collecting personal information from children under 13.
One thing I like about this bill is that it requires a service to provide “a clear signal” to the child when they are being tracked or monitored, if the service has a feature that allows a “parent, guardian or other consumer of the child to monitor the child’s online activity or track his or her location.” I’ve long argued that parents shouldn’t use parental control or monitoring tools in stealth mode.
The California bill is inspired by the UK’s Age Appropriate Design Code, and because most social media companies that operate in California and other U.S. states also operate in the UK, many have already adopted parts of the UK code for their U.S. users.
As always, it is important to read the fine print and think about how this bill would be implemented. I have some concerns about the proposed task force, which would have regulatory powers. It would be appointed by the California Privacy Protection Agency, which is itself a new agency. Who would serve on this working group, and to whom would they ultimately be accountable? That brand-new agency is not fully staffed and has not yet promulgated any rules. It is important that the working group include children’s rights experts as well as child safety and child development experts. It is not uncommon for those who focus on child protection to take actions that unintentionally limit children’s rights. Many young people take to social media to explore and express their views on politics, religion, sexuality, health, and many other topics that are important to them.
I am also concerned that this bill targets services “likely to be accessed by a child.” I understand that the authors are trying to focus on companies whose content appeals to children even if they claim not to market to children, but “likely to be accessed” could cover an enormous range of services. The Super Bowl, for example, is watched by millions of children, but that hasn’t stopped TV networks from airing ads for alcoholic beverages during it. I know a kindergarten teacher who was unable to stream children’s music from her YouTube Music Premium account to her class because of YouTube’s overly cautious reaction to the Children’s Online Privacy Protection Act, which was designed to keep children’s personal information away from marketers, not to prevent teachers from playing music for their students. The content may have been intended for children, but the person playing it was a responsible adult.
The bill also states that “age verification methods must be proportionate to the risks arising from the company’s data management practices, protect privacy and be minimally invasive.” I agree, but it’s also important to understand that age verification is difficult in the United States, where many children don’t have government-issued IDs and privacy laws prohibit access to school and Social Security records. In 2008, I was part of the Internet Safety Technical Task Force which, after hearing from several experts on age verification, concluded that it was impractical in the context of American law and technology. Granted, artificial intelligence has advanced since then, which is worth a look, but figuring out whether someone is a child is trickier than it seems.
Finally, as with all state laws affecting internet use, there is the matter of state versus federal regulation. Because of California’s population and technological presence, its regulations will likely set a floor for business behavior in every state. But when states adopt rules that could contradict each other, it creates a confusing playing field for industry, regulators and users.
I have stronger concerns about the Social Media Platform Duty to Children Act. Maybe I’m quibbling over semantics, but I’m not even sure I agree with the bill’s premise that social media is “addictive.” Although some psychologists and psychiatrists believe it is, the official bodies that represent psychiatrists and psychologists do not classify it as such, though they recognize that obsessive use of technology can be problematic and harmful.
That said, I can’t argue with the bill’s proponents that many kids spend too much time on social media and find it hard to walk away from their devices. In fact, many adults do too, but there is a long tradition of laws that protect children from things that are legal for adults.
The operative part of the bill provides for a civil penalty of up to $250,000 if a social media platform uses features “that were known, or should have been known, by the platform to be addictive to children.” The service could “be liable for all damages to child users which are, in whole or in part, caused by the features of the platform, including, but not limited to, suicide, mental illness, eating disorders, emotional distress and costs of medical care, including care from licensed mental health professionals.” There are exclusions, so this bill doesn’t ban everything that makes social media sites appealing, but it still risks preventing companies from offering features kids love and should be able to use in moderation and, in some cases, with parental supervision.
I get it. Social media companies use techniques designed to keep people online longer, and some of them affect children as well as adults. But that’s true of just about any product. Many people consider sugar to be addictive, but that doesn’t stop companies from selling and marketing sugary candy to children. If a child becomes obese after eating an excessive amount of Ben & Jerry’s ice cream, should the company be responsible for the consequences to his or her physical and mental health? What if that child also eats a lot of Lay’s potato chips? Should PepsiCo, owner of Frito-Lay, also be sued? How do we know how many pounds the child gained from ice cream versus chips, and what about all the other aspects of the child’s life? Maybe someone should sue the school for not having a vigorous enough physical education program? Perhaps food companies should be forced to make their products unappealing to children to prevent overconsumption.
There are also people who think television is addictive, so what about shows with a cliffhanger at the end of an episode that compels you to watch the next one, even if it’s long past bedtime? By that definition, I’m addicted to almost every show I’ve binge-watched.
I don’t want to trivialize a serious problem. My nonprofit organization, ConnectSafely.org, has devoted many resources to helping families deal with problematic internet use, but the problem and its solutions are far more complex than simply limiting screen time or punishing social media companies for using features designed to keep people online longer.
I want to close by congratulating Assemblymembers Wicks and Cunningham on these two well-meaning bills. Both deserve full consideration, but it is important to focus on the details and possible unintended consequences. I look forward to seeing how these bills evolve.
Disclosure: Larry Magid is CEO of ConnectSafely.org, which receives financial support from Meta, Google and other technology companies that may be affected by these bills.