
California bills seek to minimize social media harms on children

The state could become the first to hold tech companies liable, but critics say the bills undermine free-speech protections

Assemblyman Jordan Cunningham votes for his social media bill at the Capitol in Sacramento, Calif., in May. Associated Press/Photo by Rich Pedroncelli, file


A recent short video released by the Dove Self-Esteem Project depicted teenage girls and their mothers watching social media influencers promote “toxic beauty advice,” from diet pills and lip-filler kits to teeth-filing tips and the benefits of “baby Botox.”

The video said social media has normalized harmful beauty advice for teen girls. “This stuff is on every girl’s feed,” one teen said. In an April survey, Dove found that one in two girls agreed that idealized beauty content on social media lowered their self-esteem.

In recent years, a growing number of parents, lawmakers, and even teenagers have expressed concern over the harms of social media to children. But opponents of legislative efforts addressing the problem say they would undermine free speech.

The push for federal and state regulation gained momentum last year after a former Facebook employee leaked internal company data revealing the ways its photo-sharing app Instagram contributed to worsening body image issues and higher rates of anxiety and depression among teen girls. The data, obtained by The Wall Street Journal, showed that 32 percent of girls said using Instagram made their body image worse and 17 percent said it exacerbated eating disorders.

Now, California lawmakers are considering two bills that would hold social media platforms accountable for their youth-oriented content, including algorithms and other features that track child users or market harmful or inappropriate content to them. The state could become the first in the nation to assign liability to social media companies for harming children who have become addicted to their platforms.

One of the bills would permit parents, guardians, and the state’s attorney general to sue platforms such as Instagram and TikTok for up to $25,000 per violation if they could prove a child was harmed physically, mentally, emotionally, or developmentally. The bill, dubbed the “Social Media Platform Duty to Children Act,” defines addiction as a user’s inability to “cease or reduce use of a social media platform” despite the intention to do so.

The other bill would require tech firms to follow an age-appropriate design code for websites and apps likely to be used by children. It would bar companies from collecting children’s online data, including terms they enter into search engines, to feed algorithms that advertise harmful content. Companies would be required to provide the highest privacy settings for children younger than 18.

Both bills passed the California Assembly with bipartisan support and now head to the state Senate.

State Assemblyman Jordan Cunningham, a Republican and co-author of the bills, said the measures would send a strong signal to tech companies that “the era of unfettered social media experimentation on children is over.”

But critics, including tech industry executives and libertarian groups, argue the bills would compromise online users’ rights and violate free-speech protections for internet companies. They say the broad language in the legislation would open tech companies to lawsuits for other users’ content and activity on their platforms.

“There’s a solid consensus that there’s a problem with social media for teens,” said Lee Tien, legislative director for the San Francisco–based Electronic Frontier Foundation, a digital rights group. “It’s one thing to identify a problem. … We’re in the very beginning stages of finding solutions that work and do not have unintended consequences.”

The Electronic Frontier Foundation said in an opposition letter that legislation involving children’s privacy should account for the differences between the average 17-year-old and a 7-year-old. Federal law, under the 1998 Children’s Online Privacy Protection Act, applies to children under age 13.

U.S. lawmakers have debated how to better protect kids’ online privacy and shield them from social media harms. During congressional hearings late last year, senators grilled tech executives from Facebook, Snapchat, Instagram, and TikTok, among others. Two U.S. senators in February introduced a bill that would hold tech companies responsible for harm they cause children.

This week, the head of the Federal Trade Commission (FTC) said the agency is working on actions and policies to protect children online, including penalizing education-technology companies that illegally surveil children when they go online to learn. The FTC in March required WW International, formerly named Weight Watchers, to delete information it collected illegally from kids under 13 and algorithms developed by its weight-loss app for children. WW International also paid a $1.5 million penalty.

In California, Common Sense Media CEO Jim Steyer says big tech companies will not change their practices on their own: “California can take an important step to force them to do the right thing for our kids, teens, and families.”

Mary Jackson

Mary is a book reviewer and senior writer for WORLD. She is a World Journalism Institute and Greenville University graduate who previously worked for the Lansing (Mich.) State Journal. Mary resides with her family in the San Francisco Bay area.



