To Protect Teens on Social Media, New York Targets the Algorithms

A bipartisan Senate bill includes a ban on algorithms for minors among its provisions.

Summary

A proposal would restrict algorithms but faces potential First Amendment challenges.

ALBANY, N.Y.—It’s not just Congress. States are taking on social media’s grip on teenagers, too.

Officials in New York are pushing to restrict the algorithms that power a platform’s feed, making it the latest state to attempt to rein in the big tech companies in the wake of federal inaction.

Other states have hit legal roadblocks by attempting to shield children from specific types of content or restrict minors from signing up for accounts. If New York is successful, it would offer other states a legal pathway to pursue.

“Children and teens are struggling,” New York Attorney General Letitia James, a Democrat, said. The proposal, she added, “will protect New York children and will be an example for others to follow.”

James and New York Gov. Kathy Hochul support a proposal to prohibit social-media companies from serving content to minors in the state using algorithms—unless apps such as Instagram and TikTok first obtain parental consent. They are also backing restrictions on when apps can send notifications to teen users and what companies can do with data collected from minors.

The Democratic governor said the measure would make social media less addictive, adding that heavy usage by teens has contributed to higher instances of mental illness. Industry groups have raised questions about the constitutionality of the proposal and said media literacy would have a more immediate impact.

The Albany showdown marks the latest clash between state officials and tech companies. Attorneys general from 41 states sued Meta Platforms in October, alleging the company intentionally built its Facebook and Instagram services with addictive features that harm young users. Meta told The Wall Street Journal in November that it didn’t design its products to be addictive for teens.

In Washington last week, senators from both political parties grilled tech CEOs over how their algorithms affect minors, and said they must bear more legal liability when children are harmed online. Several senators acknowledged that current federal laws don’t adequately address harms to children on the platforms. But federal lawmakers haven’t united around a single approach, leaving several bills waiting for a floor vote.

A bipartisan Senate bill includes a ban on algorithms for minors among its provisions. Another proposal, called the Kids Online Safety Act, would require that platforms give young users the option to disable algorithms and creates a duty for platforms to prevent and mitigate harms to minors. And a bill by Sens. Ed Markey (D., Mass.) and Bill Cassidy (R., La.) would tighten privacy protections for users under 18.

While none of the federal bills have been enacted, several states have passed various laws regulating social media. Tech companies have, so far, been successful in blocking those state efforts by citing the First Amendment’s free-speech protections.

Lawmakers in Arkansas, Ohio and Utah have banned minors from using social-media platforms without parental consent. NetChoice, a trade group whose members include TikTok and Meta, has sued in all three states and won injunctions in Arkansas and Ohio. Utah Gov. Spencer Cox last month pushed back the effective date of that state’s law to October from March “to incorporate more feedback.”

“At the end of the day, they’re a violation of the First Amendment,” Carl Szabo, NetChoice’s vice president and general counsel, said of the states’ laws. The group has also said federal proposals raise serious privacy and security concerns.

A dozen states last year enacted laws relating to children’s use of social media, including the Utah and Arkansas laws and other measures to create task forces or encourage media-literacy education, according to the National Conference of State Legislatures. There are more than 140 bills on the topic pending this year in at least 30 states, according to an NCSL tally.

Danny Weiss, a former congressional aide who is now chief advocacy officer for the advocacy group Common Sense Media, said the states are acting where Congress hasn’t. “There are no guardrails on social-media platforms when it comes to our kids,” Weiss said.

James said that New York’s version may survive legal scrutiny thanks to a key difference from other state proposals: It limits a delivery mechanism rather than regulating content itself. Similar legislation is also pending in Minnesota and South Carolina.

Andrew Gounardes, a 38-year-old state senator who sponsors the New York measure, said it would wind back the clock on social media to the days when he first signed up in college: users could select any sources they wished, and content would be presented in chronological order.

“If you want to follow the Taylor Swift fan page, that’s great,” he said. “What we don’t want is where you click on one thing, and in 15 minutes be shown self-harm videos.”

The algorithm provisions are linked to a broader state spending plan that must be acted on by March 31. Democrats control both houses of the state legislature and, with Hochul’s support, Gounardes and other legislative officials said social-media regulation is likely this year.

Kate Romalewski got a phone in middle school and was first attracted to Snapchat, which let her send pictures with funny filters to her friends. When Covid-19 shut down classrooms for her first year of high school, she downloaded TikTok and watched a video of an older girl talking about the shape of her hips.

“I realized I have hip dips—and I’d never even heard of them,” Romalewski, now 17, said. “Here’s something that I have, I can’t change, and thousands of people hate it.”

An advocate for the New York bill, Romalewski recently started intermittently deleting the Instagram and TikTok apps from her phone to help reduce her usage.

G.S. Hans, a Cornell Law School professor and associate director of the school’s First Amendment clinic, said the New York proposal could still present constitutional concerns.

“Even a law that doesn’t touch content but talks about how you arrange, present and prioritize content is going to face this question,” he said.

Meta said it uses algorithms to protect all users from inappropriate content and to help steer them toward supportive communities based on common interests. The company last month began to automatically restrict teen users from harmful content including videos and posts about self-harm, graphic violence and eating disorders.

“Teens move interchangeably between many websites and apps, and different laws in different states will mean teens and their parents have inconsistent experiences online,” a company spokeswoman said.

A TikTok spokesman said the company automatically sets a 60-minute screen-time limit for users between ages 13 and 18, and parents are able to set content limits on linked accounts.

Julie Samuels, president of the industry group Tech:NYC, said another problem with the New York proposal is that it lacks clear requirements for verifying users’ ages and the identities of their legal guardians, should they try to opt in, a criticism bill sponsors dispute.

“We have to live in reality in 2024, and we have to understand that we don’t know how to do that,” Samuels said. “We need to strike an appropriate balance.”

Write to Jimmy Vielkind at jimmy.vielkind@wsj.com
