Among the 17m American children who use Instagram, the average time spent scrolling the app each day is 30 minutes. But for Kaley, a 20-year-old who started using social media aged six, it became an hours-a-day addiction. Spending time on Instagram, as well as YouTube, led to body dysmorphia and thoughts of self-harm, she claimed. On March 25th a jury in California agreed, ordering the apps’ parent companies, Meta and Google, to pay Kaley (whose full name has not been made public) $3m in personal damages.
The payout amounts to less than one-thousandth of a percent of the companies’ annual sales. But it threatens to do them far more harm. The novel legal argument used by Kaley’s lawyers may bring social networks to heel in a way that previous attempts have not. The firms are weighing their options—both have said they will appeal—but the ruling could be a turning point in how social apps are regulated.
Although this was the first time that Mark Zuckerberg, Meta’s boss, appeared before a jury, it was hardly the first attempt to sue social apps into changing their ways. In 2023 a case against Twitter, over its hosting of terrorist material, made it to the Supreme Court. But that case, like many others, went in favour of the tech industry. Section 230 of the Communications Decency Act of 1996 excuses social networks from liability for what their users post.
Kaley’s lawyers took a different approach. Rather than trying to hold Meta and Google responsible for the harmful content hosted on their platforms, they attacked them for the way the platforms are designed. They showed the jury internal company documents demonstrating that executives knew of their products’ harmful effects on children, and argued that features like auto-playing videos, personalised recommendations and infinite feeds were designed to lure youngsters.
The verdict could influence thousands of similar lawsuits that have been filed against Meta, Google and other social-media firms. (TikTok and Snap were part of Kaley’s complaint, but settled before the trial.) Some lawyers have compared the claims to the cases brought against tobacco companies in earlier decades, which led to widespread regulation of the industry.
America is not the only place where social apps are facing greater scrutiny. In February a preliminary ruling from the European Commission found TikTok in breach of the EU’s Digital Services Act owing to its “addictive” features. TikTok was told to change the design of its app or risk a fine of up to 6% of the global revenue of its Chinese owner, ByteDance. Reining in such features would probably reduce the amount of time spent on social apps, and thus the number of ads users could be served—and the profits to be gained.
Governments are focused in particular on protecting youngsters. In December Australia banned under-16s from using social networks; other countries, from Britain to Malaysia, are considering similar measures. A 30-country study last year by Ipsos, a pollster, asked whether under-14s should be excluded from social media, and found a majority in favour in every single country. The verdict in California may soon go viral.