An incident from three years ago remains a pressing concern, shining a harsh light on the darker side of social media. Stephanie Mistre, a resident of Cassis in southern France, found her 15-year-old daughter, Marie, lifeless in her bedroom in September 2021. Marie had taken her own life, leaving behind a devastated family and a trail of questions about the role of TikTok’s algorithm in her death.
Delving into her daughter’s phone after the tragedy, Mistre was horrified to find an array of videos promoting suicide methods, tutorials, and comments that encouraged users to go beyond “mere suicide attempts.” According to Mistre, TikTok’s algorithm had repeatedly pushed this harmful content to her daughter, normalising self-harm and creating a dangerous sense of belonging around depression.
“It was brainwashing,” said Mistre. “They normalised despair and self-harm, turning it into a twisted community for vulnerable children.” Now, she and six other families have taken legal action against TikTok France, accusing the platform of failing to adequately moderate harmful content and exposing young users to life-threatening material. Out of the seven families, two have lost children to suicide.
TikTok, a popular app owned by Chinese company ByteDance, has denied responsibility, stating that its guidelines prohibit content promoting suicide or self-harm. The company claims to employ 40,000 trust and safety professionals globally, including hundreds of French-speaking moderators, and refers users searching for suicide-related content to mental health services. Despite these assurances, critics argue that the platform’s moderation efforts are insufficient.
Marie’s death has been tied to multiple factors, including bullying at school and online harassment. However, her mother places much of the blame on TikTok, comparing the app’s algorithm to a ticking time bomb for empathetic and sensitive teenagers. She also noted that before her death, Marie had created videos explaining her decision, referencing personal difficulties and quoting lyrics from the emo rap group Suicideboys, who are popular on TikTok.
The legal battle against TikTok focuses on the platform’s algorithm, which the families’ lawyer, Laure Boutron-Marmion, alleges traps vulnerable users in cycles of despair for profit. The lawsuit claims that TikTok’s algorithm is designed to exploit children’s vulnerabilities, turning them into “lucrative re-engagement products.”
“Their strategy is insidious,” Mistre said. “They hook children into depressive content to keep them engaged.” The families’ case is supported by extensive evidence, according to Boutron-Marmion, who argues that TikTok can no longer evade responsibility by claiming it does not create the content.
Critics of social media platforms, including TikTok, have highlighted the phenomenon of “algospeak”—a method users employ to bypass moderation by using coded language or emojis. For instance, some users use a zebra emoji to discuss self-harm or a Swiss flag emoji as an allusion to suicide. While these codes may not seem sophisticated, Imran Ahmed, CEO of the Centre for Countering Digital Hate, argues that platforms could identify them if they made greater efforts.
Ahmed’s organisation conducted a 2022 study simulating the experience of a 13-year-old girl on TikTok. Within just two and a half minutes, the algorithm began serving self-harm content, followed by eating disorder content within eight minutes. On average, harmful material was suggested every 39 seconds.
“The algorithm knows that eating disorder and self-harm content is especially addictive for young girls,” Ahmed said. He dismissed TikTok’s claim that over 98.8 per cent of harmful videos had been flagged and removed, stating that the platform’s efforts are insufficient to tackle the scale of the problem.
Scientists, however, caution against attributing mental health crises solely to social media. Grégoire Borst, a professor of psychology and cognitive neuroscience at Paris-Cité University, noted that it is difficult to establish a direct cause-and-effect relationship between social media use and mental health issues. He cited a peer-reviewed study showing that only 0.4 per cent of variations in teenagers’ well-being could be attributed to social media.
Borst also pointed out that most teenagers use social media without significant harm. However, those already facing challenges—such as bullying or family instability—are more vulnerable to the negative effects of harmful content. “When teenagers already feel bad about themselves and are exposed to distorted images or harmful social comparisons, it can worsen their mental state,” he said.
Comparisons with TikTok’s Chinese counterpart, Douyin, have further fuelled criticism. Douyin imposes strict content controls for users under 14, including a mandatory “youth mode” that limits screen time to 40 minutes a day and provides only approved educational and uplifting content. Boutron-Marmion argues that the absence of similar safeguards in other regions highlights TikTok’s prioritisation of profit over safety.
The lawsuit against TikTok France reflects broader global concerns about the impact of social media algorithms on mental health. Similar legal actions have been initiated in the United States, where families have accused platforms like Meta’s Instagram and Facebook, as well as Snapchat and TikTok, of contributing to teenage suicides.
In France, the government has also recognised the potential dangers of social media algorithms. A report commissioned by President Emmanuel Macron in April 2024 recommended banning certain addictive algorithmic features and restricting social media access for minors under 15. However, these measures have yet to be implemented.
Whether TikTok’s algorithm constitutes “brainwashing” or whether the problem lies elsewhere, the question remains: how can society protect its most vulnerable members in the digital age?
(With inputs from AP)