Greece plans to ban social media for children under the age of 15 from next year, joining a growing list of countries that are weighing or enforcing restrictions to shield younger users from what they see as potentially harmful content.
Greek Prime Minister Kyriakos Mitsotakis announced the plan in a TikTok video on Wednesday, saying the decision was difficult but necessary as children spent long hours glued to their screens and faced growing pressure to compare themselves to others.
“Greece will be among the first countries to take such an initiative, but I am sure that it won’t be the last,” Mitsotakis said. “Our goal is to push the European Union in this direction.”
The move, part of a broad crackdown to restrict social media access for younger users, comes months after Australia became the first country in the world to enact a social media ban for under-16s, triggering a lawsuit from Reddit.
Meta Platforms-owned Facebook and Instagram as well as Snapchat, TikTok, X and YouTube were included in the Australian ban, which garnered mixed reactions from parents, teenagers and influencers.
Since then, several governments around the world have considered or introduced legislation to ban social media for certain age groups of minors. Earlier this year, Spanish Prime Minister Pedro Sanchez said Madrid planned to regulate social media access for children under the age of 16 by rolling out age-verification checks.
Elsewhere in Europe, lawmakers across Germany, France, Italy, Austria, Slovenia, the Czech Republic, Bulgaria, Poland, Denmark, Norway, Finland and the U.K. have spoken in favor of restricting social media access for different age groups of youths. In Asia, Indonesia recently began restricting children under 16 from accessing social media, while in the U.S., Florida is enforcing a ban on social-media use under the age of 14.
The bans, and plans to curtail access to platforms, reflect growing criticism from governments and regulators, which say social-media companies aren't doing enough to protect younger users from potentially harmful content surfaced on their feeds by addictive algorithms.
Last month, the European Commission, the executive arm of the European Union, launched an investigation into Snapchat’s compliance with child-protection rules, saying the company might have exposed minors to grooming attempts, recruitment for criminal purposes and information on the sale of drugs or age-restricted products like alcohol and vapes.
Officials also said Snapchat relied on age self-declaration that didn’t effectively prevent children under the age of 13 from accessing its platform. The company said its platform was designed with privacy and safety built in from the start and that it would work closely with the commission throughout the investigation.
The EU has been piloting an age-verification app that it says enables users to prove that they are over 18 when they attempt to access adult content. That app is currently being tested with member states, online platforms and other third parties.
Write to Mauro Orru at mauro.orru@wsj.com and Aimee Look at aimee.look@wsj.com