Instagram to limit content for teenage users based on PG-13 ratings
The platform will hide certain content from teens, such as strong language and risky stunts.
Instagram is rolling out new protections for teenage accounts and introducing a system that limits what young users can see based on PG-13 movie ratings.
By default, teen Instagram users will be able to access content similar to what they might see in a PG-13 movie, Instagram said Tuesday. Users under 18 won’t be able to opt out without a parent’s permission, and parents will be able to place stricter restrictions on those accounts, the company said.
Instagram, which is owned by Meta Platforms, introduced new artificial intelligence systems earlier this year to detect when teens are claiming to be adults.
Under the new limits, Instagram said it would hide certain content like strong language, risky stunts and marijuana-related content from teen users. The new settings will be fully rolled out to teen accounts in the U.S., Canada, U.K. and Australia by the end of the year, the company said.
The new PG-13 restrictions also apply to AI bots that teens can interact with on the app. Responses from AI bots will be age-appropriate and shouldn’t feel out of place in a PG-13 movie, the company said.
Instagram has added more protections for teens in recent years, such as automatically making new accounts for those under 16 years old private.
On Monday, California Gov. Gavin Newsom signed legislation to strengthen the state’s online protections for children. The new law adds safeguards such as age verification for certain products and protocols to address suicide and self-harm.
“Without real guardrails, technology can also exploit, mislead, and endanger our kids,” Newsom said in a statement Monday. “We’ve seen some truly horrific and tragic examples of young people harmed by unregulated tech, and we won’t stand by while companies continue without necessary limits and accountability.”
The Wall Street Journal’s Facebook Files series in 2021 showed that internal research at the company found Instagram was harmful to some young users, primarily teenage girls with body-image concerns.
Facebook, which became Meta in 2021, scrapped plans to create an Instagram platform tailored to children after lawmakers raised concerns over the app’s impact on young people’s mental health.
Write to Joseph De Avila at joseph.deavila@wsj.com
