Meta has announced that Instagram is introducing a PG-13-style content rating system to give parents more control over what teens see on the platform.
The change represents one of the company’s most comprehensive efforts to date to align social media content moderation with the kind of age guidance long used for films. All users under 18 are automatically placed in a “13+” setting, modeled on the PG-13 rating used for films in the US. Teens can opt out only with a parent’s explicit permission.
The PG-13 system, developed in the United States more than four decades ago, has become a shorthand for content that is broadly suitable for teenagers but contains material that may be inappropriate for younger children. Meta said its new approach would reflect that framework online.
“While there are obvious differences between movies and social media, we made these changes so that teens’ experience in the 13+ setting is closer to the Instagram equivalent of watching a PG-13 movie,” Meta said. “We wanted to align our guidelines with an independent standard that parents are already familiar with.”
Instagram’s teen accounts already restrict sexually suggestive or graphic material and adult content such as tobacco and alcohol advertising. The new settings go a step further, tightening filters on strong language, risky stunts and imagery linked to harmful behavior, including posts depicting marijuana or drug paraphernalia.
Search results are also more tightly restricted. Keywords such as “alcohol” or “blood” – and even common misspellings of them – will be blocked under the new moderation system.
The approach was designed to be similar to the UK’s 12A cinema classification. Just as films like “Titanic” or “The Fast and the Furious” may feature fleeting nudity or moderate violence yet remain accessible to teenagers, Instagram’s new rules won’t ban every instance of partial nudity or stylized aggression.
Meta said the system will initially be rolled out in the US, UK, Australia and Canada before expanding to Europe and other regions early next year.
The move comes amid increasing scrutiny over Meta’s child safety record and the effectiveness of its moderation tools.
A recent independent investigation led by Arturo Béjar, a former senior Meta engineer turned whistleblower, concluded that 64% of Instagram’s new safety tools were ineffective. The study, conducted with researchers from New York University, Northeastern University and Britain’s Molly Rose Foundation, found that teen users are consistently exposed to harmful content.
Béjar said: “Children are not safe on Instagram.”
Meta dismissed the findings, insisting that parents already had “robust tools” in place to manage teens’ accounts and monitor activity.
Britain’s communications regulator Ofcom has also warned that social media companies must take a “safety-first approach” under the Online Safety Act, saying platforms that fail to protect children will face enforcement action and possible fines.
Child safety activists welcomed the intent behind the PG-13 system but questioned whether it would lead to meaningful change.
“Time and time again, Meta’s PR announcements fail to result in meaningful youth safety updates,” said Rowan Ferguson, policy manager at the Molly Rose Foundation. “As our recent report showed, they still have work to do to protect young people from the most harmful content. These further updates must be judged on their effectiveness – and that requires transparency and independent testing.”
Critics argue that parental controls can only be effective if they are easy to use and clearly communicated to families, while some digital rights advocates warn that excessive blocking could limit teenagers’ access to legitimate health or educational resources.
The introduction of a PG-13-style content standard reflects Meta’s broader strategy to move its platforms closer to traditional media norms in the face of increasing pressure from governments and regulators.
By adopting a well-known system from the film industry, Instagram wants to reassure parents that it takes responsibility for the well-being of its youngest users – and set a benchmark that other social platforms may now have to follow.