In a major move to enhance youth safety on its platform, Instagram is rolling out new content controls for all users under 18, automatically applying filters modeled on PG-13 movie ratings. The update, announced by parent company Meta, aims to provide stricter protection against mature themes, strong language, and risky behavior, aligning teens’ experience with a standard already familiar to parents.
All users under 18 will now be automatically placed into a “13+” setting that they cannot opt out of without parental consent. The new PG-13 framework will apply more stringent restrictions on the content teens can see in their feeds, Reels, Stories, and search results.
For parents seeking even tighter controls, Instagram is introducing a new, stricter “Limited Content” setting. This option filters out more content and, starting in 2026, will also restrict teens’ conversations with AI.
The update is Meta’s response to intense regulatory and public pressure over the impact of social media on adolescent mental health and safety. By aligning with the familiar PG-13 standard, Meta hopes to offer clarity and reassurance to parents. The move also builds on existing safety measures, such as the initial launch of teen accounts in 2024, and addresses criticism of past failures to prevent teens’ exposure to explicit and harmful content.
The rollout began on October 14, 2025, in the U.S., U.K., Australia, and Canada, with global expansion expected by the end of the year.
Key aspects to watch include how well the new filters and AI-driven restrictions work in practice, how parents respond to the new controls, how the changes affect creators’ reach, and whether these measures reduce exposure to harmful content without unduly stifling teens’ expression.