Meta is rolling out comprehensive safety measures for younger users across its social media ecosystem, building on the success of Instagram’s Teen Accounts, which launched last year. The company is now extending similar protections to Facebook and Messenger while strengthening existing safeguards on Instagram.
The enhanced safety framework automatically places users under 16 into Teen Accounts with strict default settings. These accounts limit contact from strangers, reduce exposure to sensitive content, and include tools to manage screen time. Any attempt to relax these restrictions requires explicit parental approval.
Instagram’s Teen Account implementation has proven effective, with Meta reporting that 97% of users aged 13-15 have kept the default protective settings. This success has prompted new Instagram-specific measures: teens under 16 now need parental authorization to start a live broadcast or to disable the automatic blurring of potentially inappropriate images in direct messages.
“We’re creating a safer environment where teens can explore social media with appropriate guardrails,” explained Meta’s Head of Youth Safety. “These measures balance protection with the independence young people need to develop digital literacy.”
The expanded protections will launch first in the United States, United Kingdom, Australia, and Canada before rolling out globally. Parent feedback has been overwhelmingly positive: Meta reports that 94% of parents find the features helpful and 85% say the features make it easier to support their teens online.
Meta currently oversees approximately 54 million active Teen Accounts worldwide. Their default protections include private profiles, silenced notifications overnight, reminders to take usage breaks, and restrictions on who can message teens.