Meta has unveiled a significant expansion of its teen safety features, extending the dedicated protections first introduced on Instagram to Facebook and Messenger. In a statement shared on its official Newsroom, the tech giant said the move was designed to “create a safer and more age-appropriate experience for younger people” across its platforms.
The new rollout introduces “teen accounts” on Facebook and Messenger, offering a suite of safety measures that already exist on Instagram. These features aim to minimise harmful interactions, restrict exposure to inappropriate content, and provide greater oversight for parents and guardians.
The enhanced features will initially be launched in the UK, US, Australia, and Canada, with global expansion expected in the coming months.
Key features for teen accounts
Teen accounts on Facebook and Messenger will include a variety of restrictions to ensure a safer social media environment:
- Messaging limits: Teens will be able to receive direct messages only from users they already follow or are connected with. Unsolicited contact from strangers will be blocked by default.
- Sensitive content filtering: The strictest content filters will be applied automatically to reduce the risk of viewing harmful or inappropriate material.
- Interaction restrictions: Only people a teen follows can tag or mention them in posts or comments. In addition, offensive language is automatically filtered out of interactions.
- Time limits and prompts: A daily usage reminder will appear after 60 minutes of use, encouraging users to take breaks and reflect on their screen time.
- Sleep mode: Automatically activated from 10 pm to 7 am, this feature silences notifications and sends auto-replies, helping teens to maintain healthier digital habits and sleep routines.
- Parental controls: Guardians will have the option to monitor usage, including viewing recent contacts, setting usage caps, and enforcing restricted access during specific hours.
New protections for Instagram teen accounts
Alongside the rollout across Facebook and Messenger, Meta is introducing new layers of protection specifically for teen users on Instagram.
One major change involves Instagram Live. Teen users under 16 will now require parental permission to go live on the platform. This move is intended to reduce the risks associated with livestreaming, particularly where it involves public visibility or potentially inappropriate engagement.
In addition, new DM protections will restrict how teen users interact with image-based messages. Meta announced that images suspected of containing nudity will be automatically blurred, and teens will need explicit parental approval to disable this filter.
These new controls will be rolled out “in the coming months,” according to the company.
A continuing commitment to safety
Meta has been under consistent scrutiny over the past few years regarding its handling of child and teen safety. Recent reports and testimonies in both the UK Parliament and US Congress have highlighted growing concerns around the mental health impact of social media, particularly for younger users.
By extending teen account protections and providing parents with more control, Meta hopes to regain trust and reassure users and regulators alike that it is taking these concerns seriously.
Antigone Davis, Meta’s Global Head of Safety, said in a statement: “We want teens to have safe, positive experiences across our platforms. These updates reflect our ongoing commitment to building spaces where young people can explore, connect and express themselves—while having robust protections in place.”
As part of the initiative, Meta also confirmed plans to continue working with child safety experts, advocacy groups and policymakers to refine and improve its approach to online safety.
For parents, educators and young users, the rollout of these teen-specific controls represents a proactive step in adapting social platforms to better support youth wellbeing in the digital age.