
Meta is taking another big step in teen safety by launching Teen Accounts on Facebook and Messenger. After first introducing the feature on Instagram last year, the company is now bringing the same built-in protections to younger users on its other platforms. The rollout starts in the U.S., U.K., Australia, and Canada, with plans to expand globally in the coming months.
The aim is to create a safer, more controlled online experience for teens without putting all the pressure on parents or depending on teens to change their settings themselves. Once a teen creates an account, they’re automatically placed into an experience that filters out inappropriate content and blocks unwanted contact from strangers. For instance, teens will only receive direct messages from people they follow or have previously chatted with. Similarly, only friends can reply to their stories or tag and mention them in posts and comments.
There’s also a focus on encouraging healthy screen habits. Meta is introducing reminders for teens to log off after an hour of use each day. It will also automatically enable “Quiet Mode” overnight to help discourage late-night scrolling.
On Instagram, the protections are getting even stronger. Teens under 16 won’t be able to go live unless they get parental approval. Moreover, any attempt to disable the DM nudity filter will also require a parent’s okay. These changes build on growing concerns around teen safety and mental health, especially after criticism from lawmakers and health officials about the impact of social media on young users.
To date, Meta says it has already transitioned 54 million teens to Teen Accounts on Instagram, with millions more expected as the features expand to other platforms. Impressively, 97% of teens aged 13–15 are keeping those protections turned on, showing that most aren’t rushing to opt out of the safety features.
A recent study Meta commissioned from research firm Ipsos also highlights strong support from parents. According to the survey, 94% of parents believe Teen Accounts are useful, and 85% feel the features make it easier to guide their kids toward more positive social media experiences.
Additional Insights:
- Digital Wellbeing Trend: Meta isn’t alone in this shift. TikTok, YouTube, and Snapchat have all started rolling out more teen-focused restrictions and parental controls. It’s part of a broader industry move driven by increasing pressure from regulators and advocacy groups.
- Regulatory Pushback: Several U.S. states have already started pushing laws that restrict teens from using social media without parental permission. Meanwhile, the U.S. Surgeon General has publicly warned about the risks of excessive screen time, especially for developing minds.
- AI Moderation and Content Filters: Behind the scenes, Meta continues to invest in AI-based moderation to detect and block harmful content. Teen Accounts will benefit from these advancements, helping prevent exposure to explicit, violent, or misleading posts.
- Parental Supervision Tools: While Meta’s current protections are largely automatic, the company is also pushing its “Family Center,” a set of tools that give parents visibility and control over their teen’s experience, including screen time tracking, message approvals, and content insights.
As social media becomes more central to young people’s lives, platforms like Facebook and Instagram are finally stepping up to balance freedom with safety. While it may not solve every concern, Teen Accounts are a solid start in giving both teens and parents more peace of mind online.