Meta is placing strict limits on what teenagers can do on its social media platforms, starting with Instagram.
On Tuesday, the parent company of Facebook announced that all teen accounts on the photo-sharing app will be set to private by default.
The new tools, which include restrictions on messaging and recommended content, are designed to create a safer, more controlled experience for younger users, Meta said.
Ultimately, teens will be placed in their own bubble, shielded from strangers, where they can browse and post without the threat of bullying or exposure to harmful content.
What are Instagram teen accounts?
As part of the changes, the content teens share on Instagram will no longer be public, meaning that no one outside their circle of followers will be able to see it. Teens will also have to approve new follower requests before those followers can view their posts.
In addition, only people they follow will be able to tag or mention them in posts, further decreasing their visibility on Instagram. Meta will also automatically activate its most restrictive anti-bullying tool, “Hidden Words,” to filter out offensive language in comments and direct message requests.
The restrictions also extend to the types of recommended content younger people can see in algorithmically curated feeds like Explore and Reels. Meta says that teens won’t be shown videos or images that depict people fighting or promote cosmetic procedures.
They’ll also get a new feature that lets them select the topics they want to see more of in their feeds.
To cut down on screen time, teens will get notifications telling them to leave the app after 60 minutes of use each day.
A sleep mode will be enabled overnight, muting the app’s notifications and sending auto-replies to DMs.
Who will get assigned a teen account?
Starting today, new users aged 13 to 17 will be placed into teen accounts when they join Instagram. Existing users will start getting in-app notifications alerting them to the changes before they’re moved into the private accounts next week.
Teens under 16 will need their parents’ permission to loosen these restrictions, while 16- and 17-year-olds will be able to change the settings themselves. Parents worried about what older teens are getting up to can also turn on parental supervision to apply or remove the new settings. Soon, parents will also be able to adjust these settings directly to make them more protective, Meta said.
What parental controls are there?
In addition, Meta is expanding its Instagram supervision tools for parents. Now, they can see who their teen has messaged in the past seven days (though they can’t read the messages), set daily time limits on app usage, and block access during specific periods, such as at night. Parents can also view the topics their teen is interested in to ensure the content is age-appropriate.
Why is Meta doing this?
The move is part of Meta’s push to better protect children amid mounting legal pressure. In October, dozens of US states sued Meta and Instagram, accusing them of fueling a youth mental health crisis by making their platforms addictive.
The lawsuit claims features like algorithmic feeds manipulate “dopamine releases” in children, similar to slot machines, to keep them engaged.
Meta is also under scrutiny in Europe, with the European Commission demanding answers on how the firm safeguards children from harmful content. Concerns have remained high since the tragic death of 14-year-old Molly Russell in 2017, who had been exposed to harmful content online.
Since then, Meta says it has introduced more than 30 tools to support teens and their families on its platforms. They include restrictions on harmful content, DMs, and notifications.