Strict controls will be enforced on sensitive content to avoid recommending potentially harmful material, and notifications will be muted during the night
Meta announced on Tuesday, September 17, that it is activating special accounts for minors on Instagram, which will have “built-in protection” for teenagers.
The new rules come after mounting pressure on social media companies to make their platforms safer for teenagers. The transfer of teens to the new category of teen accounts, Meta said, will be completed within 60 days in the US, Britain, Canada and Australia, and later this year in the European Union. The remaining countries will follow in January. Meta calls it a “new, parent-led experience for teens” and says it will better support parents, reassuring them that teens are safe with the right safeguards in place.
Meta’s announcement comes three years after the company abandoned plans for a teen version of Instagram following backlash from politicians and organizations. Users under the age of 16 will be able to change the default settings only with their parents’ permission. A range of settings will also be made available to parents to monitor what their children see and to limit their use of the app.
The new functions
In particular, teen accounts will change how Instagram works for users between the ages of 13 and 15, with certain settings turned on by default. Accounts will have strict controls on sensitive content to prevent recommendations of potentially harmful material, and notifications will be muted overnight.
Accounts will also be set to private rather than public, meaning teens will have to actively accept new followers, and the content of their accounts won’t be viewable by people who don’t follow them.