
Meta Platforms unveiled a fresh set of limitations for Instagram users aged under 16 and commenced the rollout of restricted Facebook and Messenger accounts for teenagers.
Under the update to Instagram Teen Accounts, which launched in 2024, users below this age will be unable to start a live broadcast or turn off a feature that blurs images suspected of containing nudity in direct messages.
As with its other child protection settings, these can be reversed with parental permission.
The changes are set to take effect in the "next couple of months", the company wrote in a post announcing the moves.
Meta Platforms automatically places users under 18 into Teen Accounts, which apply a number of restrictions on the content they can see compared with other users. Those under 16 require parental consent to change the related settings.
The company stated that since their introduction, 97 per cent of users aged between its minimum age of 13 and 15 had kept the restrictions in place, with overwhelming parental support for the measures.
Citing this apparently positive response to its efforts to improve child safety on Instagram, the social media company plans to place similar restrictions on those using its Facebook and Messenger services.
The company positioned it as an attempt to make it “easier for parents to have peace of mind when it comes to their teens’ experiences across Meta’s apps”.
Teen Accounts on these platforms will initially be available in the US, UK, Australia and Canada before being extended to other markets.
The latest tightening of protections comes at a time of continued scrutiny from authorities, parental groups and activists over the role of social media companies in protecting the children using their apps, with some questioning whether the platforms are appropriate for young teenagers at all.