Social media age restrictions – early impact of reforms and teacher…


Australia introduced its social media minimum age (SMMA) restrictions at the end of last year, sparking widespread global interest. When the ban came into force on December 10, Teacher reported on what the changes would mean for young people and how educators could support students and families (Earp, 2025). With the eSafety Commissioner releasing an update on the first 3 months of implementation, this follow-up article explores the early impact of the reforms, areas of good practice and concern, and insights from educators and parents.

Under Australia’s social media minimum age (SMMA) requirements, the providers of age-restricted platforms must take reasonable steps to prevent children under the age of 16 from having accounts. The focus has been on 10 platforms: Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, Twitch, X (formerly Twitter) and YouTube.

One of the main takeaways from the SMMA 3-month update released by the national eSafety Commissioner (2026) is that as far as compliance goes there has been movement, but not enough; there are examples of good practice, but also significant concerns.

As of mid-January, some 4.7 million age-restricted accounts had been removed or restricted by platform providers – the number of individual users is lower, of course, as many children hold multiple accounts across different platforms, and the figure also includes inactive accounts. By the start of March, a further 300,000 age-restricted accounts had been blocked from gaining access.

The update also highlights areas of ‘good’ and ‘poor’ practice by platform providers. 

Good practices include providing links to support services for users who may be feeling distressed (a move in line with eSafety’s guidance), and applying temporary blocks when users enter a date of birth or age that places them under 16, so they can’t get around the system simply by retrying immediately.

Poor practices include platforms encouraging self-declared 14- and 15-year-olds to undergo age checks and offering facial age estimation (a method known to have higher error rates for those near the age threshold); allowing too many attempts at the same age assurance method (sometimes 10 or more, when the recommendation is 5); and barriers to making a report.

Survey data from parents and carers 

‘While the onus is on age-restricted platforms to take reasonable steps to keep children under 16 from having accounts, parents are proving pivotal partners in this cultural reset,’ eSafety Commissioner Julie Inman Grant says. ‘We have heard from parents who have said the law is empowering them to say no to requests by their kids to have social media accounts.’

In January and February 2026, eSafety surveyed almost 900 parents and carers of children aged 8 to 15 years to get their perspective. Half reported their child had an account with at least one social media platform before the restrictions came into force – this fell to 31% after the SMMA obligations. This pattern was seen across the 10 platforms.

Of those survey participants who said their child had an account on a given platform before the restrictions, 64% said their child still had an account on Facebook, 69% on Instagram, 69% on Snapchat and 69% on TikTok. Around half (49%) said their child still had an account on YouTube.

The survey also asked parents and carers for their perspective on why their child no longer had an account on any social media platform. ‘… the most common reason was platform-led deactivation (selected by 43.6% of parents whose child no longer had at least one social media accounts), followed by children deactivating their own accounts (36.3%), and parents or carers closing their child’s account (26.6%),’ the report shares.

Educator engagement and feedback

Noting the important role of educators, the report says: ‘While we have been pleased to hear from many different organisations across government, industry, not-for-profits and other sectors, one segment we were particularly keen to hear from was educators.’ 

To this end, the Commissioner’s National Online Safety Education Council (NOSEC) brings together representatives from across Australia’s school sectors. It’s a forum for raising awareness of and addressing challenges, and for sharing best practice, related to online safety education. NOSEC participants made 3 key observations in March:

  • Initial wellbeing concerns about how the SMMA obligation could affect vulnerable children, especially over the 2025-2026 summer break, were not observed.
  • Initial experiences of the SMMA obligation are mixed, with some children aged under 16 appearing to be relieved they are no longer on social media, and others seemingly celebrating their circumvention and retention of accounts.
  • Anecdotally, it seems children aged under 16 continue to have access to their accounts, and platforms are not taking down accounts that are reported (eSafety Commissioner, 2026).

Fears about reaching out for support

The eSafety Commissioner says that, in its consultations about SMMA, there was concern that young people might not reach out for support if something went wrong while they were on an age-restricted platform, for fear that their account would be removed. The advice is clear: the focus is on helping those who’ve experienced harm – taking down harmful content or stopping threats.

Educators can point students towards this dedicated page, which offers an important reminder: ‘No matter how old you are, if you have a harmful experience online you should reach out for support – even if you’re under 16 and it happens on social media. You won’t get into trouble for being on the platform. …If you report cyberbullying or image-based abuse (sharing, or threatened sharing, of intimate images or videos) and it happened on an age-restricted platform, we will not alert the platform that your account should be removed or deactivated to comply with the age-restrictions.’

It adds that there is still a chance the platform provider will find out the ages of those involved, and this may mean accounts of under-16s are removed or deactivated, but it’s always better to reach out for help so that you can be protected from harm.

Ongoing resources and research

That dedicated page is part of the eSafety Commissioner’s social media age restrictions hub, which includes information for educators, young people and families. There’s also a dedicated page on what the social media age restrictions mean for educators. And, since the ban came into force, the Commissioner has delivered 18 free webinars to more than 5,000 educators, parents, carers and other professionals working with young people.

In terms of further research insights, an evaluation of the impact of the SMMA is already underway: a major study will follow more than 4,000 children and families over more than 2 years. The initial findings will be released later this year, with further updates throughout 2027 and 2028.

Social media and technology use remains an ongoing topic for discussion, which means survey data from other organisations will continue to contribute to the picture. This week, NAB released an Education Insights Report focusing on student wellbeing (Pearson & De Iure, 2026). It shares findings from a 2025 survey of 400 school students across Australia in years 7-12.

While 62% of students said schoolwork, exams and grades were a cause of their worries (the top concern), 17% cited social media (ranked 11th on the list). Digging deeper, 58% of those surveyed agreed with the statement ‘Social media makes me feel more connected to what is going on in my friends’ lives’ and 52% agreed it makes them feel ‘included in things’, but 42% also agreed with the statement ‘Social media makes me feel overwhelmed because of all the issues going on in the world’, and 31% said it makes them feel ‘overwhelmed because of all the drama between my friends’ and ‘pressure to post or comment’.

The NAB report also discusses the SMMA restrictions. ‘While students were surveyed prior to the introduction of the new laws, it is telling that when asked about future parenting, the majority of students indicated they would enforce much stricter guidelines around their children’s screen time. Suggested rules included restrictions on social media use at bedtime and during meals, age-based access limits, and closer monitoring of online activity. Only about one in 7 students said they would choose not to impose any rules or limits for their children, highlighting a widespread desire among today’s teens for more boundaries and support around digital habits than they themselves currently experience,’ (Pearson & De Iure, 2026).

References

Earp, J. (2025, December 10). Social media age restrictions – how teachers can support students and parents. Teacher magazine. https://www.teachermagazine.com/au_en/articles/social-media-age-restrictions-how-teachers-can-support-students-and-parents

eSafety Commissioner. (n.d.). Social media: Age restrictions. eSafety Commissioner. https://www.esafety.gov.au/educators/social-media-age-restrictions

eSafety Commissioner. (2026). Social media minimum age: Compliance update, March 2026. eSafety Commissioner. https://www.esafety.gov.au/sites/default/files/2026-03/SocialMediaMinimumAgeComplianceUpdateMarch2026.pdf

Pearson, D., & De Iure, R. (2026). NAB Education Insights 2026 Highlights – Student Wellbeing. National Australia Bank. https://business.nab.com.au/tag/government–education-and-community/2026-nab-education-insights-report—wellbeing
