TikTok removes about 190,000 users from its platform each day for lying about being older than 13, a senate inquiry has been told.
The inquiry is examining how Australia’s law enforcement agencies tackle child exploitation, including online trends for explicit material and access to child abuse material.
It’s also considering the role technology providers have in assisting law enforcement agencies to combat child exploitation.
During Wednesday’s senate hearing, TikTok director of public policy Ella Woods-Joyce confirmed the online platform closed 17 million accounts in the last financial quarter of 2022.
That equates to about 190,000 users being banned each day.
The cancelled accounts were identified by TikTok’s 40,000 trust and safety professionals as being operated by people under the age of 13, in breach of TikTok’s eligibility policy for using the platform.
With more than 8.5 million Australians using the online video platform each month, Ms Woods-Joyce said online child safety was TikTok’s “top priority”.
“We have a zero tolerance approach for child exploitation as well as any content that may exploit or danger minors,” she said.
The inquiry was told users under the age of 15 didn’t have access to direct messages, couldn’t use the live feature and parents could link their own accounts to their children’s accounts to monitor their content.
Ms Woods-Joyce said TikTok alerted the appropriate law enforcement agencies whenever it confirmed child abuse material was present on its platform.
“We do this because we know law enforcement has finite resources, where we can detect an imminent risk,” she said.
“Our law enforcement outreach team maintains very good relationships with all law enforcements in Australia.
“Inside those teams we have specialised child safety teams.
“In the context of child sexual exploitation, we do have specialists who work to keep our community safe.”
TikTok alerted authorities in February to an incident in the ACT where a 50-year-old woman was accused of repeatedly sexually abusing her young granddaughter and posting the abuse material on TikTok.
The Australian Federal Police Child Protection Triage Unit began investigating the case after being alerted to newly produced child abuse material being uploaded to social media on January 27.
TikTok had reported the material to the Australian Centre To Counter Child Exploitation and subsequently assisted police with their investigation.
The investigation was handed over to the Sexual Assault and Child Abuse team and the ACT Joint Anti Child Exploitation Team before the woman’s arrest.
She remains before the court charged with a string of child exploitation offences.
Ms Woods-Joyce said this was an example of the work TikTok did daily to monitor any threats of child exploitation material on its platform.
“It speaks to the vigilance of our teams in identifying those underage users,” she said.
“We have over one billion users on our platform.
“We’re continually working hard to make sure they’re 13 (years old) and above and their experience is appropriate for their age.”
Senator David Shoebridge criticised TikTok’s sign-up safeguards, saying people could easily lie about their age when joining the platform.
“Something is seriously wrong if you have to scrap out 190,000 accounts of people identified under 12,” he said.
Ms Woods-Joyce said the platform had to obey privacy laws while also maintaining users’ safety.
“The balance for privacy is a big priority for online platforms like ours,” she said.
“We must always balance our privacy obligations with our safety obligations.
“We’re very invested into safety of our users.”