To promote positive digital habits — and perhaps help parents enforce bedtimes — accounts for children ages 13 to 15 will not receive push notifications after 9 p.m., the company said. For teens ages 16 and 17, push notifications will be disabled after 10 p.m.
Other changes include requiring users between the ages of 16 and 17 to actively switch their settings to enable direct messaging. Those under 16 don’t have access to direct messages.
“We want to help teens make active decisions about their privacy settings, so when someone aged 16-17 joins TikTok, their direct messaging setting will now be set to ‘no one’ by default,” the company said in a statement. “Existing accounts … will receive a prompt asking them to review and confirm their privacy settings the next time they use this feature.”
In addition to messaging, users under age 16 who are trying to publish their first video will get a pop-up message to help them better understand their privacy options. They won’t be able to publish the video without selecting who will be allowed to see it.
TikTok is hugely popular among teenagers, and like most digital platforms it can be plagued by bullies and potentially harmful content.
More than 12,000 people in the United States recently signed an open letter asking TikTok to allow parents to view mirror accounts of what their kids see on the short-form video app.
“Parents play a critical role in keeping kids safe from online harms like cyberbullying, sexual predators, and violent or extremist content,” said the letter by the group ParentsTogether. “But parents can’t protect kids if they can’t see what their kids see.”
TikTok says it’s working with teens, community organizations and parents to implement more changes that build on its safety commitments.
People are required to be at least 13 to create a TikTok account, but it’s unclear how the company verifies users’ ages.
A TikTok spokesperson directed CNN to a section on its site that says users are required to fill in their complete birthdate to discourage them from clicking on a pre-populated minimum age.
“We also use other information as provided by our users, such as keywords and in-app reports from our community, to help surface potential underage accounts,” the company said. “When our safety team believes that an account may belong to an underage person, the account will be suspended.”
In the first quarter of this year, TikTok said it removed nearly 7.3 million suspected underage accounts.