
Summary
- Google proposes a new framework that prioritizes appropriately designed products over mandatory age verification on the internet to keep minors safe from harmful content.
- The company believes age verification should be limited to higher-risk services like adult content, gambling, and alcohol, while broader products should follow age-appropriate design principles.
- Google acknowledges the need to strike a balance between protecting children and avoiding mass surveillance.
Child safety on the internet has become a major topic, with different regulators proposing different solutions to keep minors safe from harmful content they may come across on the web. Google is weighing in on this discussion with a new framework of its own that prioritizes appropriately designed products over mandatory age verification in many cases, limiting more intrusive identification measures to higher-risk services.
Google says in a blog post that it takes its responsibility seriously and that it supports regulation around child safety. To accelerate the process and shape it in a way that benefits itself, the company proposes its own framework, called the “Legislative Framework to Protect Children and Teens Online.”
The company makes clear that regulators have a fine line to walk, since it sees many cases where adults could be forced to share identification or personal information unnecessarily. Google is concerned that if regulation is scoped too broadly, age verification could be imposed on services that don’t really require it. At the same time, the company agrees that for highly sensitive, “high-risk” content like adult content, gambling, and alcohol, age verification is in order.
For other content, Google prefers an approach that favors “age-appropriate design principles,” meaning the company thinks it and other businesses should build products that are age-appropriate for everyone by design.
Google additionally notes that child safety and privacy groups are also advocating for getting the balance between child protection and surveillance right. At the same time, a less strict approach would likely also lessen the burden on Google’s own servers and privacy protection tools, since the company would have to collect less identifying and sensitive data to confirm ages.
Google has repeatedly come under regulatory scrutiny over the way it handles its users’ privacy, especially in regard to children. In 2019, the FTC fined Google $170 million in a settlement over targeted advertising aimed at minors, prompting the company to step up its game with YouTube Kids and further protection options.
The company has also received fines from European data protection agencies. For one, the way it handled its GDPR cookie banners was too complicated: rather than letting users easily hit a “Reject all” option for cookie and ad personalization, the company sent them into a convoluted menu to achieve the same, effectively failing to obtain users’ free consent to data collection. For another, regulators looked into complaints that it was difficult to understand how the company collects data and to access that trove of data, as GDPR requires.