YouTube to Launch AI Age Verification in US, Enhancing Child Safety | #childsafety | #kids | #children | #parents | #schoolsafety


YouTube, a subsidiary of Alphabet, is set to introduce an AI-driven age verification system to determine whether viewers in the United States are under 18 years old. The move comes as large technology companies face increasing pressure to strengthen online safety measures for children. The AI system will analyze signals such as video search history, frequently watched categories, and how long a YouTube account has been in use to estimate a viewer's age. The approach has already been deployed in other markets and will be rolled out in the United States starting August 13, initially for a small group of users.
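YouTube has not published how its model weighs these signals, but as a rough illustration, the sketch below shows how the inputs named above (search history, watched categories, and account age) could feed a simple age-estimation heuristic. The feature names, category list, and thresholds are hypothetical assumptions, not YouTube's actual model.

```python
from dataclasses import dataclass

# Hypothetical feature set based on the signals the article mentions:
# search history, frequently watched categories, and account age.
@dataclass
class ViewerSignals:
    search_terms: list[str]
    top_categories: list[str]
    account_age_days: int

# Illustrative set of categories assumed (for this sketch only) to skew young.
YOUTH_LEANING_CATEGORIES = {"gaming", "toys", "school", "animation"}

def estimate_is_minor(signals: ViewerSignals) -> bool:
    """Toy heuristic standing in for YouTube's unpublished ML estimator."""
    youth_hits = sum(
        1 for c in signals.top_categories if c.lower() in YOUTH_LEANING_CATEGORIES
    )
    # Newer accounts with mostly youth-leaning viewing are flagged as likely minors.
    return youth_hits >= 2 and signals.account_age_days < 365
```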

Once the system identifies a user as under 18, YouTube will activate youth account protections. These include non-personalized ads, take-a-break reminders, privacy alerts, and fewer recommendations for content that may be problematic if viewed repeatedly. If the AI system misjudges a user's age, that user can verify their actual age with a government-issued ID, a credit card, or an uploaded selfie. This ensures that only users who are estimated or proven to be over 18 can access age-restricted content.
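A minimal sketch of how those protections and the verification fallback might fit together is below. The settings object, field names, and function names are illustrative assumptions for this sketch, not YouTube's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    # Defaults reflect an adult account; a flagged account gets the protections below.
    personalized_ads: bool = True
    take_a_break_reminders: bool = False
    privacy_alerts: bool = False
    limit_repeated_sensitive_recommendations: bool = False
    age_restricted_content: bool = True

def apply_youth_protections(settings: AccountSettings) -> None:
    """Apply the protections the article lists for accounts flagged as under 18."""
    settings.personalized_ads = False
    settings.take_a_break_reminders = True
    settings.privacy_alerts = True
    settings.limit_repeated_sensitive_recommendations = True
    settings.age_restricted_content = False

def resolve_account(settings: AccountSettings,
                    estimated_minor: bool,
                    verified_over_18: bool) -> None:
    # Age-restricted content stays available only if the viewer is estimated or
    # proven (via ID, credit card, or selfie upload) to be over 18.
    if estimated_minor and not verified_over_18:
        apply_youth_protections(settings)

# Example: a flagged, unverified viewer ends up with youth protections enabled.
account = AccountSettings()
resolve_account(account, estimated_minor=True, verified_over_18=False)
```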

YouTube has acknowledged that some creators may notice changes in their youth audience, which could affect their ad revenue. However, the company expects the impact on most creators to be minimal. In February, YouTube's CEO outlined plans to expand the use of AI, including age estimation. Creators will also gain AI tools that help with tasks such as automatic dubbing into other languages and generating video titles and thumbnails.

YouTube will closely monitor the age-estimation feature before rolling it out more widely. The US tech industry is currently adapting to age verification regulations in multiple states and other countries, which require platforms to verify user ages to shield children from harmful content, including pornography. Child safety advocates emphasize that holding tech companies accountable for age verification is crucial to creating a safer online environment for minors.
