YouTube Denies Presence Of Child Sexual Abuse Material Amid Govt's Demands – BW Businessworld

YouTube on Monday said that it has not detected any child sexual abuse material (CSAM) on its platform. This statement comes after the Ministry of Electronics and Information Technology (MeitY) issued notices to various social media platforms, including YouTube, directing them to remove CSAM content from their platforms. Failure to comply with these requirements could result in the loss of safe harbor protection under Section 79 of the IT Act.

Minister of State for Electronics, Rajeev Chandrasekhar, emphasised the government’s determination to establish a safe and trusted internet under the IT Rules, which set strict expectations for social media intermediaries to prevent harmful or criminal posts on their platforms. If these intermediaries do not take swift action, they risk losing their safe harbor protection, with legal consequences under Indian law.

YouTube, in its formal response to Indian regulators, reiterated its commitment to child safety. The company stated that it has a long history of successfully fighting child exploitation on its platform. Under its child safety policy, YouTube has removed over 94,000 channels and more than 2.5 million videos during the second quarter of 2023.

“We have a long history of successfully fighting child exploitation on YouTube. Based on multiple thorough investigations, we did not detect CSAM on our platform, nor did we receive examples or evidence of CSAM on YouTube from regulators. No form of content that endangers minors is allowed on YouTube, and we will continue to heavily invest in the teams and technologies that detect, remove and deter the spread of this content. We are committed to work with all collaborators in the industry-wide fight to stop the spread of child sexual abuse material (CSAM),” said a YouTube spokesperson in a statement. 

To safeguard minors in India, YouTube has implemented various measures. It displays warnings at the top of search results for specific queries related to CSAM, informing users that child sexual abuse imagery is illegal and providing links to the National Cyber Crime Reporting Portal. The platform enforces an age requirement, allowing only users aged 13 and above, or younger users under parental supervision; accounts belonging to unsupervised users under 13 have been terminated. Furthermore, YouTube has disabled comments, restricted live features, and limited video recommendations that could expose minors to predatory attention.

YouTube said it also utilises technology such as CSAI Match, an API designed to identify re-uploads of previously identified CSAM within videos. The company extends its expertise and resources to smaller partners and NGOs to help combat child sexual exploitation on its platform. To encourage user involvement, YouTube runs the Priority Flagger initiative, urging users and NGOs to report any content that violates its policies.

