New Delhi: Citing the increase in the volume of explicit material involving children, the National Human Rights Commission has issued an advisory recommending that the term ‘child pornography’ be replaced with ‘child sexual abuse material’ (CSAM), that technology and platforms such as social media and OTT services be deployed to combat the issue, and that a database of such material, indexed by hash value, be created.
The four-part advisory, the Advisory for Protection of the Rights of Children against Production, Distribution and Consumption of Child Sexual Abuse Material (CSAM), was issued to all states and key ministries by the NHRC on Friday. The human rights body has asked the governments and departments concerned for an action-taken report within two months.
In its advisory, the NHRC has also asked for special police units at the Centre as well as in the states, to be funded from the Nirbhaya Fund among other sources of financial assistance.
The NHRC said that data from the National Center for Missing & Exploited Children shows that, of the more than 32 million CSAM reports received globally each year, 5.6 million came from India. The rights body said that 4,50,207 cases of CSAM have been reported in 2023 so far. In 2022, 2,04,056 cases were reported, up from 1,63,633 cases in 2021.
In the advisory, the NHRC has recommended that the phrase ‘child pornography’ in Section 2(1)(da) of the POCSO Act, 2012 be replaced with ‘Child Sexual Abuse Material’ (CSAM). Terms like ‘use of children in pornographic performances and materials’, ‘child sexual abuse material’ and ‘child sexual exploitation material’ should be preferred over ‘child pornography’. It also urged the government to redefine the term ‘sexually explicit’ in the IT Act, 2000 for better identification and removal of CSAM.
The NHRC has also recommended that the current quantum of punishment for CSAM offences under Section 14 of the POCSO Act and Section 67B of the IT Act, which amounts to seven years or less, “may be relooked or exempt the application of Section 41A CrPC by making appropriate legislative changes.”
It has asked intermediaries to develop a CSAM-specific policy that outlines an in-house reporting mechanism and details how such material will be removed from their platforms.
“Intermediaries, including Social Media Platforms, Over-The-Top (OTT) applications and Cloud Service Providers, must deploy technology, including content moderation algorithms, to proactively detect CSAM on their platforms and remove the same,” it said.
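The hash-value database and proactive detection the advisory envisions work by comparing the fingerprint of an uploaded file against fingerprints of previously identified material. The sketch below illustrates the basic idea only; it is not a description of any specific system, the hash set and file paths are hypothetical placeholders, and real deployments typically use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, rather than the exact-match cryptographic hash shown here:

```python
import hashlib

# Hypothetical database of SHA-256 hashes of previously identified
# material, as a clearing house might distribute; the value below is
# a placeholder, not a real entry.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks
    so that large uploads do not have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_material(path: str) -> bool:
    """Flag an upload whose hash matches the database exactly."""
    return file_sha256(path) in known_hashes
```

Because set membership is a constant-time lookup, a platform can screen every upload against millions of known hashes cheaply; the trade-off of an exact cryptographic hash is that any single-byte change to the file defeats the match, which is why production systems favour perceptual hashing.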
(Published 29 October 2023, 18:31 IST)