
Protecting children from online Child Sexual Abuse Material (CSAM)

(GS Mains 2: Government policies and interventions for development in various sectors and issues arising out of their design and implementation.)

Context:

  • Recently, the Central Bureau of Investigation (CBI) conducted searches across States and Union Territories as part of a pan-India operation, “Megh Chakra”.
  • Operation Megh Chakra targets the online circulation and sharing of Child Sexual Abuse Material (CSAM) using cloud-based storage.
  • In November 2021, a similar exercise code-named “Operation Carbon” was launched by the CBI, with many accused booked under the IT Act, 2000.

Lack of automatic electronic monitoring:

  • In India, though viewing adult pornography in private is not an offence, seeking, browsing, downloading or exchanging child pornography is an offence punishable under the IT Act.
  • However, Internet Service Providers (ISPs) are exempted from liability for any third-party data if they do not initiate the transmission. 
  • As the public reporting of circulation of online CSAM is very low and there is no system of automatic electronic monitoring, India’s enforcement agencies are largely dependent on foreign agencies for the requisite information.

Case study models:

  • The National Center for Missing & Exploited Children (NCMEC), a non-profit organisation in the United States, operates a programme called CyberTipline, for public and electronic service providers (ESPs) to report instances of suspected child sexual exploitation. 
  • ISPs are mandated to report the identity and the location of individuals suspected of violating the law. 
  • In the United Kingdom, the mission of the Internet Watch Foundation (IWF), a non-profit organisation working to ensure a safe online environment for users with a particular focus on CSAM, includes disrupting the availability of CSAM and deleting such content hosted in the U.K.
  • The IWF engages analysts to actively search for criminal content rather than rely solely on reports from external sources.
  • Though the U.K. does not explicitly mandate the reporting of suspected CSAM, ISPs may be held responsible for third-party content if they host or cache such content on their servers.

India’s pathway:

  • In Shreya Singhal (2015), the Supreme Court of India read down Section 79(3)(b) of the IT Act to mean that an ISP must remove or disable access to illegal content only upon receiving actual knowledge in the form of a court order, or on being notified by the appropriate government.
  • Until then, ISPs are exempted from liability for any third-party information.
  • In the Kamlesh Vaswani (WP(C) 177/2013) case, the petitioner sought a complete ban on pornography. 
  • After the Court’s intervention, the advisory committee (constituted under Section 88 of the IT Act) issued orders in March 2015 to ISPs to disable nine (domain) URLs which hosted contents in violation of the morality and decency clause of Article 19(2) of the Constitution. The petition is still pending in the Supreme Court.
  • ‘Aarambh India’, a Mumbai-based non-governmental organisation, partnered with the IWF, and launched India’s first online reporting portal in September 2016 to report images and videos of child abuse. 
  • The Ministry of Home Affairs (MHA) launched a national cybercrime reporting portal in September 2018 for filing online complaints pertaining to child pornography and rape/gang rape.
  • Further, the National Crime Records Bureau (MHA) signed a memorandum of understanding with the NCMEC in April 2019 to receive CyberTipline reports to facilitate action against those who upload or share CSAM in India. 
  • The NCRB has received more than two million CyberTipline reports which have been forwarded to the States for legal action.

The recommendations:

  • The ad hoc Committee of the Rajya Sabha, headed by Jairam Ramesh, in its report of January 2020, made wide-ranging recommendations on ‘the alarming issue of pornography on social media and its effect on children and society as a whole’.
  • On the legislative front, the committee not only recommended the widening of the definition of ‘child pornography’ but also proactive monitoring, mandatory reporting and taking down or blocking CSAM by ISPs.
  • On the technical front, the committee recommended permitting the breaking of end-to-end encryption, and building partnerships with industry to develop tools using artificial intelligence for dark-web investigations.
  • Further, it recommended tracing the identity of users engaged in cryptocurrency transactions to purchase child pornography online, and liaising with financial service companies to prevent online payments for such purchases.

Way forward:

  • According to the ninth edition (2018) of the International Centre for Missing & Exploited Children’s report, “Child Sexual Abuse Material: Model Legislation & Global Review”, more than 30 countries now require mandatory reporting of CSAM by ISPs.
  • India also figures in this list, though its law does not explicitly provide for such mandatory reporting.
  • The Optional Protocol to the United Nations Convention on the Rights of the Child, which addresses child sexual exploitation, encourages state parties to establish the liability of legal persons.
  • Similarly, the Council of Europe’s Convention on Cybercrime and its Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse require member states to address the issue of corporate liability.
  • Thus, it is time India joined INHOPE and established its own hotline utilising Interpol’s secure IT infrastructure, or collaborated with ISPs and financial companies by establishing an independent facility such as the IWF or NCMEC.
  • The Jairam Ramesh committee’s recommendations must be followed up in earnest and the Prajwala case brought to a logical end.

Conclusion:

  • India needs to explore all options and adopt an appropriate strategy to fight the production and the spread of online CSAM. Children need to be saved.