Child Sexual Abuse Material (CSAM)

Source: Hindustan Times
GS II:  Welfare schemes for vulnerable sections of the population by the Centre and States and the performance of these schemes; mechanisms, laws, institutions and Bodies constituted for the protection and betterment of these vulnerable sections


Overview

  1. News in Brief
  2. Child Sexual Abuse Material (CSAM)
  3. Indian laws and regulations to stop CSAM

Why in the News?

YouTube, Telegram and X (formerly Twitter) have been directed by the Ministry of Electronics and Information Technology (MeitY) to proactively filter child sexual abuse material (CSAM) out of the Indian internet.

News in Brief


  • MeitY has issued notices to social media platforms, including Telegram, X (formerly Twitter), and YouTube, directing them to remove child sexual abuse material (CSAM) from their platforms in India.
  • MeitY also said that delay in complying with the notices will result in the withdrawal of their safe harbour protection under Section 79 of the IT Act.

Section 79 of the IT Act

A social media intermediary shall not be liable to legal action for any third-party information, data, or communication link made available or hosted by it. This immunity is, however, subject to the conditions laid down in Sections 79(2) and 79(3) of the Act.

  • The notices were sent a day after Hindustan Times had reached out to the government about such content being available on these platforms.
  • The notice also calls for the implementation of proactive measures to take down CSAM and emphasises the importance of prompt and permanent removal of CSAM.
  • Non-compliance with these requirements will be deemed a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021.

Rule 3(1)(b) Due diligence by an intermediary:

The rules and regulations, privacy policy or user agreement of the intermediary shall inform the user of its computer resource not to host, display, upload, modify, publish, transmit, store, update or share any information that belongs to another person and to which the user does not have any right, or that is harmful to children.

Rule 4(4) Additional due diligence to be observed by significant social media intermediary

A significant social media intermediary must use technology-based measures, including automated tools, to identify and prevent the sharing of content related to rape, child sexual abuse, or conduct that is identical to previously removed content. Users trying to access such content will be shown a notice indicating that the content falls into these categories.

Child Sexual Abuse Material (CSAM)


Child Sexual Abuse Material (CSAM), also known as child pornography, refers to any visual or written material that portrays minors engaged in explicit sexual activities. This illegal and harmful content has severe consequences for both the victims and society as a whole.

  • Traumatic Impact on Victims
    • The creation and dissemination of CSAM involve the sexual exploitation and abuse of children, causing severe emotional, psychological, and physical harm to the minors involved.
    • Victims often suffer long-lasting trauma, including post-traumatic stress disorder, depression, and anxiety.
  • Perpetuation of Abuse
    • The sharing and circulation of CSAM perpetuate the abuse of the child victims.
    • Once these images or videos are online, they can be replicated and redistributed countless times, causing ongoing harm to the victims as their abuse is relived repeatedly.
  • Normalization of Child Exploitation
    • The existence and availability of CSAM normalize child exploitation and contribute to the desensitization of society towards the sexual abuse of children.
    • This normalization can create a culture that tolerates and even condones such heinous acts.
  • Criminal Activity
    • The production, distribution, and possession of CSAM are illegal activities in most jurisdictions.
    • Those involved in these activities can face severe legal consequences, including imprisonment.
  • Online Safety Risks
    • CSAM is often distributed through online platforms, making the internet a dangerous place for children.
    • Minors may be lured or coerced into sharing explicit images of themselves, which can then be used as CSAM.
    • This puts children at risk of further exploitation and abuse.
  • Global Issue
    • CSAM is a global problem, with vast networks dedicated to its production and distribution.
    • International cooperation is essential to combat this issue effectively.
  • Damage to Society
    • The existence of CSAM damages the social fabric by eroding trust and causing widespread outrage.
    • It also diverts resources away from addressing other pressing issues.

Indian laws and regulations to stop CSAM

  • Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
    • These rules require significant social media intermediaries and digital platforms to deploy technology-based measures to proactively identify and remove CSAM and related content.
    • Failure to do so can result in penalties.
  • The Protection of Children from Sexual Offences (POCSO) Act, 2012
    • This law defines and criminalizes various sexual offences against children, including the creation, distribution, and possession of CSAM.
    • It provides for stringent punishments for offenders and aims to expedite legal proceedings to protect child victims.
    • POCSO e-Box: set up under the POCSO Act, 2012, it offers a simple, straightforward, and confidential way to report any occurrence of child sexual abuse.
  • Section 67B of the Information Technology (IT) Act, 2000
    • This section specifically addresses the punishment for publishing or transmitting material containing sexually explicit acts involving minors.
    • Offenders can face imprisonment and fines.
      • On first conviction: imprisonment of either description for a term which may extend to five years, and a fine which may extend to ten lakh rupees.
      • On second or subsequent conviction: imprisonment of either description for a term which may extend to seven years, and also a fine which may extend to ten lakh rupees.
  • Section 293 of the Indian Penal Code (IPC)
    • This section makes it illegal to sell, distribute, or publicly exhibit obscene materials involving minors.
    • Violators can be penalized:
      • On first conviction: imprisonment of either description for a term which may extend to three years, and a fine which may extend to two thousand rupees.
      • On second or subsequent conviction: imprisonment of either description for a term which may extend to seven years, and also a fine which may extend to five thousand rupees.
  • National Commission for the Protection of Child Rights (NCPCR)
    • The NCPCR is responsible for monitoring and ensuring the enforcement of child protection laws, including those related to CSAM.
  • Indian Cyber Crime Coordination Centre (I4C)
    • This organization works to combat cybercrimes, including CSAM-related offences.
    • It collaborates with law enforcement agencies to investigate and prosecute such cases.
    • The Cybercrime Prevention against Women and Children (CCPWC) scheme enabled the filing of cybercrime complaints pertaining to Child Pornography (CP) / Rape or Gang Rape (RGR) sexually abusive content only.
