The Facebook-owned image and video sharing service early this year clamped down on images of self-injury after a British teen who went online to read about suicide took her own life.
“We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery,” Instagram chief Adam Mosseri said in a blog post.
“We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”
Instagram has never allowed posts that promote or encourage suicide or self-harm.
With the rule change early this year, Instagram began removing references to non-graphic content related to people hurting themselves from its searches and recommendation features.
It also banned hashtags — words prefixed with a “#” that group posts by topic — relating to self-harm.
The measures were meant to make such images harder to find for depressed teens who might have suicidal tendencies.
British teen Molly Russell took her own life in her bedroom in 2017. The 14-year-old’s social media history revealed that she followed accounts about depression and suicide.
The case sparked a vigorous debate in Britain about parental control and state regulation of children’s social media use.
People making self-harm-related searches on Instagram will be directed to online resources or local hotlines, such as Samaritans and Papyrus in Britain or the National Suicide Prevention Lifeline in the US, according to Mosseri.
“The tragic reality is that some young people are influenced in a negative way by what they see online, and as a result they might hurt themselves,” Mosseri said in the post.
“This is a real risk.”
Instagram reported that in the three months following the policy change, the service “reduced the visibility of, or added sensitivity screens” to more than 834,000 pieces of content.