
Instagram says it'll ban all graphic images of self-harm

The social network will also prevent nongraphic self-harm content from appearing in search, hashtags and the explore tab, says Instagram head Adam Mosseri.

By Abrar Al-Heeti, Technology Reporter

Instagram won't allow graphic images of self-harm, such as cutting, on its site, its leader wrote in a Thursday post.

The social platform will also prevent nongraphic self-harm-related content, such as images of healed scars, from showing up in search, hashtags and the explore tab. It also won't recommend that content.

"Up until now, we've focused most of our approach on trying to help the individual who is sharing their experiences around self-harm," Instagram head Adam Mosseri wrote. "We have allowed content that shows contemplation or admission of self-harm, because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them."

Instagram won't remove nongraphic self-harm content entirely because "we don't want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help," Mosseri said. The platform will also work to provide more resources to people who post and search for self-harm-related content and will direct them to organizations that can help, he added.

Instagram will continue to consult with experts on how best to approach the issue, which might include blurring nongraphic self-harm content behind a sensitivity screen so users won't see it unless they actively choose to, Mosseri said.

The new policy expands on measures Mosseri discussed in an op-ed in the Daily Telegraph on Monday, in which he said he'd do more to protect vulnerable users from seeing content promoting suicide and self-harm. This includes using sensitivity screens to obscure images depicting cutting, and being more supportive of people who post images indicating they might be struggling with these issues, he wrote. The piece touched on the death of British teenager Molly Russell, who took her life in 2017. Russell had used Instagram to engage with and post content about depression and suicide, leading her family to blame the social network for her death.

Experts such as the Centre for Mental Health and Save.org advised Instagram that although safe spaces for people to discuss their experiences are essential, graphic images of self-harm could unintentionally promote more harm, Mosseri said.

The goal of Instagram's new policy is to eliminate graphic self-harm or graphic suicide-related content from the platform, Mosseri said, and to reduce and eventually eliminate all self-harm and suicide images from hashtags, search, the explore tab and recommended content.  

"We will not be able to remove these images immediately and we must make sure that people posting self-harm related content do not lose their ability to express themselves and connect with help in their time of need," Mosseri wrote.
