London — A coroner in London concluded on Friday that social media was a factor in the death of 14-year-old Molly Russell, who took her own life in November 2017 after viewing a large amount of online content about self-harm and suicide on social media platforms, including Instagram and Pinterest.
“It is likely that the material seen by Molly…negatively and contributed to her death in a more than minimal way,” Coroner Andrew Walker was quoted as saying by British media on Friday. “It would not be safe to leave suicide as a conclusion. She died from an act of self-harm while suffering from depression and the negative effects of online content.”
Walker said he would prepare a “prevention of future deaths” report and write to Pinterest and Meta (Instagram’s parent company), as well as the British government and Ofcom, the UK communications regulator.
“The ruling should send shockwaves through Silicon Valley,” said Peter Wanless, chief executive of the British child protection charity NSPCC, in a statement. “Technology companies should expect to be held accountable when they put business decisions ahead of child safety. The magnitude of this moment for children around the world cannot be overstated.”
The conclusion came days after a senior Meta executive apologized at the coroner’s inquest for allowing Russell to view graphic Instagram posts about suicide and self-harm that should have been removed under the company’s own policies. But the executive also said that she considered some of the content Russell had viewed to be safe.
Elizabeth Lagone, Meta’s head of health and wellness policy, said at the inquest on Monday that Russell had “viewed some content that violated our policies and we’re sorry.”
Asked if she was sorry, Lagone said: “We’re sorry Molly saw content that violated our policies and we don’t want that on the platform.”
But when asked by the Russell family’s attorney whether material related to depression and self-harm was safe for children to view, Lagone replied, “Respectfully, I don’t think it’s a binary question,” adding that “some people might find comfort” in knowing that they are not alone.
She said that Instagram had consulted with experts who advised the company to “not seek to remove [types of content connected to self-harm and depression] because of the increased stigma and shame it can cause to people who are struggling.”
In a statement Friday, Pinterest said it was “committed to making continuous improvements to help ensure the platform is safe for everyone” and that the coroner’s report would be considered carefully.
“Over the past several years, we have continued to strengthen our policies around self-harm content, provided pathways to compassionate support for those in need, and invested heavily in creating new technologies that automatically identify and take action on self-harm content,” the company said, adding that the British teenager’s case had “reinforced our commitment to creating a safe and positive space for our Pinners.”
Meta said it was “committed to ensuring that Instagram is a positive experience for everyone, particularly teens, and we will carefully consider the coroner’s full report when it is provided. We will continue our work with the world’s leading independent experts to help ensure that the changes we make offer the best possible protection and support for teens.”
The inquest heard that 2,100 of the 16,000 pieces of online content Russell viewed during the last six months of her life were related to depression, self-harm and suicide. It also heard that Molly had created a Pinterest board with 469 images on related topics.
On Thursday, before the conclusion of the inquest, Walker, the coroner, said the case should serve as a catalyst to protect children from online risks.
“It used to be the case that when a child walked through the front door of their house, it was to a safe place,” Walker said. “With the Internet, we brought a source of risk into our homes, and we did so without appreciating the magnitude of that risk. And if there is a benefit to be derived from this inquest, it must be to recognize that risk and take steps to ensure that the risk that we have assumed in our home is kept completely away from children. This is an opportunity to make this part of the Internet safe, and we must not let it slip away. We must do it.”
At a news conference after the inquest concluded, Molly Russell’s father, Ian, said of social media that “people are misusing products and their products are not safe. That’s the monster that has been created, but it is a monster and we must do something about it to make it safe for our children in the future.”
When asked if he had a message for Meta CEO Mark Zuckerberg, he said: “Listen to the people who use your platform, listen to the conclusions that the coroner gave at this inquest, and then do something about it.”
If you or someone you know is experiencing emotional distress or a suicidal crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or dial 988.
For more information about mental health care resources and support, you can contact the National Alliance on Mental Illness (NAMI) Helpline, Monday through Friday, 10 a.m. to 6 p.m.