Molly Russell, a 14-year-old from London who died of self-inflicted injuries in 2017, didn’t die by suicide, according to a senior British coroner who examined her case.
“It would not be safe to leave suicide as a conclusion,” Andrew Walker said in a court hearing on Friday, at the end of a two-week inquest, the BBC reported. Instead, according to Walker, “she died from an act of self-harm while suffering from depression and the negative effects of online content.” Coroners are independent judicial officers in the United Kingdom, and the finding amounts to a legal ruling.
In the lead-up to her death, Russell viewed and interacted with more than 2,000 Instagram posts related to suicide, self-harm, or depression, according to a report from The Guardian. The paper also described hundreds of self-harm-related images found on Russell’s Pinterest account. Pinterest had reportedly sent the teen content recommendation emails with titles like “10 depression pins you might like.”
“It is likely that the above material viewed by Molly, already suffering with a depressive illness and vulnerable due to her age, affected her in a negative way and contributed to her death in a more than minimal way,” Walker said, according to The Guardian.
The inquest, which came five years after the teen’s death, had been delayed multiple times previously, in part because of content redaction requests from Meta, which owns Instagram.
In recent years, multiple families have sued technology companies over the alleged role that social networks have played in youth injuries and deaths, including at least three ongoing suits in the U.S.
Yet Friday’s ruling appears to be unique: the coroner’s conclusion marks the “first time globally” that content on a social media site has been determined to have directly contributed to a child’s death, Andrew Burrows, head of child safety online policy at the UK-based children’s charity NSPCC, told the Belfast Telegraph.
In a statement following the coroner’s conclusion, NSPCC’s CEO, Peter Wanless, warned, “This should send shockwaves through Silicon Valley - tech companies must expect to be held to account when they put the safety of children second to commercial decisions,” the BBC reported.
In an email to Gizmodo, a Meta spokesperson claimed that, based on self-reported data, Instagram removed 98% of all suicide- and self-harm-related content on the platform in the first three months of 2022 before it was reported by users. The spokesperson also linked to a parental support resource page.
“Our thoughts are with the Russell family and everyone who has been affected by this tragic death. We’re committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers, and we will carefully consider the coroner’s full report when he provides it,” the spokesperson said.
In an emailed statement, a Pinterest spokesperson told Gizmodo: “Our thoughts are with the Russell family. We’ve listened very carefully to everything that the Coroner and the family have said during the inquest. Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the Coroner’s report will be considered with care.”
Since Russell’s death, her family members have become dedicated advocates for online safety, using their platform to try to prevent the same tragedy from repeating itself.
Across the Atlantic Ocean, multiple families are pursuing legal action against social media companies along similar lines. In April, a Wisconsin family sued Snapchat and Meta over the death of a 17-year-old boy, claiming the companies “knowingly and purposely” create harmful and addicting products. One month later, the mother of a 10-year-old filed a lawsuit against TikTok over the so-called “Blackout Challenge,” which she claims killed her daughter. And in June, two parents in California cited the Facebook Papers in their lawsuit against Meta over their daughter’s eating disorder.
Research has demonstrated that social media can have a harmful impact on teens’ mental health, though what degree of legal liability this places at the companies’ feet has yet to be settled. Multiple recent studies have found links between increased time spent on social media and increased risk of anxiety, depression, and other mental health conditions in young people. Further, social media companies like Meta seem to be aware of the harm their products cause, according to internal documents.
If you or someone you know is having a crisis or contemplating suicide, please call or text the Suicide and Crisis Lifeline at 988. You can also call the National Suicide Prevention Lifeline at 800-273-8255 or text the Crisis Text Line at 741-741.