Videos shown at an inquest into the death of schoolgirl Molly Russell were so disturbing that the coroner considered editing them, and he issued the strongest possible warning before they were played.
The 14-year-old from Harrow, in north-west London, took her own life in November 2017 after watching content online about self-harm, depression and suicide.
The inquest into her death, at North London Coroner’s Court, was shown 17 clips she had liked or saved on Instagram which appeared to glamorise harm to young people.
Before the clips played, coroner Andrew Walker told those in attendance to leave if they were likely to be affected by the material.
The court was told that lawyers and the coroner had discussed beforehand whether they should be edited because they were “so uncomfortable to watch”.
“But Molly had no such choice, so we would in effect be editing the footage for adult viewing when it was available in its raw form to a child,” said Mr Walker.
Describing images the court was about to see, the coroner said: “It is of the most disturbing nature and it is almost impossible to look at.
“If you are likely to be affected by such videos, do not watch them.”
The coroner turned to Molly’s family and said: “You don’t have to stay.
“In my opinion, this series of video images should be seen [by the court].”
The court then played the clips, which involved suicide, drugs, alcohol, depression and self-harm.
Molly’s family remained in court while the videos played, but the coroner chose to take a 15-minute break from the proceedings afterwards.
The schoolgirl’s family has been campaigning for better internet safety since her death nearly five years ago.
Instagram’s guidelines at the time, shown to the court, said that users could post content about suicide and self-harm to “facilitate coming together to support other users”, but not if it “encouraged or promoted” self-harm.
On Friday, the head of health and wellbeing at Instagram’s parent company Meta defended the platform’s content policies, saying suicide and self-harm material could have been posted by a user as a “cry for help”.
Elizabeth Lagone told the court it had been important for the company, even in its policies at the time of Molly’s death, to “consider the broad and unbelievable harm that can be done by silencing [a poster’s] struggles”.
Ms Lagone also denied that Instagram had treated children like Molly as “guinea pigs” when it launched content ranking – a new algorithm-driven system for personalising and sorting content – in 2016.
The Russell family’s barrister, Oliver Sanders KC, said: “Isn’t it true that children, including children suffering from depression like Molly, who were on Instagram in 2016 were just guinea pigs in an experiment?”
She replied: “That’s specifically not the way we develop policies and procedures at the company.”
Asked by Mr Sanders whether it was obvious that it was not safe for children to see “graphic suicide images”, she said: “I don’t know … these are complicated issues.”
Mr Sanders drew the witness’s attention to experts who had advised Meta that it was not safe for children to view the material, before asking: “Had they told you otherwise before?”
Ms Lagone replied: “We have ongoing discussions with them, but there are some … issues we are talking to them about.”
The court heard that Molly set up an Instagram account in March 2015, when she was 12, and was recommended 34, “possibly more”, sad or depression-related accounts on Instagram.
Of the recommended accounts, Mr Sanders said one referred to self-injury, one to concealment, four to suicidal thoughts, one to themes of “not being able to go on”, two to mortality and one to burial.
On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when the 14-year-old was using it.
Mr Hoffman said he “deeply regrets” posts Molly viewed on Pinterest before her death, saying it was material he “wouldn’t show to my kids”.
The inquest, which is expected to last up to two weeks, continues.
If you are experiencing feelings of distress and isolation, or are struggling to cope, The Samaritans offers support; you can speak to someone free of charge and in confidence on 116 123 (UK and ROI), email [email protected] or visit the Samaritans website to find details of your nearest branch.
For services near you, enter your postcode into the national mental health database – Hub of Hope – to search for organisations and charities offering mental health advice and support in your area.
Additional reporting by the Press Association