Molly Russell ‘liked’ posts that ‘glorified’ suicide before committing suicide, court hears

Tragic schoolgirl Molly Russell liked suicide videos of ‘the most harrowing nature’ before taking her own life, an inquest heard today.

The 14-year-old schoolgirl from Harrow, north-west London, researched self-harm and suicide online before she died in November 2017.

Executives from Meta, the parent company of Instagram and Facebook, and from Pinterest have flown in to testify in person at Molly’s inquest at the coroner’s court in North London.

The inquest will examine the algorithms used by social media companies to channel content to users and keep them hooked.

Molly’s family also wants the inquest to consider 29 internal Meta documents that allegedly contain research into the impact of online self-harm and suicide material on teenagers.

Before showing the footage, coroner Andrew Walker warned the court that the videos were “almost impossible to watch.”

Tragic schoolgirl Molly Russell (pictured above) liked ‘glamorous’ suicide videos of ‘the most disturbing nature’ before taking her own life, a London inquest was told today

Elizabeth Lagone, Meta’s head of health and wellness, arrives at Barnet Coroner’s Court

He said: ‘The video content can be edited, but Molly had no such choice. My opinion is that the video footage should be played on its own.

‘Be warned, the images glorify suicide. It is of the most disturbing nature. Watching is almost impossible.

‘I mainly say this to members of Molly’s family, but I think the video footage should be seen.’

Her family decided to stay in court while the videos played.

The social media posts, all ‘liked’ by Molly before her death, showed people falling from buildings, jumping in front of trains and hanging from nooses.

Some showed people cutting themselves with knives and even shooting themselves in the head.

Molly, from Harrow, liked disturbing videos she watched on social media, the court heard

The words “fat,” “worthless,” and “suicidal” flashed across the screen between videos to the background of aggressive music.

On Friday, the head of health and wellness at Instagram’s parent company Meta, Elizabeth Lagone, defended the social media platform’s content policy — saying that suicide and self-harm material could have been posted by a user as a “cry for help.”

Ms Lagone told the court that it was an important consideration for the company, even in its policy at the time of Molly’s death, to ‘consider the wide and unbelievable damage that could be done by silencing (a poster’s) struggle’.

Instagram’s guidelines at the time, shown to the court, said users could post content about suicide and self-harm to ‘facilitate coming together to support other users’, but not if it ‘encouraged or promoted’ it.

Ms Lagone then went to the witness stand, before the coroner said: ‘My premise is that the Internet is a very dangerous place for those who enter it.

“Every effort must be made to make that journey as safe as possible.”

When asked by family lawyer Oliver Sanders KC whether it was clear that it was not safe for children to see ‘graphic suicide images’, the director said: ‘I don’t know… these are complicated issues.’

Judson Hoffman, Pinterest’s global head of community operations, leaves court yesterday

Mr Sanders drew the witness’s attention to experts who had informed Meta that it was not safe for children to view the material, before asking: ‘Had they told you otherwise before?’

Ms Lagone replied: ‘We have ongoing discussions with them, but there are a number of… issues we are discussing with them.’

In response to Molly’s family’s claims about internal research, Ms Lagone told the court that she was not aware of any research being done by the tech giant into how content affects users of its platforms.

Coroner Andrew Walker asked the Meta manager about research into the impact of self-harm related content on users and asked if there had been any internal investigation.

She said: ‘I’m not aware of any specific research on the impact of content. That would be very difficult research from an ethical point of view.’

Ms Lagone later added: ‘We are confident that our policies take into account the needs of our youngest users.’

On Thursday, Pinterest’s global head of community operations, Judson Hoffman, apologised after admitting the platform was ‘not safe’ when the 14-year-old was using it.

Mr Hoffman said he ‘deeply regrets’ the posts Molly viewed on Pinterest before her death, saying it was material he ‘wouldn’t show to my kids’.

The inquest, expected to last up to two weeks, continues.