December 9, 2022

An Instagram moderator has accused the social media giant of ‘cutting costs’ when it comes to prioritizing safety in the wake of the Molly Russell inquest.

Speaking on condition of anonymity, she said the multi-billion pound firm could do ‘much, much better’ in protecting children on the platform.

The moderator, who started the job a year after the 14-year-old took her own life, said most of those posting self-harming content appeared to be young teenage girls.

She admitted she was ‘not qualified’ to deal with mental health issues and told how colleagues had been ‘concerned’ by the large number of harmful posts allowed to remain on the site.

Instagram has since banned all self-harm and suicide content, which the moderator described as a ‘positive’ step that made the job ‘easier’.

This meant staff no longer had to ‘consider whether the amount of wounds and blood’ made it necessary to take a post down, or ‘try to determine whether the user was in danger or not’.

In an interview with the Daily Mail, she said: ‘But I think they could still do much, much better – and I don’t think they prioritize it in terms of money. I sometimes feel they are cutting costs.’

On Friday, Molly Russell became the first child in the world whose death social media was formally found to have contributed to.

The inquest heard how the 14-year-old engaged with thousands of self-harm posts and watched suicide videos on Instagram before her death in 2017.

The company’s head of health and wellbeing, Liz Lagone, claimed the platform was ‘safe’ for children and defended its policy of leaving such content up if it was an ‘admission’ of self-harm.

Shown the content Molly had interacted with, Ms Lagone claimed it did not ‘encourage’ self-harm and instead benefited users by allowing them to ‘express themselves’.

The social media giant changed its policy in 2019 to ban all such material following expert advice that it could ‘unintentionally’ encourage users to take such actions.

Now, for the first time, an Instagram moderator has given an insight into how the platform handled material that the coroner ruled ‘contributed to [Molly’s] death in a more than minimal way’.

The moderator, who moved to another job at Meta a few weeks ago, told how she had to get through about 200 accounts a day – giving her about one to two minutes on each.

Instagram encouraged moderators to ‘take our time’ when it came to dangerous content such as self-harm, she said, but there was still a ‘huge time pressure’.

‘The magnitude of what we were doing could sometimes feel overwhelming.’

Having studied for a language degree and spent her career until then doing ‘basic accounting’ in an office, she said she had no experience or training in dealing with mental health issues.

‘I don’t think I felt qualified. I just did what had to be done,’ she said. ‘We had guidelines. If there was any indication that the user had posted something as a goodbye or farewell, or posted a picture of something that might be a method of suicide, and it was within 24 hours, we would escalate it to law enforcement.’

‘But it was up to us to make that call. Sometimes it can be hard to tell. I always tried to err on the side of caution.’

Oliver Sanders KC, the Russell family’s lawyer, asked Ms Lagone at the inquest why it was for Instagram to create a platform where users could share their experiences of self-harm and suicide, and then to decide whether this was helpful for them – particularly for viewers under 18.

Ian Russell, Molly’s father, during a press conference in north London after the inquest

The moderator said the guidelines before the change had been ‘confusing’ and ‘unintuitive’ – and she and several colleagues had become ‘concerned’ about how much was left on the platform.

‘I felt there was too much graphic stuff – and it had no reason to be there. But it stayed up because it was admission.’

Most of the self-harming content was shared by teenage girls, she said, and many appeared to be a year or so younger than the platform’s minimum age limit of 13.

Often they would have two accounts – one public-facing and another more private profile that would share the more dangerous content. This content would often relate to eating disorders – with some posting stories from their hospital beds, she said.

The 2019 policy change marked the first time Instagram appeared to take moderation ‘more seriously’ and put ‘more resources’ into it, she said.

But when asked if she thought Instagram was safe for teenagers, the moderator said: ‘I think there is no such thing as safe social media – safety is an illusion.’

The Daily Mail was put in touch with the moderator by Foxglove, which works with moderators of social media content to fight for better conditions and pay.

Meta said it invested ‘billions of dollars every year’ in its moderation teams and technology to keep its platform safe.

It said technology is now playing a ‘more central role’ in proactively detecting harmful content and prioritizing the ‘most critical content to be reviewed by people’.

Anyone who reviews the content goes through an ‘in-depth, multi-week training program on our community guidelines and has access to extensive psychological support to ensure their well-being’.

For help or support, visit samaritans.org or call Samaritans free on 116 123

Grieving parents back the Russell family’s fight

Grieving parents have joined Molly Russell’s father’s fight to stop social media giants making a fortune while providing children with images that promote suicide and self-harm.

Architect Mariano Janin, whose daughter Mia, 14, died after being bullied online, and Ruth Moss, whose daughter Sophie Parkinson took her own life at 13 after viewing disturbing internet content, both said yesterday that they supported the TV director Ian Russell, 59.

Janin, 58, from north London, told The Sunday Times: ‘Ian achieved something important … but we now have to continue his campaign. I watched the inquest with tears in my eyes. I also know the pain Ian has had for the last five years because it has happened to me too.’ Mia’s inquest will continue next year.

Miss Moss, whose daughter Sophie died eight years ago, said she wants online giants – including individual bosses personally – to be prosecuted for the role they played, adding: ‘Very little has changed since Sophie died.’