Facebook’s Instagram ‘failed self-harm responsibilities’


Children’s charity the NSPCC has said a drop in Facebook’s removal of harmful content was a “significant failure in corporate responsibility”.

Facebook’s own records show its Instagram app removed nearly 80% less graphic content about suicide and self-harm between April and June this year than it did in the previous quarter.

Covid restrictions meant most of its content moderators were sent home.

Facebook said it prioritised the removal of the most harmful content.

Figures published on Thursday showed that as restrictions were lifted and moderators started to go back to work, the number of removals went back up to pre-Covid levels.

“We want to do everything we can to keep people safe on Instagram and we can report that from July to September we took action on 1.3m pieces of suicide and self-harm content, over 95% of which we found proactively,” said Instagram’s head of public policy Tara Hopkins in a statement.

“We’ve been clear about the impact of Covid-19 on our content-review capacity, so we’re encouraged that these latest numbers show we’re now taking action on even more content, thanks to improvements in our technology.

“We’re continuing to work with experts to improve our policies and we are in discussions with regulators and governments about how we can bring full use of our technology to the UK and EU so we can proactively find and remove more harmful suicide and self-harm posts.”

‘Not surprised’

After the death of the teenager Molly Russell, Facebook committed itself to taking down more graphic posts, pictures and even cartoons about self-harm and suicide.

But the NSPCC said the reduction in takedowns had “exposed young users to even greater risk of avoidable harm during the pandemic”.

The social network responded by saying that “despite this decrease we prioritised and took action on the most harmful content within this category”.

Chris Gray is an ex-Facebook moderator who is now involved in a legal dispute with the company.

“I’m not surprised at all,” he told the BBC.

“You take everybody out of the office and send them home, well who’s going to do the work?”

That leaves the automated systems in charge.

But they still miss posts, in some cases even when the creators themselves have added trigger warnings flagging that the images featured contain blood, scars and other forms of self-harm.

Trigger warning

Mr Gray says it is clear that the technology cannot cope.

“It’s chaos, as soon as the humans are out, we can see… there’s just way, way more self-harm, child exploitation, this kind of stuff on the platforms because there’s nobody there to deal with it.”

Facebook is also at odds with moderators about their working conditions.

More than 200 workers have signed an open letter to Mark Zuckerberg complaining about being forced back into offices which they consider unsafe.

The staff claimed the firm was “needlessly risking” their lives. Facebook has said many moderators are still working from home, and that it has “exceeded health guidance on keeping facilities safe” for those who do need to come in.
