Sputnik: Instagram's boss, Adam Mosseri, has announced that all graphic images of self-harm will be removed from the social media platform following the death of 14-year-old Molly Russell, who took her own life in 2017. How significant is this announcement from Instagram?
Sputnik: Are big companies like Instagram and Facebook really to blame for the number of young suicides on their services?
Jo Robinson: It's not simple, no. As a parent myself, my heart goes out to any parent who's lost a child to suicide — it really does — but suicide and self-harm are terribly complex behaviours. I think it's probably a bit unfair to blame one company or one type of platform. The pathways that lead young people to take their own lives are terribly complex, and I certainly think that social media platforms can be part of the problem for many young people who may already be vulnerable and distressed, but I don't think they're the only part of the problem.
Certainly some of the young people that we've spoken to who use social media platforms find them part of the solution as well, because on there they can find a sense of community, or they can find a way of expressing themselves that they might not have felt able to do otherwise. So I would say that these behaviours are really, really complex. It's terribly tragic when a young person takes their own life. But I think we also need to look at the factors that lead somebody to feel that way in the first place, and at some of the attitudes that we have as a community and as a service sector towards suicide and self-harm that make young people feel unable to express themselves or seek help from professionals.
Sputnik: Instagram has very specific filters on its services. For example, it won't allow hardcore violence or nudity — in particular, female nudity, which resulted in condemnation from certain feminist groups. Why do you think it's taken so long for Instagram to ban images of self-harm compared to nude content?
Jo Robinson: First of all, I would say that Facebook and Instagram are owned by the same company, and Facebook has made the same announcement that Instagram made today, so they've changed their policies in exactly the same way. I think it's a very complex behaviour, and I think the challenge that these platforms have had over the last few years is that simply removing people's content can actually shut down conversations that young people might not be able to have in any other way.
Simply removing images that are potentially distressing for some people doesn't address the underlying issue. When young people post images of self-harm, and content around suicide and self-harm, they don't set out to distress others; they don't set out to cause harm or to embarrass others. It's an inadvertent consequence, I think — or it can be an unintended consequence — of that imagery: the impact it might have on other vulnerable young people. So one of the challenges for these platforms is that by shutting those conversations down or taking the images away, what they can do is further compound the sense of shame, stigma and isolation that those young people might have felt, which led them to communicate in this way in the first place.
So I think it's been a complicated process, and I very much welcome the decision that they've made. I think it's the right thing for them to do. We know that graphic images of self-harm can lead to instances of contagion, or copycat events and those sorts of things. So I very much welcome the move, but I think it has been a complicated process, and it's good to see that they've put thought into it, and that when they do remove content, what they plan to do is respond to those young people directly, explain to them why the content has been removed, and reach out to them with some sources of help.
The views expressed in this article are those of the speaker and do not necessarily reflect those of Sputnik.