April 18, 2019

Social Media, Self-Harm & Suicide

By Unaisa Abdul Haqq Baker[1]

A 14-year-old girl, Molly Russell, took her own life in November 2017 after prolonged exposure to images of self-harm on Instagram. “The caring soul who died after exploring her depression on Instagram,” according to the Daily Telegraph, did not want to burden her parents. “Instead, the 14 year old retreated to a terrifying online world algorithmically tailored to encourage her darkest thoughts.”

She is not alone. Numerous young people are dying because they have been exposed to images of self-harm. Teenagers all around the world are being groomed to kill themselves, right under their parents’ noses. An example of this process is the Blue Whale Challenge, a social network phenomenon which started in 2016. It was known as a ‘game’ consisting of a series of tasks to be performed over the course of 50 days. The administrators of this game would instruct players to perform acts of self-harm of increasing seriousness, culminating in the final ‘challenge’ of taking their own lives.

This ‘game’ was dangerously influential because it was incremental. The boiling frog is the relevant metaphor: throw a frog into a pot of boiling water and it will jump straight out; but turn up the heat only gradually, and the frog will allow itself to be boiled alive. Likewise, young people on social media become acclimatised to images of self-harm until harming themselves eventually seems normal.

Inappropriate content which teenagers can easily access includes photographs of and poems about self-harm. Open an app such as Instagram, simply search the words ‘self harm’, and pages, hashtags and posts will flood your feed. Young people who are feeling depressed or simply lonely may end up following pages which promote suicide. There they are able to express their feelings of alienation, and they feel accepted and understood. This type of recognition may be a spark to the fire. Often there is no one there to pour cold water on an inflamed imagination, because these online communities are usually close-knit. It’s not hard to imagine a young person visiting these pages daily, speaking to other people there and gradually getting used to the idea that self-harm or even suicide is a normal response to teenage angst.

Suicide is now glamourised in shows such as 13 Reasons Why. This drama is about a girl, Hannah Baker, who takes her own life and leaves messages for each person who led her to do so. Instead of presenting suicide as it really is, an appalling travesty of human life and potential, the show makes it seem as normal as Hannah herself. What world do these programme makers live in, if they don’t realise how dangerous this is? It’s not enough to tack a hotline onto the end of the show; the damage may already have been done.

While much of the entertainment industry sends out all the wrong signals, Pinterest sets a commendable example. It pays close attention to what people are looking for on its app and does not allow harmful material to be searched. If the words ‘self harm’ are searched, a message appears: ‘If you’re in emotional distress or thinking about suicide, help is available.’

Unfortunately, the most popular social media platforms do not offer this type of assistance. Instagram, for example, seems to take no responsibility for what is searched. This is especially dangerous since the majority of its users are immature young people. Instagram also requires a post to be reported by a number of people before it is taken down, and even then it takes a while. Netflix knows that young children regularly access 18-rated shows, but it looks the other way. Snapchat allows any kind of picture to be sent to anyone. Nude photos are easily accessed on this app, as is a user’s location. Options to restrict this are available, but most users are unaware of them. What is Snapchat doing to raise the bar on its security? Apparently nothing.

Social media platforms need to speed up their reactions to inappropriate content, ban certain content from being searched or even posted, and ask for police support in drawing up and implementing serviceable security policies.

And if it takes new legislation to force their hand, so be it. Too many young people are losing life and limb because social media are inadequately controlled.[2]

 

[1] First published in Rising East Publications, 11th March 2019: https://risingeast.co.uk/social-media-self-harm-and-suicide/

[2] Permission was granted by the author to reproduce this article.
