
'The Lifeguard': How a Woman is Trying to Prevent Suicide and Self-Harm on Instagram

Instagram banned graphic suicide and self-harm content in February this year, following outrage over the death of 14-year-old British teenager Molly Russell in 2017.

News18.com

Updated: November 8, 2019, 2:08 PM IST
Image credit: Instagram/Ingebjørg Blindheim

Trigger warning: this story is about suicide on Instagram.

Despite Instagram's attempts to control content related to suicide and self-harm on its image- and video-sharing platform, "dark" Instagram accounts still thrive in the nooks and niches of the social media site.

And while these accounts are generally very private and exclusive about who gets to follow them and share their content, many of them talk directly about suicide and other mental health issues, and some even announce suicides and display self-harm.

But while Instagram tries to "crack down" on such posts with stringent bans, following instances of Instagram-related suicide in various countries, a 22-year-old woman from Norway is trying to offer these accounts a better alternative: a sympathetic ear.

Dubbed "the Lifeguard" by the internet, Ingebjørg Blindheim trawls the so called "dark" Instagram accounts so that she could lend support and help to those on the brink of suicide. Speaking to the BBC, Blindheim said that this wasn't a job she had always dreamed to have but once she found such content, she could not turn away from the (often disguised) cries of help suicidal users or those with mental health issues often left on social media.

Blindheim has no professional training in dealing with mental health patients, nor is she paid by Instagram (or anyone else) for her services. Yet she consistently keeps track of about 450 Instagram accounts, according to the BBC report, and notifies the police or health authorities in case of a suicide or a threat of suicide.

However, Blindheim admitted that these groups are far from "safe spaces" for those suffering from mental health problems, often acting instead as triggers that exacerbate the situation. Most such accounts, while offering some support to victims or potential victims, also provide tonnes of "how-to" ideas and suggestions.

Despite the ban on such content, a simple search on Instagram for the hashtag #suicide revealed 8.4 million posts relating to suicide and self-harm, many of them containing graphic content. As per Blindheim, the more intense a video, the more likes it gets, creating a toxic competition among users to outdo the last attempt at suicide or grave self-harm for more likes.

Instagram banned graphic suicide and self-harm content in February this year, following outrage over the death of 14-year-old British teenager Molly Russell in 2017.

Russell's death, which sparked international outrage against the social media platform, was linked to the graphic suicide and self-harm content that she was posting and apparently consuming on Instagram. Many critics said that social media platforms such as Instagram and its parent company Facebook had not done enough to tackle the problem of self-harm and suicide on social media.

Despite the ban, however, a 16-year-old Malaysian girl took her life in May this year after running an Instagram poll in which she asked her followers to vote on whether she should live or die. Sixty-nine percent voted for the latter.

The incident caused intense debate about the ability of Instagram and other influential platforms to weed out content that can trigger suicides or mental health problems. Many articles and op-eds were written about how to respond to finding a suicidal post on social media.

Instagram actively encourages users to reach out to the platform and report suicide-related content. If a post suggests a potential suicide, the platform urges the user to contact emergency and medical personnel; it also offers the at-risk person additional support, such as contacting their family (but only if they want it) and providing other resources they may need.

But can users alone be tasked with monitoring 8.4 million suicide posts? To widen the ban's reach, Instagram recently extended it to cover cartoons and graphic imagery that could in any way suggest suicide or self-harm.

"We have expanded our policies to prohibit more types of self-harm and suicide content," Adam Mosseri, Head of Instagram, wrote in a blog post in late October "Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like 'Explore'," he added.
