Instagram Sensitivity Screens

Instagram is one of the most beloved platforms out there. Unfortunately, nothing in this world is perfect, and none of the social media sites we know are exceptions. Where are we going with this? Bear with us.

Social media platforms have millions upon millions of users, so it's hard to control the content every single person posts online. Twitter, for instance, is a site where we can interact with the people we admire.

However, the platform also hosts an unbelievable amount of explicit sexual content. What's more, pornography rarely gets removed from Twitter unless the account posting it is reported several times.

The same thing happens on both Facebook and Instagram. Users can stumble across posts that many people find deeply disturbing: animals being slaughtered, violence against a person, you name it. So, how have these big companies dealt with the problem?

Well, the answer is simple. Sensitivity screens! 


What Are Sensitivity Screens?

To put it briefly, sensitivity screens are a feature implemented by most social media platforms to keep users from unexpectedly viewing content that could be disturbing. If a post contains violence or sexually explicit material, the platform warns the user before showing it.

This way, users can decide whether or not to actually view the content. Initially, this feature was available on both Facebook and Twitter. Instagram has now joined the list and implemented the tool as well.
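For the technically curious, here is a minimal sketch, in TypeScript, of how this kind of warning could work behind the scenes: a post flagged as sensitive stays hidden behind a warning until the viewer explicitly opts in. The names used here (Post, renderPost, viewerOptedIn) are purely illustrative assumptions, not Instagram's actual code.

// Hypothetical sketch of the sensitivity-screen idea, not Instagram's real implementation.
// A post flagged as sensitive is hidden behind a warning until the viewer opts in.

interface Post {
  id: string;
  mediaUrl: string;
  isSensitive: boolean; // assumed to be set by moderation or automated classification
}

type View =
  | { kind: "media"; url: string }
  | { kind: "warning"; message: string };

function renderPost(post: Post, viewerOptedIn: boolean): View {
  // Non-sensitive posts, or posts the viewer has explicitly chosen to see, show directly.
  if (!post.isSensitive || viewerOptedIn) {
    return { kind: "media", url: post.mediaUrl };
  }
  // Otherwise the media stays hidden (blurred in the real UI) behind a warning.
  return {
    kind: "warning",
    message: "This post may contain sensitive content. Tap to view anyway.",
  };
}

// Example: the warning appears first; after the user taps "view anyway",
// the same post is rendered normally.
const post: Post = { id: "1", mediaUrl: "https://example.com/photo.jpg", isSensitive: true };
console.log(renderPost(post, false)); // warning screen
console.log(renderPost(post, true));  // the actual media

The key design point is that the content isn't deleted; it's simply gated behind an extra, deliberate step by the viewer.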


Why Did Instagram Implement It? 

But why didn't Instagram implement this feature earlier? The truth is that there was no real need to. Instagram remained one of the very few platforms with mostly "family-friendly" content.

Sure, there were certain posts that didn't belong on the platform, but not nearly as many as on Facebook or Twitter. Everything changed after a tragedy occurred. Back in 2017, 14-year-old Molly Russell took her own life.

It was later discovered that she had been viewing distressing material about depression and suicide on her Instagram account. The news went public, and Molly's parents blamed Instagram for "taking part" in their daughter's death.

Shortly after, Adam Mosseri, the head of Instagram, announced that the app would roll out the Sensitivity Screens feature. The new tool blurs images and videos of people hurting themselves, but users can still choose to see the content anyway.


Note

Instagram could simply remove all of the disturbing content, but the platform has opted not to. Why? Wouldn't that be easier? Well, it depends on how you look at it.

Mosseri has stated that Instagram doesn't want to remove the content outright because it doesn't want to isolate the people who post self-harm images, as the company considers many of these posts to be a cry for help.

So, what do you think? 
