A week ago, Instagram found itself at the center of controversy and drew the ire of its users when a model accused the social media platform of repeatedly removing photos in which she embraces herself, her arms covering her bare chest.
The British model, Nyome Nicholas-Williams, had posted a topless photo of herself in which she hugs herself with her arms crossed over her chest. Nyome then launched a social media campaign accusing Instagram of censoring Black plus-sized bodies. According to her, numerous "white and skinny women" post similar photos every day, but a "black woman celebrating her body is banned."
In her caption, Nyome wrote: "Who knew these images @alex_cameron captured of me would start such a movement, I won't call them a problem as they are far from it. They have however opened up a much bigger conversation that must be had regardless of discomfort, and it is even more of an issue now as @mosseri pledged to amplify black voices back in June when speaking to Cosmo about the shadow banning 'accusations'... As we can see nothing about that pledge has come to fruition... if anything, it has gotten worse. This is only the beginning. @instagram has a lot to answer for."
Soon, the hashtag #IWantToSeeNyome began trending, and many accused Instagram of 'fatphobia' as well.
The photo, at first glance, does not appear to violate Instagram's policies on nudity or inappropriate content. This raised a question: many users post partially nude content on Instagram, so why are only some posts removed? The 'Explore' section on Instagram is filled with photos of men posing shirtless, top influencers posing in lingerie and so on - none of which are removed. If those photos deserve to remain on Instagram, why the different rules for users like Nyome?
There is no doubt that Instagram's policies, especially those concerning women's bodies, are enforced wildly inconsistently.
In 2018, another plus-size model, Katana Fatale, was infuriated to find that photos of her posing in a "fatkini" were repeatedly removed by Instagram. She later posted a collage of one of her photos alongside a similar photo of celebrity Kim Kardashian to show that the latter's photo had been permitted while hers had been removed.
An article by BuzzFeed reported that many Instagram influencers feel the platform is more likely to remove photos of women with larger bodies. And based on current trends, they may be right.
Now here's the thing. According to Instagram's community guidelines, photos must not display nudity or inappropriate content - genitals, nude buttocks, sexual acts and female nipples. Anything else is permissible.
Neither Katana's photos nor Nyome's posts violated any community guidelines. Then what got their posts flagged?
Author Stephanie Yeboah decided to get to the bottom of it and reached out to Instagram for a one-on-one. She shared her interaction on Twitter.
During the call, Stephanie brought up numerous instances where Instagram had flagged photos of plus-sized women as inappropriate and removed them. She cited Nyome as an example too. She wrote on Twitter that Instagram had acknowledged the problem and had taken action as well.
Following the social media outrage sparked by Nyome's campaign, Instagram has reportedly decided to update its semi-nudity policy. The company apparently explained to Stephanie that it uses a combination of AI and manual reviewers, and that photos showing people cupping their breasts get flagged because the pose is associated with pornography.
At this point, Stephanie asked Instagram why bigger breasts were treated as problematic when smaller breasts were not. Instagram admitted its shortcomings and said it was updating its policy as soon as possible and would retrain its reviewers. Instagram also reportedly said that all photos which had been unfairly removed would be reinstated.
While Instagram's prompt response is appreciated, this problem has been pointed out a number of times before. In 2019, a member of the Instagram team reportedly confirmed that their algorithm detects and flags photos showing more than 60% skin. But an algorithm only applies the rule it is given: while the policy was intended to curb nude photos, a fixed skin-percentage threshold inevitably hits users with bigger bodies harder, since a larger body fills more of the frame in an otherwise identical photo. And that is exactly where the social media platform's manual reviewers should have intervened.
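To see why a blunt skin-percentage cutoff is biased, consider a toy sketch of such a rule. This is purely illustrative and assumes nothing about Instagram's actual system: the per-pixel skin test, the image setup and the 60% threshold here are hypothetical stand-ins. Two images covering the same body parts get opposite verdicts simply because the larger body occupies more of the frame.

```python
# Toy illustration of a crude "flag if > 60% skin" rule.
# NOT Instagram's actual code: the skin test and images are hypothetical.
import numpy as np

SKIN_RATIO_THRESHOLD = 0.60  # the reported 60% cutoff

def skin_mask(rgb):
    """Very rough per-pixel skin test on an (H, W, 3) uint8 image."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def should_flag(rgb):
    """Flag the image when more than 60% of its pixels look like skin."""
    return skin_mask(rgb).mean() > SKIN_RATIO_THRESHOLD

def person_in_frame(size):
    """A 100x100 dark frame with a skin-toned square of the given size."""
    img = np.zeros((100, 100, 3), dtype=np.uint8)
    img[:size, :size] = [200, 150, 120]
    return img

# Identical pose, identical coverage - only the body's share of the
# frame differs. The smaller body stays under the threshold (49% skin);
# the larger one crosses it (72% skin) and gets flagged.
smaller = person_in_frame(70)
larger = person_in_frame(85)
print(should_flag(smaller))  # False
print(should_flag(larger))   # True
```

The point of the sketch is that the bias needs no malicious intent: a single global threshold encodes an assumption about how much of a frame a body "should" fill, and any body that exceeds it is penalized regardless of what the photo actually shows.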
A few months ago, while the #BlackLivesMatter movement was at its peak, Instagram CEO Adam Mosseri had posted that he and the platform stand in solidarity with the Black community. Asserting that words aren't enough, he had also said that the platform would incorporate feedback about the censorship of Black individuals into its policies, and that algorithmic bias in particular would be a focus.
Cut to two months later, and posts by influencers like Nyome were still being removed and banned.
Instagram's biased policies aren't new; they've been around for almost as long as the platform has existed. According to Stephanie's tweets, Instagram is planning to turn over a new leaf and mend its ways. Hopefully, the new policy changes and more stringent guidelines for manual reviewers will prevent such mishaps in the future.
But only time will tell.