- Instagram’s algorithm is more likely to place topless pictures higher in users’ news feeds than other photos, according to researchers at Algorithm Watch.
- The German nonprofit monitored posts from 37 creators as they appeared in the feeds of 26 volunteers.
- Posts that contained semi-naked women were 54% more likely to appear in the volunteers’ news feeds than other photographs, and topless posts from men were 28% more likely to appear.
- Facebook said the data was flawed because it drew only from a small data set.
Instagram appears to favour pictures of topless creators and bumps those higher on user feeds, a new report from Algorithm Watch has found.
Algorithm Watch, a research and advocacy organisation focused on algorithmic decision-making, collected data from 26 volunteer Instagram users to see which pictures from a select group of 37 creators were placed highest in their main feeds.
The data was collected using a browser add-on that automatically opened the Instagram homepage at regular intervals and analysed the photos shown there. In all, 2,400 photos posted between February and May 2020 were analysed.
Over that period, 21% of photos from the selected creator accounts contained a shirtless person (women in underwear or swimwear, or bare-chested men). In the volunteers’ feeds, however, these pictures made up 30% of all posts from those accounts.
“Posts that contained pictures of women in undergarment or bikini were 54% more likely to appear in the newsfeed of our volunteers. Posts containing pictures of bare-chested men were 28% more likely to be shown,” the report said.
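The headline percentages come from the report’s own statistical model, but the basic notion of over-representation can be illustrated from the aggregate figures above: semi-nude pictures made up 21% of what the creators posted but 30% of what volunteers were shown. A minimal sketch, using only those two shares (the report’s 54% and 28% figures come from its own per-category analysis, which these numbers are not meant to reproduce):

```python
# Rough illustration of over-representation: the share of semi-nude posts
# actually shown in volunteers' feeds vs the share the creators posted.
posted_share = 0.21  # semi-nude posts among everything the creators posted
shown_share = 0.30   # semi-nude posts among what volunteers were shown

over_representation = shown_share / posted_share
print(f"Shown at {over_representation:.2f}x their posted share, "
      f"i.e. {(over_representation - 1) * 100:.0f}% more often than chance")
```

This back-of-the-envelope ratio shows the direction of the skew; the report’s larger percentages reflect its own modelling choices, which are not public in full.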
We don’t know for sure what’s causing the skew toward topless pics
The report also said that without a full audit of Instagram’s algorithm – something Facebook is unlikely ever to allow, as the algorithm is proprietary – it’s impossible to draw definitive conclusions about what’s causing the skew toward topless pictures.
One explanation could simply be that other users engage with these pictures more, prompting the algorithm to place them higher: the more a photo is “liked,” the higher it can appear in other people’s feeds.
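That engagement-driven explanation amounts to ranking posts by a popularity signal alone. A minimal sketch of such a ranker, assuming a hypothetical `likes` count as the only signal (real feed ranking uses many more, and Instagram’s actual formula is not public):

```python
from dataclasses import dataclass

@dataclass
class Post:
    creator: str
    caption: str
    likes: int  # hypothetical engagement signal

def rank_feed(posts):
    """Order posts by engagement alone: the most-liked appear first,
    regardless of what the pictures contain."""
    return sorted(posts, key=lambda p: p.likes, reverse=True)

feed = rank_feed([
    Post("a", "landscape", likes=120),
    Post("b", "beach photo", likes=450),
    Post("c", "food", likes=80),
])
print([p.caption for p in feed])  # ['beach photo', 'landscape', 'food']
```

Under such a scheme, content that attracts more likes rises for everyone, without the system ever needing to “see” what is in the picture.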
The report’s authors also refer to a 2015 patent from Facebook, Instagram’s parent company, which describes a system for analysing the contents of a picture to determine whether it might be a good candidate for high engagement. This patent specifically said it would be able to identify people’s “state of undress” in photos.
It is impossible to say whether the algorithm is simply picking up on the correlation between likes and quasi-nude pictures or whether it has been given the ability to identify seminudity in photos. Facebook has algorithms for spotting nudity, as Instagram bans full nudity and uses such algorithms to take down posts in violation of this rule, but whether these are deployed to rank posts in people’s feeds is not certain.
“We cannot prove causation; there is no amount of data that could prove that,” Nicolas Kayser-Bril, one of the paper’s authors, said. “The only way to know exactly how this works would be to have an audit of exactly what’s going on inside Facebook’s servers, and there is no way we will be able to do that in the current legal environment.”
In a tweet, Kayser-Bril suggested that the algorithm could be picking up on the behaviour of a minority of Instagram users who use the platform as a source of “soft porn.”
…and their behavior is probably picked up by ML systems, amplified, and pictures of nudity are pushed for all users, in a vicious cycle. Just like search engines spread far-right conspiracies by suggesting them to millions of users after a few thousands searched for them.
— Nicolas Kayser-Bril (@nicolaskb) June 15, 2020
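The amplification loop Kayser-Bril describes can be sketched as a toy simulation: a small minority of users reliably engages with one category, the ranker shows whichever category has the higher observed engagement rate, and that category ends up dominating everyone’s feed. All numbers below are invented for illustration only:

```python
import random

random.seed(1)

# Toy feedback loop: a 10% minority reliably engages with category "B";
# everyone engages with anything else only 5% of the time. The ranker
# shows whichever category has the higher observed engagement rate.
stats = {"A": [1, 1], "B": [1, 1]}  # [engagements, impressions]

def rate(cat):
    engagements, impressions = stats[cat]
    return engagements / impressions

for step in range(5000):
    # a little exploration, otherwise exploit the best-rated category
    if random.random() < 0.05:
        shown = random.choice(["A", "B"])
    else:
        shown = max(stats, key=rate)
    minority_user = random.random() < 0.10
    engaged = (shown == "B" and minority_user) or random.random() < 0.05
    stats[shown][1] += 1
    if engaged:
        stats[shown][0] += 1

# the minority's preference ends up ranked first for everyone
print("dominant category:", max(stats, key=rate))
```

The point of the sketch is that no one designed the system to prefer “B”; a minority’s behaviour is enough to tip an engagement-optimising ranker into showing it to all users.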
Responding to the authors of the report, Facebook dismissed the idea that Instagram’s algorithm systematically promotes seminude images.
“This research is flawed in a number of ways and shows a misunderstanding of how Instagram works,” a spokesperson told Algorithm Watch. “We rank posts in your feed based on content and accounts you have shown an interest in, not on arbitrary factors like the presence of swimwear.”
A Facebook spokeswoman told Business Insider the company thought the sample sizes used by Algorithm Watch were too small.
But whether the algorithm can spot skin in pictures or it’s just picking up on a correlation, the effect on people using Instagram to make a living is the same.
“Almost every single content creator we spoke to confirmed they had to post almost naked pictures if they wanted to have any reach,” Kayser-Bril said.
Researchers say Facebook and Instagram are too opaque
For Algorithm Watch, the main issue isn’t whether Facebook is intentionally boosting topless pictures; it’s about how difficult it is to find out how or why it’s happening.
“In the advertising industry, if we were to investigate it, we could find sources who could describe what is going on, who could explain how decisions are taken. Whereas here in this case all we have is our data, which is as we admitted very small, and then there is Facebook saying ‘No that’s not how it’s happening,'” Kayser-Bril said.
Sandra Wachter, an expert in algorithmic fairness at the Oxford Internet Institute, told Business Insider that a lack of intentionality actually makes algorithms more dangerous than human decision-makers, as many discrimination laws require intent to be applicable.
“You can just let algorithms run wild and they will start to discriminate without any intention; it’s just something that happens if you don’t take an active role in designing and deploying and supervising the algorithm in a responsible way,” Wachter said.
Wachter added that the obscurity of these algorithms’ decision-making processes, which makes them hard to scrutinise, compounds the problem.
She thinks that professionals who use Instagram to promote their business would have legitimate cause to worry about the platform systematically favouring seminude images.
“Knowing that you’re up against that and being dressed a certain way will increase visibility is something that might actually be a problem because that’s really not what you want a platform to do,” Wachter said.
Getting Facebook to investigate and tweak the algorithm wouldn’t be without its pitfalls, either, and Wachter says the platform would have to be extremely careful about any unforeseen knock-on effects – an overarching policy on reducing the visibility of semi-clothed images could unfairly disadvantage users marketing swimwear or underwear, for example.
“I don’t think there’s an easy fix to that problem. It actually requires stepping back and thinking about the possible consequences or the possible policy interventions you could do, and what that actually would mean in practice,” she said.