According to an internal report obtained by The Verge, a group of Facebook engineers identified a software bug that went unfixed for months and affected “as much as half of all News Feed views”. The bug, first discovered in October of last year, was described by employees as a “massive ranking failure”.
According to the internal report, the engineers first noticed the problem last October, when a sudden surge of misinformation began flowing through the News Feed. Content that had been marked as questionable by Facebook’s third-party fact-checking programme, instead of being suppressed, was being favoured by the algorithm and distributed across the platform’s news feeds.
In addition to posts flagged by fact-checkers, Facebook’s systems also failed to properly demote probable nudity, violence, and even Russian state media that the social network had recently pledged to stop recommending in response to the country’s invasion of Ukraine. Views of such posts spiked by as much as 30 percent globally.
However, the engineers were unable to find the root cause of the bug. All they could do was watch the surge of misinformation and harmful content subside, only to flare up again, until the issue was finally fixed on 11 March 2022.
According to Meta spokesperson Joe Osborne, the company “traced the root cause to a software bug and applied needed fixes”. He added that the bug “has not had any meaningful, long-term impact on our metrics”. According to the internal documents, the technical issue was first introduced in 2019 but “didn’t create a noticeable impact until October 2021”.
“…the overwhelming majority of posts in Feed are not eligible to be down-ranked in the first place,” explained Osborne.
Content on Facebook rated “false” by fact-checkers is “downgraded” in news feeds so that fewer people see it. Anyone who tries to share such a post is shown an article explaining why it is misleading. The company has also published a list of the kinds of posts it demotes, but it does not explain exactly how those demotions are applied. Facebook remarks that it “hope[s] to shed more light” on how demotions work, but has concerns that doing so would “help adversaries game the system”.
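Facebook has never published how its demotions are actually computed, but the mechanism described above can be illustrated with a minimal, purely hypothetical sketch: each integrity label attached to a post shrinks its ranking score by some multiplier, so labelled posts sink in the feed. All names and multiplier values here are invented for illustration; they are not Facebook’s real system.

```python
from dataclasses import dataclass, field

# Hypothetical demotion multipliers -- Facebook does not disclose real values.
DEMOTION_MULTIPLIERS = {
    "fact_checked_false": 0.2,  # assumed: flagged misinformation is heavily down-ranked
    "probable_nudity": 0.5,     # assumed
    "probable_violence": 0.5,   # assumed
}

@dataclass
class Post:
    post_id: str
    base_score: float                 # relevance score from upstream ranking
    labels: list = field(default_factory=list)  # integrity labels from classifiers

def demoted_score(post: Post) -> float:
    """Apply every matching demotion multiplier to the post's base score."""
    score = post.base_score
    for label in post.labels:
        score *= DEMOTION_MULTIPLIERS.get(label, 1.0)
    return score

def rank_feed(posts: list) -> list:
    """Order posts by demoted score, highest first. A bug that skipped
    demoted_score() would reproduce the failure described in the report:
    flagged posts would keep their full base score and rank normally."""
    return sorted(posts, key=demoted_score, reverse=True)
```

In this toy model, a flagged post with a higher raw score can still rank below a clean post once its multipliers are applied, which is exactly the effect that stopped working during the bug.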
“In a large complex system like this, bugs are inevitable and understandable… But what happens when a powerful social platform has one of these accidental faults? How would we even know? We need real transparency to build a sustainable system of accountability, so we can help them catch these problems quickly,” said Sahar Massachi, a former member of Facebook’s Civic Integrity team.
While there may not have been malicious intent behind this ranking bug, it doesn’t bode well for Facebook’s already deteriorating image. Recently, it was also reported that Meta paid a consulting firm to run a campaign to turn the public against TikTok.