In an eye-opening 60 Minutes interview, former Facebook product manager Frances Haugen revealed that the Facebook algorithm can easily be used to spread hate, violence and misinformation. She also claims that the scale of the problem is much worse than the company lets on.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimise for its own interests, like making more money,” said Haugen.
A month ago, Haugen also filed at least eight complaints with the Securities and Exchange Commission alleging that Facebook is hiding research about its shortcomings from investors and the public. She had previously released pages of internal research and documents as an anonymous whistleblower. These included findings suggesting that Instagram makes body image issues worse for teen girls, among others about how the platform can affect younger users.
“I’ve seen a bunch of social networks, and it was substantially worse at Facebook than anything I’ve seen before… At some point in 2021, I realized I’m going to have to do this in a systemic way, that I’m going to have to get out enough [documents] that no one can question that this is real,” continued Haugen.
Recruited to Facebook in 2019 to “work on addressing misinformation”, Haugen says that Facebook’s algorithm decides what to show you based on what you’ve interacted with before. One consequence of that is “optimizing for content that gets engagement, or a reaction”.
“Content that is hateful, that is divisive, and polarising… It’s easier to inspire people to anger than it is to other emotions… Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money,” she continued.
According to Haugen, Facebook switched on safety systems to reduce misinformation during the 2020 U.S. election—but as soon as the election was over “they turned them back off” to “prioritise growth over safety”. She suggested that the decision consequently allowed Facebook to be used to help organise the January 6 riot on Capitol Hill.
“Facebook over and over again chooses profit over safety… It is paying for its profits with our safety. I’m hoping that this will have a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place,” said Haugen.
Facebook, however, aggressively pushed back against the reports—calling many of the claims “misleading” and arguing that its apps do more good than harm. In a statement to CNN following the 60 Minutes interview, Facebook spokesperson Lena Pietsch said their teams have to “balance protecting the ability of billions of people to express themselves openly” with the need to keep their platform “a safe and positive place”.
“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true,” said Pietsch.
Around the same time, Facebook and its other platforms—Facebook Messenger, Instagram and WhatsApp—faced a worldwide service disruption for several hours. A few sites, and a few TikTok users, drew a possible connection between Facebook’s platforms shutting down and the whistleblower interview. But what do you think?