Facebook is going through one of its most delicate moments after weeks of leaks: everything that has come to light so far
The “Facebook Files” are causing a crisis of confidence in the social network. The company’s shares have suffered, and Mark Zuckerberg’s company has had to testify before the US Congress to give more details about the leaked internal documents.
The publication of these internal investigations began on September 13, when the Wall Street Journal published the first part of the information it had received from a (now former) high-ranking Facebook employee.
Here is everything that has come to light so far about Facebook and Instagram: what these internal documents say and what they have revealed about how the company works and its impact on users.
The two faces of Facebook: its internal research and what it says in public
One of the first pieces of information to emerge was an internal Facebook study finding that “32% of girls say that when they feel bad about their body, Instagram makes them feel worse.” This negative side of Instagram clearly contrasted with Facebook’s public message, which until then had been almost diametrically opposed to the report’s findings.
The report also showed Instagram’s impact among teens who reported having suicidal thoughts: 13% of British users and 6% of American users traced those thoughts to Instagram. In response, the Instagram team said it stands by the findings of that research, which it says demonstrates its “commitment to understanding complex and difficult issues.”
Beyond the gap between its public statements and its actions, the disclosure of these internal documents exposed another problem within Facebook: the company was routinely making exceptions for powerful people through an exclusive program known as XCheck. Although the platform has an established set of rules against disinformation, the internal documents reveal that those rules were not enforced equally for all users.
Looking the other way on polarizing content
In 2018, Jonah Peretti, CEO of BuzzFeed, sent an email to Facebook noting that an article titled “21 Things Almost All White People Are Guilty of Saying” had achieved extraordinarily high virality. That anecdote opens the third part of the “Facebook Files,” which shows that the platform found it was rewarding the most outrage-driven content, and that Zuckerberg was reluctant to apply the proposed fixes.
Behind the publication of these files is Frances Haugen, a former Facebook product manager. In an interview with 60 Minutes, this data scientist, who previously worked at Google and Pinterest, explained some of the inside details that have put the company in the eye of the storm.
“It’s substantially worse than anything I’ve seen before on other platforms,” says Haugen. The social network encourages “content that is angry, polarizing and divisive.” As Haugen explains, Facebook’s interest in making more money routinely clashed with what was good for the public.
These internal studies suggest, according to Haugen, that Facebook has been lying about significant progress against hate, violence and misinformation. According to one internal report, even after all the changes, hate on the platform would have dropped by only 3 to 5%.
“Facebook makes more money when more content is consumed. People get more involved with things that provoke an emotional reaction. And the more anger they are exposed to, the more they interact and the more they consume,” she explains.
The “weak response” to human and drug trafficking
Another chapter of the Facebook documents published so far by the Wall Street Journal describes a case in which employees warned that posts were circulating in developing countries about human trafficking, organ sales, pornography, violence against ethnic minorities and drugs. All of this content is clearly prohibited by the platform’s own rules, yet according to the internal documents, the company’s response was in many cases inadequate or nonexistent.
The company has local fact-checkers and partner organizations to keep users safe, but according to Haugen, the lack of local moderators in those markets allowed these posts to spread.
The problem goes back years. In 2019, the BBC published a documentary on how human traffickers used Facebook to sell victims under the guise of employment agencies. As a result of that investigation, Apple threatened to remove Facebook’s app from the App Store, as described by the WSJ. The internal documents reveal that Facebook was already aware of this problem before the BBC’s report and Apple’s threat.
The “Facebook Files” end at the point where Haugen left the company. She worked on the Civic Integrity team, fighting misinformation during the elections. After the elections, however, there was a turning point and the company decided to dissolve the team, according to the whistleblower. Two months later, with the election results still fresh, the assault on the Capitol took place, says the former Facebook product manager.
“Every day, our teams must balance protecting the right of billions of people to speak out with the need to keep our platform safe and positive. We continue to make significant improvements to address the spread of misinformation and harmful content. To suggest that we encourage bad content and do nothing is simply not true,” said Lena Pietsch, a Facebook spokeswoman, in response.
Image | Barefoot Communications