Facebook has long marketed itself as a social media platform that helps people form more "meaningful connections." The company, however, has recently come under scrutiny after a consortium of U.S. news organizations collectively published reporting on the Facebook Papers.
The Facebook Papers paint a very different picture from the one painted by Facebook CEO Mark Zuckerberg: the documents show a company that, when faced with a choice between public safety and its bottom line, chooses the bottom line every time.
The Facebook Papers are a collection of more than 10,000 documents downloaded by Frances Haugen, a former Facebook product manager turned whistleblower, and submitted to the U.S. Securities and Exchange Commission. The redacted papers were also reviewed by American news organizations, with journalists from across the nation collaborating to gain access to the documents.
The papers, a mix of presentations, research studies, discussion threads, and strategy memos, provide an inside view of how Facebook weighs trade-offs between public safety and its bottom line. That calculus has remained constant even after the company's own research showed that its platform exacerbates real-world harms.
Polarization in Ethiopia.
Earlier this year, when Zuckerberg was confronted with claims that his platform polarizes communities into violence, he told the U.S. Congress, "Some people say that the problem is that social networks are polarizing us, but that's not clear from the evidence or research."
His response, however, is at odds with what the Facebook Papers show. Facebook employees repeatedly flagged the company's failure to prevent the sharing of volatile posts inciting violence in "at-risk" countries like Ethiopia. While Facebook ranked Ethiopia among the countries at highest risk of conflict, the social media giant's moderation efforts were not nearly enough to match the platform's influx of inflammatory content.
Additional papers include a Coordinated Social Harm report showing that armed groups in Ethiopia were using the platform to incite violence against ethnic minorities. Fano, a group flagged in the report, had created a cluster of accounts, some based in Sudan, that incited violence and promoted armed conflict, recruitment, and fundraising. Though the Facebook team recommended that the Fano-related network be taken down, the report suggested that other bad actors were slipping through the cracks. What stood out most was the warning that "Current mitigation strategies are not enough."
Human Trafficking on Facebook.
Yet another set of documents shows that Facebook was well aware, as early as 2018, that human traffickers were using its platforms to trade domestic workers. Facebook had tried for years to crack down on content related to domestic servitude, but it failed, and matters came to a head in 2019, with Apple (AAPL) threatening to pull Facebook from its App Store.
If Apple were to remove Facebook from its App Store, the revenue loss would be significant. As such, Facebook employees rushed to implement emergency policy changes and take down content they deemed problematic, to ward off any loss of revenue. Even after that, CNN was able to find Instagram accounts that purported to have domestic workers for sale.
Facebook’s Double Standards.
Another significant concern raised by the papers is the double standard practiced within the company. Facebook applies two different sets of content standards: one for high-profile accounts and another for ordinary people like you and me. During Trump’s presidency, he made many inflammatory posts, of which only a handful were ever removed. Zuckerberg has, in the past, defended the decision not to remove high-profile content by saying that people had the right to know what their leaders think.
The documents, however, show that there is indeed a codified VIP system called XCheck, created to prevent a public relations fallout from moderating celebrities and high-profile users.
Language Shortcomings.
Facebook has 2.8 billion global users, and the papers reveal that the company is struggling to deal with the high number of issues stemming from language complexities. Facebook under-invests in content safety systems for non-English languages. The documents go further, showing that the company has failed to scale up staff or add local-language support to protect people outside America. Hours before the papers’ release, Zuckerberg was again blamed for allowing hate speech due to language shortcomings and for bending to state censors in Vietnam.
Despite the release of these damning documents, Facebook went ahead and announced more than $9 billion in quarterly profits on Monday, mere hours after the Facebook Papers were released. Earnings for the last quarter grew by 17 percent, with users also increasing to 2.91 billion.
Addressing the papers during the earnings call, Zuckerberg said, "Good faith criticism helps us get better, but my view is that what we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company."