
Here Are The Biggest ‘Facebook Papers’ Charges: Zuckerberg Caves To Communist Government And Lets Celebrities Break The Rules, More

Updated Apr 21, 2022, 08:12am EDT

Topline

Facebook repeatedly lets celebrities and politicians evade its rules, and CEO Mark Zuckerberg gave in to the Vietnamese government's demands to silence anti-government posts on the platform, according to a trove of internal documents, provided to the Securities and Exchange Commission by a company whistleblower, that now forms the basis for an ongoing "Facebook Papers" series of exposés running in a number of major news organizations.

Key Facts

Despite CEO Mark Zuckerberg telling employees to maintain "unimpeachable neutrality," the Financial Times reports that executives including Zuckerberg routinely "interfered" to let celebrities and politicians skirt the platform's rules over employees' protestations, and that the CEO allegedly personally intervened to reinstate a video that had been taken down for making false claims about abortion after Republican politicians complained (Politico reports Facebook's lobbyists wield similar influence).

Zuckerberg gave in to demands from Vietnam's communist government to censor posts by anti-government dissidents, the Washington Post reports, also citing anonymous sources; Facebook removed 2,200 posts between July and December 2020, compared with 834 in the prior six months.

Apple briefly threatened to remove Facebook and Instagram from its App Store over how the products were being used to "buy and sell" Filipina maids, outlets including CNN and the Associated Press report, and though Apple backed down after Facebook promised to "crack down" on the issue, the AP reports the crackdown has had a "limited effect," part of a broader problem the platform faces with human trafficking.

Facebook has failed to take comprehensive action against users who maintain multiple accounts, despite finding such accounts are a "massive source" of the platform's "toxic" political posts and are "purveyors of dangerous political activity," Politico reports.

Facebook documents show the company's market dominance, with 78% of all U.S. adults using Facebook and "nearly all U.S. teens" using Facebook-owned platforms, Politico reports, findings that could aid the Federal Trade Commission's antitrust lawsuit against Facebook by undercutting its claims that it faces heavy competition from rival companies.

The Post also reports Facebook internal documents show the company removed less than 5% of hate speech on the platform, despite Zuckerberg testifying to Congress last year that the company removes 94% of it, and that the CEO rejected an idea for a Spanish-language voting information center for the election because he believed it wouldn't be "politically neutral."

Facebook employees raised alarms in the spring about the platform’s poor ability to moderate anti-vaccine content—its detection of “vaccine-hesitant comments is bad in English, and basically non-existent elsewhere,” a memo noted—which the company took months to address, the Verge reports.

Facebook groups countries into "tiers" that determine the resources the company allocates to each nation's elections, the Verge reports, and it provides no assistance to countries in the lowest tier, which includes every country except the 30 assigned to higher tiers, unless specific election-related content is reported for moderation.

Facebook has researched "core features" of the platform like the "like" and "share" buttons and found those features "had let misinformation and hate speech flourish on the site," the New York Times reports, but executives blocked modifications to them so as not to stifle growth and to "[keep] users engaged," part of a broader pattern, the Post reports, in which the company "abandoned or delayed" moves that could have reduced "misinformation and radicalization."

Though Facebook has conducted thorough research documenting its drop in popularity among young people, Bloomberg reports the company has "misrepresented" that decline to investors by omitting information about declines among certain demographics and focusing only on overall growth, even as the Verge reports the company views teens' declining use as an "existential" threat.

Facebook often prioritizes "political considerations" in decision-making to avoid appearing biased, and gives high-performing right-wing publishers "special treatment" that lets them evade punishment for misinformation, the Wall Street Journal reports, with one Facebook staffer noting in a memo that the company makes "special exceptions" for conservative news site Breitbart and "even explicitly endorse[s] them" by including the site in Facebook's News Tab.

Facebook removed "safeguards" against the spread of election misinformation after Election Day and ahead of the January 6 attack, and had a "tepid" response once the violence started that employees criticized as insufficient, according to documents reported by outlets including Bloomberg, CNN, the Associated Press, the Washington Post, the Journal and the New York Times.

The Journal and Post report Facebook conducted extensive internal research that offered recommendations for how the platform could stop the spread of extremist content, but "in many instances, executives had declined to implement those steps," and NBC News reports the steps Facebook did take to ban QAnon and other conspiracy groups were criticized by internal researchers as "piecemeal" and as failing to effectively stop the movement's "meteoric growth."

The Journal reports Facebook takes a "whack-a-mole" approach to banning extremist movements, conducting "surgical strikes" on individual entities it believes to be dangerous rather than taking a "more systematic approach," which officials believed would stifle Facebook's growth.

Hate speech and misinformation have flourished and often gone unchecked on Facebook in India, the company's largest market, particularly anti-Muslim rhetoric and incitements to violence, despite internal research demonstrating the extent of the problem, the Post, Times, Bloomberg, Journal and AP all reported.

Facebook has been unable to police much of the content in India because it cannot effectively moderate and fact-check posts in the country's 22 official languages, including Hindi and Bengali, which the Post notes are the fourth- and seventh-most-spoken languages globally, respectively; the AP and Wired report the platform has similar problems with Arabic and with moderating content in the Middle East.

Two Hindu nationalist groups that have not been banned from Facebook despite spreading anti-Muslim content or incitements to violence have ties to Indian Prime Minister Narendra Modi and his political party, the Journal notes, and one group wasn't removed "given political sensitivities," according to an internal document.

Big Number

87%. That's the share of its misinformation-fighting resources Facebook dedicates to the U.S., leaving only 13% for the rest of the world, according to a document cited by the Times. The Post also reports Facebook dedicates 84% of its "global remit/language coverage" to the U.S. versus other countries. (Facebook disputed those figures to the Times, saying they don't account for third-party fact-checkers, many of whom are abroad.) In addition to India, documents cited by the two outlets also found the platform has had issues adequately policing content in Myanmar, Sri Lanka, Ethiopia, Pakistan and Indonesia.

Crucial Quote

“I’m struggling to match my values with my employment here,” an employee wrote on Facebook’s internal message board on January 6, as quoted by the Post and Bloomberg. “I came here hoping to affect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.”

Chief Critic

Facebook has broadly pushed back against the news outlets’ reporting and defended the company’s efforts against misinformation and extremism. “At the heart of these stories is a premise which is false,” Facebook spokesperson Joe Osborne said in a statement to the Financial Times. “Yes we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well being misunderstands where our own commercial interests lie. The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

What To Watch For

More stories. Facebook VP of global affairs Nick Clegg told employees Saturday that workers "need to steel ourselves for more bad headlines in the coming days," according to an internal post reported by Axios, and the Verge, one of the outlets that had access to the documents, said Monday to expect more stories "over the coming weeks."

Key Background

Facebook has long faced criticism for its alleged failure to stop misinformation and hate speech on its platform, but those allegations have ramped up in recent weeks after whistleblower Frances Haugen spoke out on 60 Minutes and testified to Congress about the company’s alleged misdeeds. The former member of Facebook’s Civic Integrity team told Congress the company has “put their astronomical profits before people” and urged lawmakers to take action against Facebook, which Haugen claimed on 60 Minutes is “substantially worse” than what she had seen at other social networking companies. The internal documents that Haugen collected and provided to the SEC, which her lawyers then gave to the news outlets, were first reported by the Journal.

Tangent

In addition to its pieces on India and the January 6 riot, the Journal has also used Haugen's documents to publish stories on Facebook's reported policy exempting "VIPs" from its rules; the company's awareness of Instagram's "toxic" effect, particularly on teen girls; how a 2018 algorithm change made the platform and its users "angrier"; the company's "weak" response to posts from drug cartels and human traffickers; its failure to police anti-vaccination content; its plans to attract preteens to its platforms; how the company's employee lists have changed; employee doubts about how effective Facebook's use of artificial intelligence could be; and the company's struggle to detect users with multiple accounts on its platform.

Further Reading

‘This is NOT normal’: Facebook employees vent their anguish (Politico)

A whistleblower’s power: Key takeaways from the Facebook Papers (Washington Post)

Inside Facebook, Jan. 6 violence fueled anger, regret over missed warning signs (Washington Post)

Internal Alarm, Public Shrugs: Facebook’s Employees Dissect Its Election Role (New York Times)

Facebook Whistle-Blower Says Company Dropped Guards Against Political Misinformation In ‘60 Minutes’ Interview (Forbes)

Facebook ‘Puts Astronomical Profits Over People,’ Whistle-Blower Tells Congress (Forbes)
