But that started to change in 2015, as Trump’s candidacy picked up speed. In December of that year, he posted a video in which he said he wanted to ban all Muslims from entering the United States. The video went viral on Facebook and was an early indication of the tone of his candidacy.
Outrage over the video led to a companywide town hall, in which employees decried the video as hate speech, in violation of the company’s policies. And in meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, according to three former employees, who spoke on the condition of anonymity for fear of retribution. Zuckerberg expressed in meetings that he was personally disgusted by it and wanted it removed, the people said. Some of these details were previously reported.
Ultimately, Zuckerberg was talked out of his desire to remove the post in part by Kaplan, according to the people. Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines.
Elizabeth Dwoskin, Craig Timberg & Tony Romm
I have criticized Apple multiple times in previous posts for their greed and hypocrisy, but Facebook is on another level entirely. I write less about it simply because almost every other day another story emerges in which Facebook does the wrong thing, apologizes for it, then repeats the same mistake. When they finally do something right, it's either much later than other social networks, or a version so toned down that it has little to no effect. A selection of stories from the past couple of months alone:
- A smear campaign linked to the Cambodian government went viral on Facebook, sending an activist Buddhist cleric into exile to protect himself.
- Facebook appears to be obstructing an investigation seeking to hold Myanmar accountable for charges of genocide against the Rohingya people.
- Instagram’s “Related Hashtags” feature surfaced negative hashtags about the Democratic presidential candidate Joe Biden, but not about Donald Trump.
- Facebook inadvertently fueled the growth of dangerous conspiracy theories like QAnon through changes in recommendation and News Feed algorithms; the company just started to remove some of the groups associated with QAnon.
- Facebook removed ‘strikes’ so that conservative pages were not penalized for violations of misinformation policies.
- Facebook announced it would block new political ads from appearing on the site in the week before Election Day in the US, but the new policies are so narrow that they won’t have any real effect on election information.
- A Facebook executive in India, the head of public policy in the country no less, supported Modi and his Hindu nationalist party before the 2014 elections, mirroring the assistance offered to Donald Trump's election campaign in 2016.
- Facebook has complied with an order issued by Brazil's Supreme Court to block the accounts of a dozen top allies of far-right President Jair Bolsonaro, but called the measure a threat to freedom of speech and plans to appeal the order.
- Time and again (this year at least) Twitter added fact-check links to Trump’s tweets criticizing mail-in voting, while Facebook either refused to do the same or acted much later, after the damage was done.
- Facebook failed to take action against an event that led to the Kenosha shooting, even though it was reported for violence hundreds of times – and then publicly lied about their actions.
The value of being in favor with people in power outweighs almost every other concern for Facebook, said David Thiel, a Facebook security engineer who resigned in March after his colleagues refused to remove a post he believed constituted “dehumanizing speech” by Brazil’s president.
Ultimately, this quote goes right to the heart of the problem: as a company, Facebook decided that its best strategy for continued growth and rising profits is to not excessively moderate user-generated content, to not restrict or fact-check ads, and to side with whoever is in power in a given country to avoid regulation and scrutiny. And since Facebook is effectively controlled by a single person, Mark Zuckerberg, it is more accurate to point the blame directly at him, as the ultimate decision-maker. After cannibalizing news organizations, Facebook has devolved into the perfect channel for state propaganda, for the rich and powerful. And I fear nobody has the power to stop it anymore.
Zuckerberg:
—in sole control of a platform whose algorithms select and deliver content to 2.6B people, 1/3 of the world
—algorithms are opaque to government agencies
—cannot be removed from office & will likely control these algorithms for the next 5-7 decades
#nomercynomalice
— Scott Galloway (@profgalloway) September 4, 2020
People sometimes say that if Facebook was a country, it would be bigger than China. But this is the wrong analogy. If Facebook was a country, it would be a rogue state. It would be North Korea. And it isn’t a gun. It’s a nuclear weapon.
Because this isn’t a company so much as an autocracy, a dictatorship, a global empire controlled by a single man. Who – even as the evidence of harm has become undeniable, indisputable, overwhelming – has simply chosen to ignore its critics across the world.
Carole Cadwalladr