The tactics boosting Hernández online were similar to what Russia’s Internet Research Agency had done during the 2016 US election, when it set up Facebook accounts purporting to be Americans and used them to manipulate individuals and influence political debates on Facebook. Facebook had come up with a name for this – “coordinated inauthentic behavior” (CIB) – in order to ban it.
But Facebook initially resisted calling the Honduran activity CIB – in part because the network’s use of Pages to create false personas and fake engagement fell into a serious loophole in the company’s rules. Facebook’s policies to ensure authenticity focus on accounts: users can only have one account and it must employ their “real” name. But Facebook has no such rule for Pages, which can perform many of the same engagements that accounts can, including liking, sharing and commenting.
Zhang assumed that once she alerted the right people to her discovery, the Honduras network would be investigated and the fake Pages loophole would be closed. But it quickly became clear that no one was interested in taking responsibility for policing the abuses of the president of a poor nation with just 4.5m Facebook users. The message she received from all corners – including from threat intelligence, the small and elite team of investigators responsible for uncovering CIB campaigns – was that the abuses were bad, but resources were tight, and, absent any external pressure, Honduras was simply not a priority.
— Julia Carrie Wong
Accounts like this of Facebook's failure to prevent online manipulation and political propaganda – or worse, of its using its vast network to censor dissenting opinions on behalf of authoritarian leaders – have become as frequent as weather reports. As many have noted, "tight resources" is almost certainly an excuse, given how profitable Facebook is; rather, the company is unwilling to invest in mitigation measures because doing so would diminish profits and decrease engagement.
And when it does take action, Facebook focuses primarily on US issues, since its greatest risk of regulation and scrutiny lies in its home country. A similar story is developing around misinformation related to the coronavirus pandemic: a recent study found that a majority (56%) of fact-checked misinformation content in major non-English European languages was not acted upon by Facebook, compared with only 26% of English-language content debunked by US-based fact-checkers.