A study by Robert M. Bond, now a political science professor at Ohio State University, and others, published in Nature in 2012, described an ethically questionable experiment in which, on election day in 2010, Facebook sent ‘go out and vote’ reminders to more than 60 million of its users. The reminders caused about 340,000 people to vote who otherwise would not have. Writing in the New Republic in 2014, Jonathan Zittrain, professor of international law at Harvard University, pointed out that, given the massive amount of information it has collected about its users, Facebook could easily send such messages only to people who support one particular party or candidate, and that doing so could easily flip a close election – with no one knowing it had occurred. And because advertisements, like search rankings, are ephemeral, manipulating an election in this way would leave no paper trail.
Are there laws prohibiting Facebook from sending out ads selectively to certain users? Absolutely not; in fact, targeted advertising is how Facebook makes its money. Is Facebook currently manipulating elections in this way? No one knows, but in my view it would be foolish and possibly even improper for Facebook not to do so. Some candidates are better for a company than others, and Facebook’s executives have a fiduciary responsibility to the company’s stockholders to promote the company’s interests.
Robert Epstein
Given the massive amount of information on the Internet, and the fact that most of it reaches us filtered through proprietary algorithms, these results are hardly surprising. But the problem has begun to resonate more in the media in recent months, starting with reports that Facebook may have suppressed conservative news sources in its feed. Google search results were also populated with fake news ahead of the launch of the third season of HBO’s Silicon Valley, and the company was recently accused of manipulating search results in favor of Hillary Clinton.
Even if no conscious manipulation is at work in either case, there is always a certain ‘filter bubble’ effect: because these companies build an advertising profile of each user based on their online activity, personalized news feeds and search results are more likely to show people stories similar to what they (or their friends) have read or liked in the past, reinforcing their beliefs instead of challenging them.
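To make that feedback loop concrete, here is a minimal sketch of interest-based ranking. The story data, topic tags, and overlap-based scoring rule below are all invented for illustration, and real feed-ranking systems are vastly more elaborate, but even this naive version shows how ranking by similarity to past likes pushes familiar content to the top and unfamiliar viewpoints toward the bottom.

    # A toy, hypothetical illustration of interest-based feed ranking.
    # Stories, topic tags, and the scoring rule are invented for this example;
    # real recommendation systems are far more complex.
    from collections import Counter

    def build_profile(liked_stories):
        # Count how often each topic appears in stories the user liked before.
        profile = Counter()
        for story in liked_stories:
            profile.update(story["topics"])
        return profile

    def score(story, profile):
        # A story scores higher the more it overlaps with past interests.
        return sum(profile[topic] for topic in story["topics"])

    def rank_feed(candidates, profile):
        # The most profile-similar stories float to the top of the feed.
        return sorted(candidates, key=lambda s: score(s, profile), reverse=True)

    liked = [
        {"title": "Story A", "topics": ["politics", "candidate_x"]},
        {"title": "Story B", "topics": ["candidate_x", "rally"]},
    ]
    candidates = [
        {"title": "More on candidate X", "topics": ["candidate_x", "politics"]},
        {"title": "An opposing viewpoint", "topics": ["candidate_y", "politics"]},
        {"title": "Unrelated science news", "topics": ["science"]},
    ]

    profile = build_profile(liked)
    for story in rank_feed(candidates, profile):
        print(score(story, profile), story["title"])
    # Prints: 3 More on candidate X, 1 An opposing viewpoint, 0 Unrelated science news

The opposing viewpoint never ranks first, not because anyone decided to bury it, but simply because it resembles the user's history less than more of the same would.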
As always, don’t believe everything you see on TV – and now on your news feed.