First, consumer reviews of products like toasters work because we have direct experience using them. Consumer reviews of news sources don't work because we can't personally verify the facts from direct experience; instead, our opinions of news are driven by strong emotional attachments to the underlying sociopolitical issues. Put simply, our research shows that we'll trust anyone to be objective about their kitchen appliances, but when it comes to news, we want experts who can verify the facts.
Second, user ratings are easily manipulated. We all rely on online reviews, yet research shows that 15-20% of them are fake. Fake reviews are more common on websites that don't verify whether the reviewer has actually used the product or service. Zuckerberg said that Facebook would accept ratings only from users who say they are familiar with the news sources they are judging, but that honor system, while logical, won't stop fake reviews.
Alan Dennis, Antino Kim & Tricia Moravec
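To make the manipulation point in the excerpt above concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (the 1,000 genuine voters, the 35% baseline trust, the size of the fake campaign) is invented for illustration and not taken from the research cited above; the point is only how few fake votes it takes to flip a majority verdict under an honor system.

```python
def trust_share(honest_yes, honest_total, fake_yes):
    """Fraction of 'trustworthy' votes after fake votes are mixed in.

    Assumes every fake vote says 'trustworthy' (a coordinated campaign).
    """
    return (honest_yes + fake_yes) / (honest_total + fake_yes)

HONEST_TOTAL = 1000  # genuine voters (made-up number)
HONEST_YES = 350     # 35% of genuine voters call the source trustworthy

for fake in (0, 150, 300, 310):
    share = trust_share(HONEST_YES, HONEST_TOTAL, fake)
    verdict = "TRUSTED" if share > 0.5 else "not trusted"
    print(f"{fake:4d} fake votes -> {share:.1%} trustworthy -> {verdict}")
```

In this toy scenario, about 310 fake votes, roughly a quarter of the final vote pool, turn a source that most genuine voters distrust into a "trusted" one.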
Facebook has been struggling with its fake news problem for over a year, but apparently they still haven't got a clue how to fix it. Their latest idea: just have people vote on which news sources they think are reliable. Never mind that this so-called solution raises more questions than it answers; personally, I fail to understand how the proposal is any different from the way News Feed algorithms have selected news items until now. Instead of implicitly guessing your opinions and political attachments (by analyzing likes and reactions to previous posts and links), Facebook wants to ask people to list them explicitly… What could possibly go wrong?
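To see why the two approaches may change less than they appear to, consider a hypothetical toy model (the function names, news sources, and rating scale are all invented; this is not Facebook's actual News Feed code): whether a user's affinity for a source is inferred from their reaction logs or collected through a survey, what comes out is the same kind of per-user weight a feed-ranking function would consume.

```python
from collections import Counter

def implicit_affinity(engagement_log):
    """Infer affinity from behavior: each source's share of the user's likes."""
    counts = Counter(event["source"] for event in engagement_log)
    total = sum(counts.values())
    return {source: n / total for source, n in counts.items()}

def explicit_affinity(survey_answers):
    """Take affinity straight from self-reported trust ratings (0-5 scale)."""
    return {source: rating / 5 for source, rating in survey_answers.items()}

log = [{"source": "Daily Bugle"}, {"source": "Daily Bugle"}, {"source": "The Gazette"}]
survey = {"Daily Bugle": 5, "The Gazette": 2}

print(implicit_affinity(log))     # inferred from clicks: {'Daily Bugle': 0.67, ...}
print(explicit_affinity(survey))  # declared by the user: {'Daily Bugle': 1.0, ...}
```

Either dictionary can be dropped into the same ranking score, which is why asking users directly changes the input to the filter bubble, not the bubble itself.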