SOURCE: Technologyreview.com
The battle against fake news continues, but the new features Google and Facebook have designed to help fight misinformation still keep both companies at arm's length from the thorny task of deciding what is real and what is fake.
In the wake of the presidential election, Facebook came under heavy criticism for the proliferation of fake news in its users’ feeds. Despite attempts to fix the problem, such as enlisting third-party fact-checkers to flag potentially false stories, it remains an issue for the social network. More recently, Google also found itself criticized for allowing its search algorithms to happily serve up misinformation.
Today, both companies have rolled out tools that they hope will ease the problem. Google has taken a page out of Facebook’s, er, book, adding a “Fact Check” tag to snippets of articles that appear under the News tab of its search results. Like Facebook, it relies on analysis from fact-checking organizations to alert users to content that appears to be inaccurate.
Ultimately, though, the system leaves the user to decide whether to believe the content or not. “These fact checks are not Google’s and are presented so people can make more informed judgments,” explain Google’s Justin Kosslyn and Cong Yu in a blog post describing the feature. “Even though differing conclusions may be presented, we think it’s still helpful for people to understand the degree of consensus around a particular claim and have clear information on which sources agree.”
Meanwhile, Facebook’s new initiative also puts the onus on the user. Today, millions of users’ news feeds in 14 countries, including the U.S. and the U.K., will be splashed with banners that encourage people to learn “how to spot fake news.” The tips, developed by the U.K. fact-checking organization Full Fact, include sensible advice: checking URLs, date stamps, and formatting, questioning headlines, and inspecting photographs. Facebook’s effort is only a temporary experiment, though, meant to last just “a few days.” If it proves successful, the trial will presumably be expanded, though what counts as success here is up for debate.
Indeed, both Google and Facebook are painfully aware that this is a very hard problem to solve. As we’ve pointed out in the past, deciding what’s true and what’s false can be incredibly difficult: the lines between disagreeable opinions, bad information, and outright lies are murky. Picking them apart is something that both Internet giants are understandably afraid to do.
For his part, Mark Zuckerberg has stated that the issue is “complex, both technically and philosophically,” and that he wants the social network to “be extremely cautious about becoming arbiters of truth.” For now, then, that role falls to you.