
The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

Facebook’s fact-checking program relies on third-party fact-checking and news organizations to help weed out fake and misleading news on the platform. That limits how far the program can scale, since only so many organizations do this work.

But what if Facebook could use regular people — people whose day job is not fact-checking — to fact-check articles instead? A new working paper from researchers at MIT suggests that it would be surprisingly effective, even if those people are only reading headlines. “Crowdsourcing is a promising approach for helping to identify misinformation at scale,” the paper’s authors write.

The team used a set of 207 news articles that had been flagged for fact-checking by Facebook’s internal algorithm. (The study was done in collaboration with Facebook’s Community Review team, with funding from the Hewlett Foundation.) Two groups were then asked to check the accuracy of the articles: three professional fact-checkers, who researched the full articles before rendering their verdicts, and 1,128 Americans recruited on Mechanical Turk, who judged accuracy based only on the articles’ headlines and lede sentences. The result:

We find that the average rating of a politically balanced crowd of 10 laypeople is as correlated with the average fact-checker rating as the fact-checkers’ ratings are correlated with each other.
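The comparison in that finding is between two correlations: the crowd’s mean rating versus the fact-checkers’ mean rating, and the fact-checkers’ pairwise agreement with one another. Here’s a minimal Python sketch of that comparison using made-up ratings; the arrays, the 1–7 scale, and the uniform sampling are assumptions for illustration, and the paper’s actual procedure (political balancing of crowds, repeated resampling) differs.

```python
# Sketch of the crowd-vs-fact-checker correlation comparison.
# All ratings here are random placeholders, so both correlations
# will come out near zero; the point is the comparison's structure.
import numpy as np
from itertools import combinations
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

n_articles = 207     # articles flagged by Facebook's internal algorithm
n_laypeople = 1128   # Mechanical Turk rater pool
crowd_size = 10      # laypeople sampled per crowd

# Hypothetical accuracy ratings on a 1-7 scale.
fact_checkers = rng.integers(1, 8, size=(3, n_articles))  # 3 professionals
laypeople = rng.integers(1, 8, size=(n_laypeople, n_articles))

# (a) Correlation of one 10-person crowd's mean rating with the
#     fact-checkers' mean rating. (The paper balances each crowd by
#     party; here we sample uniformly for simplicity.)
crowd = laypeople[rng.choice(n_laypeople, size=crowd_size, replace=False)]
crowd_vs_checkers, _ = pearsonr(crowd.mean(axis=0), fact_checkers.mean(axis=0))

# (b) Average pairwise correlation among the fact-checkers themselves.
pairwise = [pearsonr(a, b)[0] for a, b in combinations(fact_checkers, 2)]
checker_vs_checker = np.mean(pairwise)

print(f"crowd mean vs. fact-checker mean: r = {crowd_vs_checkers:.2f}")
print(f"fact-checker vs. fact-checker:    r = {checker_vs_checker:.2f}")
```

If (a) is roughly as large as (b) on real data, a small crowd’s average is about as informative a signal as one more professional fact-checker, which is what makes the crowdsourcing approach attractive at scale.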