It seems like Facebook is really trying (for real this time, guys!) to get a handle on its fake news problem. Today the company announced a major change to the way it prioritizes news articles in your feed. Instead of prioritizing news content with the most engagement, the social media giant will now surface original reporting first. “Original reporting plays an important role in informing people around the world,” said Campbell Brown, VP of Global News Partnerships, and Jon Levin, Product Manager, in a Facebook blog post.
While most of the news stories in users’ feeds will still be from pages and friends they follow on Facebook, the company is changing its AI algorithm to identify news sources that are most often cited as the original source to show up first in users’ feeds. “We do this by looking at groups of articles on a particular story topic and identifying the ones most often cited as the original source. We’ll start by identifying original reporting in English language news and will do the same for news in other languages in the future.”
So basically, if enough news publications cite Facebook’s blog as their original source while reporting on Facebook prioritizing original news sources, then users will see Facebook’s blog in their news feeds (if enough people share it) before any of the other articles, including the one you are reading right now. It sounds like the company’s new algorithm will scan multiple articles written about the same topic to see whether they all link back to the same source.
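Facebook hasn’t published the details of how it picks the “original source,” but the idea it describes — group articles by topic, then find the source cited most often across the group — can be sketched in a few lines. Everything below (the data shape, the field name `cited_links`) is a hypothetical stand-in, not Facebook’s actual implementation:

```python
from collections import Counter

def most_cited_source(articles):
    """Given a group of articles on one topic, return the URL most often
    cited across the group -- a rough stand-in for identifying the
    "original source." The `cited_links` field is hypothetical.
    """
    counts = Counter()
    for article in articles:
        # Count each source at most once per article, so a single
        # article can't inflate a source's tally by linking it repeatedly.
        for link in set(article["cited_links"]):
            counts[link] += 1
    if not counts:
        return None
    source, _ = counts.most_common(1)[0]
    return source

articles = [
    {"cited_links": ["https://about.fb.com/news/post", "https://example.com/a"]},
    {"cited_links": ["https://about.fb.com/news/post"]},
    {"cited_links": ["https://about.fb.com/news/post", "https://example.com/b"]},
]
print(most_cited_source(articles))  # → https://about.fb.com/news/post
```

Notice the weakness baked into this approach: whichever link the most articles happen to share wins, whether or not that link is trustworthy.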
While this is a good way of tracking information back to its point of origin, it still doesn’t tackle the issue of reliable versus unreliable news sources. The new algorithm could still prioritize misleading or outright false articles if enough publications cite the same source. (Spamming articles around a single source for amplification is a common tactic for disreputable publishers.) However, Facebook plans to demote news content if it does not have “transparent information about the publisher’s editorial staff.” So if an article lacks a byline or a publication lacks an About page, any content shared from those sources will be buried in users’ news feeds. But again, if a questionable source puts bylines on its articles and has an About page, it seems like Facebook’s algorithm wouldn’t demote those articles as intended.
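The loophole is easy to see if you write the stated rule out literally. This is a naive sketch of a surface-level transparency check in the spirit of Facebook’s description (the field names are made up, and the real system is surely more involved):

```python
def passes_transparency_check(article):
    """Demotion rule as stated: content is demoted if it lacks transparent
    information about the publisher's editorial staff. Here that's reduced
    to "has a byline and an About page" -- hypothetical fields.
    """
    has_byline = bool(article.get("byline"))
    has_about_page = bool(article.get("publisher_about_url"))
    return has_byline and has_about_page

# A disreputable site that simply slaps bylines on its articles and
# publishes an About page sails right through this kind of check.
questionable = {
    "byline": "Staff Writer",
    "publisher_about_url": "https://dubious-site.example/about",
}
print(passes_transparency_check(questionable))  # → True
```

Any check that looks only at the presence of these signals, rather than their substance, rewards the bad actors who bother to fake them.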
Facebook recently announced that it would add a notification to news stories older than 90 days when a user tries to share one to their profile. The idea is to keep users’ feeds filled with timely news relevant to current events, but there are a few problems with it. The system relies largely on an article’s publication date, yet anyone running their own “news” site can recycle years-old stories without mentioning the publication date of the original article. TapHaps.com is one site that does this, focusing on old, incendiary stories and reporting on them as “opinion” pieces. The site also has author bylines and a full About page, so where would it land among Facebook’s new attempts to get a grip on its fake news problem?
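The staleness check itself is trivial, which is exactly the problem. A date-based test along these lines (my guess at the logic, not Facebook’s published code) is defeated the moment a recycled story carries a fresh publication date:

```python
from datetime import datetime, timezone

def is_stale(published_at, now=None, days=90):
    """Flag an article older than `days` -- the sort of publication-date
    test the 90-day share notification presumably relies on. A site that
    re-dates old stories defeats this check trivially.
    """
    now = now or datetime.now(timezone.utc)
    return (now - published_at).days > days

old_story = datetime(2020, 1, 1, tzinfo=timezone.utc)
check_time = datetime(2020, 6, 25, tzinfo=timezone.utc)
print(is_stale(old_story, now=check_time))  # → True
```

Swap in today’s date as `published_at` and the same years-old story passes as fresh.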
This prioritizing feature is still in its infancy, and there are bound to be many loopholes. But if nothing else, it’s still a reminder to actually read the article you are about to share on your social media network and click through the links (if any) to see where the news originated. Until all the issues are worked out, we’ll just have to unfollow or unfriend people—like we’ve been doing for years already—to stop seeing the things they share that we don’t want to see.