Facebook is secretly rating your trustworthiness
It’s not a new system, either: the company has been working away on it for the last year, silently rating users on a scale of 0 to 1. Facebook’s system is designed to prevent the gaming of its platform by measuring the credibility of its users and highlighting the rotten apples among them.
The system came about as part of Facebook’s efforts to curb the spread of fake news on the platform, reports The Washington Post. In an interview, Tessa Lyons, the product manager in charge of the project, explained that the need for such a system arose from complications like users deliberately reporting accurate news items as false.
Lyons explained that it’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a publisher.” Because of this, Facebook has to assess an individual’s trustworthiness before it can begin to act upon something they’ve reported.
While a Facebook trustworthiness score sounds ominous, Lyons stresses that it isn’t meant to be an absolute indicator of a user’s credibility. Instead, it’s just one metric among the many variables Facebook measures about you to determine whether your reports deserve more weight than another user’s.
Facebook isn’t just using this to keep an eye on its users either. By assessing the publications and pages that people flag as trustworthy or problematic, it knows which brands and publications it needs to rein in or promote. It’s a balancing act that requires user honesty.
One way in which Facebook assesses its users is to see how people interact with articles on the site. “If someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker,” explains Lyons, “then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.”
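The weighting Lyons describes can be sketched in a few lines. To be clear, this is a purely hypothetical illustration: the function names, the neutral default, and the scoring maths are all assumptions, not Facebook’s actual implementation.

```python
# Hypothetical sketch of the weighting Lyons describes: a user's
# false-news reports count for more when their past reports were
# confirmed by fact-checkers. All names and numbers are illustrative.

def reporter_weight(confirmed_false: int, total_reports: int) -> float:
    """Return a 0-1 weight: the fraction of a user's past false-news
    reports that fact-checkers confirmed as actually false."""
    if total_reports == 0:
        return 0.5  # no history yet: neutral weight (an assumption)
    return confirmed_false / total_reports

def article_suspicion(reporters: list[tuple[int, int]]) -> float:
    """Sum the weights of everyone who flagged an article, so flags
    from historically accurate reporters move the needle further."""
    return sum(reporter_weight(c, t) for c, t in reporters)

# A careful reporter (9 of 10 reports confirmed) counts for far more
# than someone who flags indiscriminately (1 of 20 confirmed).
careful = reporter_weight(9, 10)   # 0.9
spammer = reporter_weight(1, 20)   # 0.05
```

The design choice mirrors the quote: the spammer’s flag isn’t ignored outright, it simply carries almost no weight until their track record improves.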
Why Facebook is ranking your trustworthiness
Clearly it’s a complex system, and one that Facebook isn’t willing to explain in great depth for fear users will simply begin to game it once again. However, knowing that Facebook is sitting there silently judging your credibility is enough to turn many people off using social media entirely.
Still, no matter how dystopian it sounds, it seems that a rating system like Facebook’s – or Twitter’s own assessment of a user’s online social circle – is necessary to keep trolls at bay.
The recent case of activist group Sleeping Giants calling on followers to shut down Infowars conspiracy theorist Alex Jones proves that user reporting tools can be used to game a system. In this specific case it was for the right reasons, but, as many experts have pointed out in the past, reporting a page en masse is a tactic taken straight out of the playbook of far-right online harassment campaigns.
For Facebook it was important to build a trust-based rating system because it doesn’t individually assess the reports people make against other users or content. Every time you report something as false or problematic, it gets filed to third-party fact-checkers. To make sure those fact-checkers aren’t wasting time on baseless claims instead of genuinely problematic ones, a system that ranks user trustworthiness was needed to filter out erroneous reports.
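That triage step can be pictured as a simple filter. Again, this is only a sketch built on assumptions – the 0-to-1 scores and the cutoff value are invented for illustration, not anything Facebook has disclosed.

```python
# Hypothetical triage: only forward reports from users whose 0-1
# trustworthiness score clears a threshold to human fact-checkers.
# The 0.4 cutoff and the data shapes are illustrative assumptions.

def triage(reports, scores, threshold=0.4):
    """Split (user, item) reports into those worth a fact-checker's
    time and those held back as likely noise."""
    forward, hold = [], []
    for user, item in reports:
        if scores.get(user, 0.0) >= threshold:
            forward.append(item)
        else:
            hold.append(item)
    return forward, hold

reports = [("alice", "story-1"), ("bob", "story-2")]
scores = {"alice": 0.9, "bob": 0.1}
forward, hold = triage(reports, scores)
# forward == ["story-1"], hold == ["story-2"]
```

Note the choice to default unknown users to a low score: in this sketch, a brand-new account can’t flood the fact-checking queue.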
So, while it certainly sounds very doom and gloom that Facebook is there silently assessing your credibility – all while farming your personal information, sharing data about you with big corporations and being entangled in one of the largest data scandals of the decade – it’s doing it for the benefit of everyone.