
Facebook Using Secret Formula to Rate Users' 'Trustworthiness'
Facebook has begun assigning its users a reputation score on a scale from 0 to 1.
The trust ratings, which went into effect this year, were developed as part of Facebook's fight against so-called "fake news."
Facebook told The Washington Post it relies in part on reports from users to help identify fake and malicious stories. If enough people report a story as false, a fact-checking team will look into it. But Facebook cannot investigate every report, so it uses other information to flag suspect stories.
Facebook would not say what that other information is.
A user's trustworthiness score is not meant to be the final word on that person's credibility, Tessa Lyons, the product manager in charge of fighting misinformation, told the Post. Nor, she said, is there a single, unified reputation score assigned to users.
Still, Megan Barth, co-chair of The Media Equality Project, called the rating system "a touch dystopian."
The revelation comes after a wave of censorship against conservative voices on the internet by Facebook, Twitter, and YouTube.
A study in June by The Gateway Pundit found that Facebook had cut traffic to top conservative news outlets by 93 percent.