In another effort to fight the spread of misinformation on its platform, Facebook will now start rating users on their trustworthiness, The Washington Post reported Tuesday (Aug. 21).
A user will get a rating between 0 and 1, which is meant to measure their reputation. It’s not the sole factor that determines a person’s credibility, Tessa Lyons, a product manager at Facebook, told the Post. It’s not clear what exactly the rating measures, how it is determined, or how Facebook intends to use it. Facebook wasn’t immediately available to comment.
Efforts like these are meant to counteract users who flag legitimate information as untrue on Facebook. Lyons told the Post that Facebook looks, for example, at whether a user has reported content that external fact-checkers have deemed false. If the two align, the user’s future reports will be weighted more heavily than those of someone who indiscriminately flags content. But the company wouldn’t discuss further details of its assessment system, citing concerns that bad actors could game it.
Facebook similarly rates the trustworthiness of news publishers, with a score based on user surveys.
Update: A Facebook spokesperson responded with a statement. “The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading,” the statement reads. “What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible.”