Facebook is rating the trustworthiness of every user to crack down on 'bad actors' (but it won't tell you how you score)

  • Facebook will soon rank users based on their 'trustworthiness' to stop fake news
  • The system will score users from zero to one and use other behavioral metrics
  • It's unclear who will get a score or what other metrics will be factored into scores
  • Some have criticized the move as being similar to China's social credit system
  • Earlier this year, Facebook asked users to rank news sites based on credibility
Facebook is secretly rating users based on their 'trustworthiness' to try to cut down on fake news on its platform.
The social media giant plans to assign users a reputation score that ranks them on a scale from zero to one, according to the Washington Post.
It marks Facebook's latest effort to stave off fake news, bot accounts and other misleading content on its site. 
But the idea of a reputation score has already generated skepticism about how Facebook's system will work, as well as criticism that it resembles China's social credit rating system. 
Facebook product manager Tessa Lyons told the Post that the firm has been developing the system for the past year and that it's directly targeted toward stopping fake news. 
The firm partly relies on users to flag content they believe to be fake, but this has led some to make false claims against news outlets, often due to differences in opinion or a grudge against a particular publisher. 
Users' 'trustworthiness' score will take into account how often they flag content as being false, Lyons said. 
As part of the system, Facebook will also factor in thousands of other metrics, or 'behavioral cues.' 
It's unclear what those other metrics will be, who will get a score and what Facebook will use that data for.
It also remains unclear whether the score will be solely applied to reports on news stories, or if it will factor in other information as well. 
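Purely as an illustration of what a zero-to-one reputation signal of this kind could look like (the formula below is an assumption for explanatory purposes, not Facebook's disclosed method), one simple approach would be to score users by how often their fake-news flags are later confirmed:

# Hypothetical sketch only: Facebook has not published its scoring formula.
# This toy function assumes the score reflects how often a user's flags are
# later confirmed by fact-checkers, one signal the article mentions; the
# smoothing and the neutral prior of 0.5 are invented for illustration.
def trustworthiness_score(flags_confirmed: int, flags_rejected: int,
                          prior: float = 0.5, prior_weight: int = 10) -> float:
    """Return a score between 0 and 1 using a smoothed flagging-accuracy rate."""
    total = flags_confirmed + flags_rejected
    return (flags_confirmed + prior * prior_weight) / (total + prior_weight)

# Example: a user with 8 confirmed and 2 rejected flags drifts above the neutral 0.5.
print(round(trustworthiness_score(8, 2), 2))  # 0.65

In this toy version, a user who has never flagged anything keeps the neutral starting value, and each confirmed or rejected flag nudges the score up or down; any of the thousands of other 'behavioral cues' the company mentions would presumably be folded in on top of a signal like this.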
Lyons declined to elaborate on the intricacies of the system, saying it could tip off 'bad actors' who use the knowledge to manipulate their score. 
However, this has done little to quiet users' concerns about whether or not they will be assigned a score and what contributes to it. 
'Not knowing how [Facebook is] judging us is what makes us uncomfortable,' Claire Wardle, director of Harvard Kennedy School's First Draft, told the Post. 
'But the irony is that they can't tell us how they are judging us - because if they do, the algorithms that they built will be gamed.'
Several Twitter users pointed out that the proposed trustworthiness score reminded them of the popular series Black Mirror, which chronicles the dystopian downsides of technology in the future, as well as China's social credit system. 
The controversial scoring system assigns citizens a score that can move up or down based on certain behaviors. It can be negatively affected if you don't pay bills, fail to care for elders, or are deemed lazy and spend too much time playing video games. 
Others questioned Facebook's ability to assign users a trustworthiness score, citing the firm's recent user privacy scandal and murky success with removing fake news from its platform.
Earlier this year, Facebook began using a system that ranks news organizations based on their trustworthiness, relying on data from users who indicated whether they were familiar with certain sites, as well as whether or not they trusted them. 
It then ranks stories on a user's News Feed based on that credibility score. 
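That survey mechanism was described only in broad strokes, so the aggregation below is an assumed sketch rather than Facebook's actual method; it simply treats an outlet's credibility as the share of respondents familiar with it who also say they trust it:

# Hypothetical aggregation of the familiarity/trust survey described above.
# Facebook has not disclosed how it actually weights these answers.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    familiar: bool   # respondent recognises the outlet
    trusts: bool     # respondent says they trust it

def outlet_credibility(responses: list[SurveyResponse]) -> float:
    """Share of familiar respondents who trust the outlet (0.0 if nobody knows it)."""
    familiar = [r for r in responses if r.familiar]
    if not familiar:
        return 0.0
    return sum(r.trusts for r in familiar) / len(familiar)

# Example: three of four familiar respondents trust the outlet -> 0.75
sample = [SurveyResponse(True, True), SurveyResponse(True, True),
          SurveyResponse(True, False), SurveyResponse(True, True),
          SurveyResponse(False, False)]
print(outlet_credibility(sample))  # 0.75

Limiting the calculation to familiar respondents mirrors the article's point that the survey asked about both familiarity and trust; how Facebook treats outlets that few respondents recognise is not known.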

WHAT HAS FACEBOOK DONE TO TACKLE FAKE NEWS?

Following the shock result of the November 2016 US election, Mark Zuckerberg claimed: 'Of all the content on Facebook, more than 99 per cent of what people see is authentic'.
He also cautioned that the company should not rush into fact-checking. 
But Zuckerberg soon came under fire after it emerged fake news had helped sway the election results.
In response, the company rolled out a 'Disputed' flagging system that it announced in a December 2016 post. 
The system meant users were responsible for flagging items that they believed were fake, rather than the company.
In April 2017, Facebook suggested the system had been a success. 
It said that 'overall false news has decreased on Facebook' - but did not provide any proof.
'It's hard for us to measure because we can't read everything that gets posted', it said. 
But it soon emerged that Facebook was not providing the full story. 
In July 2017, Oxford researchers found that 'computational propaganda is one of the most powerful tools against democracy,' and Facebook was playing a major role in spreading fake news.
In response, Facebook said in August 2017 that it would ban pages that post hoax stories from advertising.
In September, Facebook finally admitted during congressional questioning that a Russian propaganda mill had placed adverts on Facebook to sway voters around the 2016 campaign.
In December 2017, Facebook admitted that its flagging system for fake news was a failure.
Since then, it has used third-party fact-checkers to identify hoaxes, and then given such stories less prominence in the Facebook News Feed when people share links to them.
In January, Zuckerberg said Facebook would prioritise 'trustworthy' news by using member surveys to identify high-quality outlets.
Facebook has now quietly begun 'fact-checking' photos and videos to reduce fake news stories. However, the details of how it is doing this remain unclear. 
