On 17 July 2018, Facebook announced that it is developing new tools that enable stronger control over the content published on its platform.
This step is part of Facebook's response to charges from the UK's Channel 4, whose TV report raised important questions about the platform's policies and processes, including guidance given during training sessions in Dublin. Facebook commented on the report in a written response.
In addition, Facebook's Vice President for Global Policy Solutions, Richard Allan, answered Channel 4's questions in an on-camera interview.
“We provided all this information to the Channel 4 team and included where we disagree with their analysis. Our Vice President for Global Policy Solutions, Richard Allan, also answered their questions in an on-camera interview. Our written response and a transcript of the interview can be found in full here and here.”
Facebook acknowledged the mistakes and said it is working to correct them:
“We have been investigating exactly what happened so we can prevent these issues from happening again.”
To address the problems that occurred during the training sessions in Dublin, Facebook required all of its Dublin trainers to complete a re-training session, and is preparing to do the same globally. It is also reviewing the policy questions and enforcement actions the reporter raised, and has fixed the mistakes that were found.
Alongside these actions, Facebook took steps to make the platform safer by developing tools that enable it to control content and prevent hate speech, terrorism, and war crimes.
One of the newly introduced tools is Cross Check, which means that content from certain Pages or Profiles is given a second layer of review to make sure Facebook's policies are applied correctly. The tool specifically targets high-profile, frequently visited Pages and pieces of content on Facebook so that they are not mistakenly removed or left up.
“We want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards. There are no special protections for any group — whether on the right or the left”
As reported by Facebook, Cross Check will focus on content posted by celebrities, governments, or Pages where mistakes were made in the past, as well as the Pages of many media organizations, from Channel 4 to the BBC and The Verge.
“To be clear, Cross Checking something on Facebook does not protect the profile, Page or content from being removed. It is simply done to make sure our decision is correct.”
Minors
Preventing people under 13 from having a Facebook account is another of Facebook's ways to make the platform safer and control the content published on it.
Under this policy, if someone is reported to be under 13, a reviewer will look at the content on their profile (text and photos) to try to ascertain their age. If the reviewer believes the person is under 13, the account will be put on hold and the person will not be able to use Facebook until they provide proof of their age.
“Since the program, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else.”
It is worth noting that, to create a safe environment for its 1.4 billion daily users around the world, Facebook is doubling the number of people working on its safety and security teams this year to 20,000, including over 7,500 content reviewers.
“We are constantly improving our Community Standards and we’ve invested significantly in being able to enforce them effectively. This is a complex task, and we have more work to do. But we are committed to getting it right so Facebook is a safe place for people and their friends.”