Facebook is testing something new that will let users know when their post has been removed by automation. This initiative comes in response to the Oversight Board, which said the social network should be more transparent with users about how their posts are removed.
The company unveiled the new test in a report providing updates on how it is handling the Oversight Board’s policy recommendations. The test appears to be a response to one of the first cases the board addressed: an Instagram post intended to raise awareness about breast cancer, which the company removed under its rules on nudity.
Facebook restored the post, saying its automated systems had made a mistake, and updated Instagram’s rules to allow “health-related nudity.” But the Oversight Board also recommended that Facebook alert users when a post is removed by automation rather than by a human content reviewer. Facebook previously said it would test this change, which is now in effect.
“We launched a test on Facebook to assess the impact of telling people more about whether automation was involved in enforcement,” Facebook writes in its report. “People in the test now see whether technology or a human content reviewer made the decision to take action on their content. We’ll look at the results to see if people better understand who removed their content, while watching for any increase in reoffense and appeal rates.” The company added that it will provide an update on the test later this year.
The report also offered some additional details on how the company is working with the Oversight Board. It notes that between November 2020 and March 2021, Facebook referred 26 cases to the board, which took up only three, one of them in response to Donald Trump’s suspension. (Notably, the report covers only the first quarter of 2021, so it does not address the board’s recommendations in response to Trump’s suspension.)
Although the Oversight Board has assessed only a few cases, its decisions have prompted policy changes by Facebook that could have a much broader effect. In some areas, however, the company has declined to act on the board’s recommendations, such as the suggestion that Facebook study its own role in enabling the events of Jan. 6. In a blog post, the company noted that “the size and scope of the board’s recommendations go beyond the policy guidance we anticipated when we set up the board, and many require investments of several months or years.”