Facebook is testing a change that will let users know when a post was removed as a result of automation. The experiment comes in response to the Oversight Board, which has said the social network should be more transparent with users about how their posts are removed.

The company revealed the test in a new report that provides updates on how Facebook is handling the Oversight Board's policy recommendations. The test comes in response to one of the first cases the Oversight Board took up, which concerned an Instagram post meant to raise awareness of breast cancer that the company had removed under its nudity rules.

Facebook restored the post, saying its automated systems had made a mistake, and updated Instagram's rules to allow for "health-related nudity." But the Oversight Board had also recommended that Facebook alert users when a post is removed by automation rather than by a human content reviewer. Facebook previously said it would test this change, which is now in effect.

"We've launched a test on Facebook to assess the impact of telling people more about whether automation was involved in enforcement," Facebook writes in its report. "People in the test now see whether technology or a Facebook content reviewer made the enforcement decision about their content. We will analyze the results to see if people had a clearer understanding of who removed their content, while also watching for a potential rise in recidivism and appeals rates." The company added that it will provide an update on the test later this year.

The report also offered additional insight into how the company is working with the Oversight Board. It notes that between November 2020 and March 2021, Facebook referred 26 cases to the board, though the board has chosen to take up only three, one of which dealt with the company's suspension of Donald Trump. (Notably, the latest report only covers the first quarter of 2021, so it doesn't address the board's recommendations in response to Trump's suspension.)

Though the Oversight Board has weighed in on only a handful of cases, its decisions have resulted in a few policy changes by Facebook that could have a much broader effect. However, in some areas the company has declined to follow up on the board's policy suggestions, such as a recommendation that Facebook study its own role in enabling the events of January 6th. In a blog post, the company noted that "the size and scope of the board's recommendations go beyond the policy guidance that we first anticipated when we set up the board, and several require multi-month or multi-year investments."
