Facebook “lost” an important policy for three years and only noticed after the Oversight Board began looking into the issue, according to the board’s latest decision. In it, the board questioned Facebook’s internal policies and said the company should be more transparent about whether other key rules may have been similarly “lost.”
The underlying case stems from an Instagram post about Abdullah Öcalan, in which the poster “encouraged readers to engage in conversation about Öcalan’s imprisonment and the inhumane nature of solitary confinement.” (As the board notes, Öcalan is a founding member of the Kurdistan Workers’ Party, which Facebook has officially designated as a “dangerous organization.”)
Facebook initially removed the post, since users are barred from praising or showing support for dangerous organizations or individuals. However, the company also had “internal guidance,” created partly as a result of discussions around Öcalan’s imprisonment, that “allows discussion on the conditions of confinement for individuals designated as dangerous.” That rule was not applied, even after the user’s initial appeal; Facebook told the board it had “inadvertently not transferred” that part of its policy when it moved to a new review system in 2018.
Though Facebook had already admitted the error and reinstated the post, the board said it was “concerned” with how the case had been handled, and that “an important policy exception” had effectively fallen through the cracks for three years.
“The Board is concerned that Facebook lost specific guidance on an important policy exception for three years,” the group wrote. “Facebook’s policy of defaulting towards removing content showing ‘support’ for designated individuals, while keeping key exceptions hidden from the public, allowed this mistake to go unnoticed for an extended period. Facebook only learned that this policy was not being applied because of the user who decided to appeal the company’s decision to the Board.”
The board also chastised Facebook for not being transparent about how many other users may have been affected by the same issue. Facebook told the board it wasn’t “technically feasible” to determine how many other posts may have been mistakenly taken down. “Facebook’s actions in this case indicate that the company is failing to respect the right to remedy, contravening its Corporate Human Rights Policy,” the board said.
The case highlights how Facebook’s complex rules are often shaped by guidance users can’t see, and how the Oversight Board has repeatedly challenged the company to make all of its policies clearer to users.
Though it has taken up only a handful of cases so far, the Oversight Board has repeatedly criticized Facebook for not following its own rules. “They can’t just invent new unwritten rules when it suits them,” board co-chair Helle Thorning-Schmidt told reporters after the board ruled that Facebook was wrong to impose an “indefinite” suspension on Donald Trump. The board has also criticized Facebook for not alerting users to key parts of its policies, such as its “satire exception,” and has pushed the company to clarify its hate speech policies and how it treats speech from politicians and other high-profile figures.
Facebook has 30 days to respond to the Oversight Board’s decision in this case, which includes several recommendations: that the company further clarify its “Dangerous Individuals and Organizations” policy and update its transparency reporting process.