Facebook will tweak some of its policies in response to recommendations from the Oversight Board. The board issued its first set of content moderation decisions last month, in a series of rulings that overturned some of Facebook’s original actions. In addition to those decisions on a handful of specific posts, the board also made recommendations about how the social network could change its policies.
Now Facebook has responded to those suggestions. The company says it is “committed to action” on 11 of the board’s recommendations, including updates to Instagram’s nudity policy. But in other areas, like a suggestion that Facebook alert users when moderation decisions are the result of automation, the company isn’t yet committing to making permanent changes.
Many of the areas where Facebook says it’s “committed” to change aren’t so much big policy tweaks as promises to increase “transparency” around its existing rules. On this front, Facebook says it will make its rules around health misinformation clearer, such as its recent vaccine policy updates, which specify the kinds of claims the company will remove. Facebook also plans to launch a new transparency center to better explain its community standards to users. The company further said it would “share more information about our Dangerous Individuals and Organizations policy,” but that it was “assessing the feasibility” of a recommendation that the company list the groups and individuals covered under the rules.
One area where Facebook has agreed to a more significant change is in Instagram’s nudity policy. It now allows for “health-related nudity,” after Facebook restored a post from a user who had posted photos to raise awareness about breast cancer.
Facebook’s use of automation tools in making content moderation decisions also came up in several of the board’s recommendations. The board had said that Facebook should let users know when enforcement is the result of automation rather than human content reviewers. The social network says it will “test the board’s recommendation to tell people when their content is removed by automation,” but stopped short of a permanent commitment.
The one area where Facebook declined to implement any changes, though, is its coronavirus misinformation policy. The Oversight Board had ruled that Facebook should reinstate a French user’s post that falsely claimed hydroxychloroquine could cure COVID-19. The board further recommended that Facebook use “less intrusive measures” in dealing with misinformation about the pandemic when “potential for physical harm is identified but is not imminent.”
But in its latest response, Facebook said that while it would make its coronavirus misinformation rules clearer to users, it wouldn’t change how it enforces them. “We’ll take no further action on this recommendation since we believe we already do employ the least intrusive enforcement measures given the likelihood of imminent harm,” Facebook said. “We restored the content based on the binding power of the board’s decision. We will continue to rely on extensive consultation with leading public health authorities to tell us what is likely to contribute to imminent physical harm. During a global pandemic, this approach will not change.”
While not necessarily surprising, Facebook’s response offers some insight into how the social network views the Oversight Board. Facebook has likened the independent board to its “Supreme Court” and, like a court, its decisions on individual cases are meant to be binding. But Facebook has considerable leeway in whether it adopts the broader policy changes the board recommends. That Facebook has adopted some, while only agreeing to consider others, suggests it’s still at least a little reluctant to let the board have too much influence on Facebook’s broader policy structure.
The company’s response comes as it’s gearing up for what could be the Oversight Board’s highest-profile decision: whether or not to reinstate . The board hasn’t indicated exactly when it will rule on the matter, but a decision is expected within the next few weeks.