Facebook’s Oversight Board Has Spoken. But It Hasn’t Solved Much

The Facebook Oversight Board issued its first five decisions Thursday. The rulings are well thought out and show that the board members, charged with reviewing Facebook's decisions to remove content and with recommending changes to Facebook's policies, take their job seriously. More than anything, though, they show the futility of moderating content across networks with more than 3 billion users—nearly half the people on earth.

The cases involve posts in five languages and, often, subtleties of meaning and interpretation. Two touch on deep-seated global conflicts: China's oppression of Uighur Muslims and the ongoing border war between Armenia and Azerbaijan. We've long known that the vast majority of Facebook users—now approaching 90 percent—are outside the US, but the breadth of these cases drives home the magnitude of Facebook's challenge.

Facebook has touted automation as one solution to that challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook's automated systems removed an Instagram post in Portuguese from a user in Brazil showing bare breasts and nipples. But the post was an effort to raise awareness about breast cancer, an exception to Facebook's general policy against nudity and an issue that has bedeviled Facebook for a decade. To its credit, Facebook restored the post before the Oversight Board heard the case, but the episode still underscores the problems with letting algorithms do the work. In another case, involving a quote purportedly from Nazi propaganda chief Joseph Goebbels, Facebook's memories feature had actually recommended that the user recirculate a post from two years earlier. The older post had presumably been allowed to remain, raising questions about the consistency of Facebook's standards for reviewing content.

Facebook announced the creation of the board in 2018, after years of criticism about its role in fomenting ethnic hatred, political misinformation, and other evils. It took almost two years to assemble the 20 members, whose rulings on specific pieces of Facebook content are supposed to be binding.

In a statement Thursday, Monika Bickert, Facebook’s vice president for content policy, said the company would follow the board’s decisions to restore four items, including the Instagram post from Brazil. The board also suggested changes in Facebook policies, which the company is supposed to reply to within 30 days. Bickert said the recommendations “will have a lasting impact on how we structure our policies.”

In one case, though, she left some doubt. The board recommended that Facebook inform users when their content is removed by an algorithm, and allow for appeals. Bickert said the company expects to take longer than 30 days to respond to this recommendation.

Thursday’s cases may have been relatively easy ones. Coming soon: the politically fraught decision of whether to restore Donald Trump’s account, which is sure to anger a bloc of Facebook users (and employees) no matter how it is decided. Facebook punted that decision to the board last week.

Taken together, the cases decided Thursday reveal the sheer scale of Facebook's challenge. Social media management company Social Report estimated in 2018 that Facebook users post 55 million status updates and 350 million photos every day; they send 9 million messages an hour and share 3 million links.

A decision on any one of those posts can be enormously complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned attempting to reach Europe in 2015, and contrasted the reaction to the photo with what the user said was a “lack of response by Muslims generally to the treatment of Uighur Muslims in China.”
