Mark Zuckerberg reportedly intervened after a Facebook employee posted a controversial defense of police in the wake of the Kenosha shootings

  • A Facebook employee sparked outrage after defending "well-intentioned law enforcement officers" and disputing that there's racial bias in policing in a post on internal company messaging boards, The Daily Beast reported Wednesday.
  • According to The Daily Beast, the backlash was so strong that Mark Zuckerberg alluded to the post in a note to employees, saying some people weren't "appreciating the impact their words are having on our Black community."
  • The pro-police post came days after a police officer in Kenosha, Wisconsin, shot Jacob Blake seven times in the back and vigilante Kyle Rittenhouse was charged in the fatal shootings of two protesters in the city.
  • Facebook employees have become increasingly critical of the company's response to hate speech and calls to violence on its platform, which escalated last week after reports revealed Facebook ignored hundreds of warnings about a Kenosha militia group calling for armed vigilantes to confront protesters.

A recent internal debate among Facebook employees over racism and police violence escalated to the point that CEO Mark Zuckerberg eventually intervened, The Daily Beast reported Wednesday.

Last week, an employee shared a post on Facebook's internal messaging board, Workplace, in which he defended "well-intentioned law enforcement officers who have been victimized by society's conformity to a lie" and disputed the role of race in policing, according to The Daily Beast.

The author reportedly went on to argue that the criminal justice system doesn't produce racially biased outcomes, that racism isn't a major factor in police shootings, and that victims of police shootings are often under the influence of drugs or didn't follow officers' orders.

Facebook did not immediately respond to Business Insider's request for comment.

The post came several days after police in Kenosha, Wisconsin, shot Jacob Blake seven times in the back, leaving him paralyzed, and 17-year-old Kyle Rittenhouse was charged with multiple counts of homicide after prosecutors accused him of fatally shooting two anti-police brutality protesters.

The post instantly caused an uproar among Facebook employees, prompting the author to delete comments on the post because he deemed them "unproductive and overwhelming," according to The Daily Beast.

The exchanges on Workplace eventually got so heated that Zuckerberg alluded to the post in his own message to employees, The Daily Beast reported, saying: "We designed our respectful communications policy to allow people to discuss very different viewpoints… But I'm concerned that some people are doing that without appreciating the impact their words are having on our Black community."

According to the report, Zuckerberg appeared to suggest that Facebook plans to start funneling controversial topics off the company-wide forum and into more specific channels.

"You won't be able to discuss highly charged content broadly in open groups," he said, according to the Beast. "As you know, we deeply value expression and open discussion, but I don't believe people working here should have to be confronted with divisive conversations while they're trying to work."

Zuckerberg has repeatedly defended the Facebook platform as a place for free speech, but the company has faced increasing pressure both internally and externally to crack down more tightly on hate speech and calls for violence on the platform.

That pressure ramped up again in the wake of the Kenosha shootings after The Verge reported that Facebook ignored multiple warnings from users about a Kenosha militia group that had created a "call to arms" event where users discussed plans to bring weapons and confront protesters.

Facebook has long banned content explicitly calling for violence, but recently introduced a new policy targeting militia groups that "have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior."

A Facebook spokesperson initially told Business Insider that the warnings simply hadn't reached Facebook's "specialized team" of moderators that handles militia-related content in time, but the company's response came under further scrutiny after BuzzFeed News reported that the company had been warned 455 times and repeatedly determined the group didn't violate its policies, only removing the content after the shootings.

Zuckerberg told employees it was "an operational mistake," according to BuzzFeed News, while employees slammed him, with one saying: "At what point do we take responsibility for enabling hate-filled bile to spread across our services?"
