
Facebook is not tackling hate speech properly, according to audit

Written by Hamza Zakir

As the largest social network on the planet, Facebook has a hefty responsibility when it comes to moderating content for its users and curbing fake news, hate speech and terrorist propaganda. The most recent audit of the company, however, has shown that it is not doing enough.

Conducted by former ACLU Washington Director Laura Murphy over the course of the past year, Facebook’s civil rights audit has comprised two phases: the first six months were spent gathering information from civil rights organizations to determine their concerns, while the past six months have focused on content moderation and enforcement on Facebook.

While the company has certainly made wholesale changes to its platform, most notably the ban on white nationalism and white separatism, the auditors have concluded that its policy is still “too narrow.” As it stands, the social network only forbids explicit praise, support or representation of the terms “white nationalism” and “white separatism”; it has not banned content that promotes the same ideology without using those exact terms.

What this means is that, in spite of the ban, individuals can still get away with subtle references to white supremacy, which will continue to harm victims and other users. As the report itself puts it, “As a result, content that would cause the same harm is permitted to remain on the platform.”

The auditors therefore recommend that Facebook expand the ban to cover any and all references to the racist ideology of white supremacy, along with any form of related hate speech.

In all fairness to the social network, it has acknowledged the recommendation and says it needs to get better not only at removing the wrong content, but also at ensuring that the right content stays up on its platform.

With regard to the audit, Facebook COO Sheryl Sandberg said, “Getting our policies right is just one part of the solution. We also need to get better at enforcement — both in taking down and leaving up the right content.”

It’s a tall order, but that’s the price you have to be willing to pay for being the platform responsible for connecting and informing the masses.
