Advertisers and users are upset at inadvertent tolerance of abuse of women on the site. So why isn't Facebook taking more action?
Last week, Laura Bates, of the Everyday Sexism Project, tweeted a message to Finnair, asking the company if it knew its advertising appeared on a Facebook page endorsing violence against women. The company responded:
"This is totally against our values and policies."
Several other angry advertisers felt the same way.
I doubt that most advertisers are aware of how regularly this situation occurs. For example, this morning, a Duracell battery ad is visible on a group page called "I kill bitches like you"; Sexy Arab Girls ("join our page for more porn videos") was sponsored by the Wilberforce Dinner "Honoring Cardinal Timothy Dolan"; and the now-removed page "Domestic Violence: Don't Make Me Tell You Twice", populated by photos of women beaten, bruised and bleeding, was the platform for Vistaprint.
"We occasionally see people post distasteful or crude content. While it may be vulgar and offensive, distasteful content on its own does not violate our policies," a Facebook spokesperson explained, when I asked what Facebook's response to similar pages is.
"However, there is no place on Facebook for content that is hateful, threatening, or incites violence, and we will not tolerate material deemed to be genuinely or directly harmful."
Facebook has detailed procedures for handling complaints and clearly states that user safety is a company priority. Given the astounding volume of people and content that Facebook deals with (more than 1 billion users), the company only acts when content is reported. The issue is, therefore, how words like "hateful" and "genuinely harmful" are defined, and, importantly, whether or not men and women understand "safety" differently.
Facebook moderators deal with real instances of violence and crime every day. "Not real" content depicting rape and the physical abuse of girls and women is often categorized by Facebook as [Humor] and readily viewed. Recent examples include a photograph of a man carrying a limp girl with the caption, "Rohyphnol: When Traditional Dating Methods Just Aren't Cutting it!" and the page "I Love the Rape Van". The company tries to address complaints within 72 hours, but pages like "Raping Babies Because You're Fucking Fearless" can remain up for more than a month.
To the founders of Rapebook, a page started last fall to "tackle misogyny on Facebook by sharing and reporting pages", content trivializing sexualized and domestic abuse is intrinsically hateful and harmful. Immediately, the page became the target of massive trolling, and administrators were threatened with violent rape and death and bombarded with graphic images and porn. Posts, such as one urging people to donate to an anti-violence campaign at Amnesty International, generated more than 100 comments, including "fuck that. hit that hoe (sic)" and "Domestic violence is a 2 way street you hypocritical cunt." This suggests hostility, which might provoke anxiety and create an environment that does not feel safe to the average woman. Studies show that content like this is triggering and degrades the ability of consumers of the content to empathize with victims.
When I spoke to Facebook representatives, they responded quickly and were forthcoming about their policies. Guidelines are clear about harassment, bullying and hate speech – which is why this problem is not about constraining people's "free speech". It's about how mainstream misogynistic norms are embedded, not only in Facebook's interpretations of "free speech", "safety", "humor" and "credible threats", but in the very way their review process is structured.
First, what is notable about cases like Rapebook co-founder Trista Hendren's is the comfort and speed with which opponents resort to violent rape and death threats using misogynistic language. Facebook's guidelines prohibit hate speech, even though hate speech is, in fact, protected in the US by the first amendment. Users comfortable with denigrating women manipulate a review process that does not recognize sex-based hate speech and is not set up to consider context. Specifically, Facebook has no reporting mechanism for considering how a hostile environment (treating rape and violence against women literally as a joke or ignoring content that is viscerally threatening) might affect its female users.
Second, what people like Hendren are protesting is not easily mocked hurt feelings, but systemically tolerated hate, degradation, objectification and marginalization of girls and women, behind which loiters actual violence. Women, acculturated to a world where one in three women will be sexually assaulted (in the US, that number is one in five; for men, one in 77), cannot separate this reality from their online experiences. Domestic violence statistics reflect a similar epidemic. The vast majority of perpetrators in either case are men. This dynamic is reflected in online misogyny.
"At first, people started posting pictures of women and young girls being raped or beat up and commenting on the page saying things like, "I will skull-f**k your children," explains Hendren. "Then the harassment moved offline after our personal information was posted all over Facebook. I was called and emailed repeatedly. Later my address and children's names were posted as well."
Although Facebook representatives may have done their best to work closely with Rapebook, the administrators closed the page after months of receiving up to 500 messages a day, including photographs of actual rapes and child pornography. Hendren's photo was used to create rape memes. She has left Facebook. It's important to note that people who supported Rapebook's efforts were unwilling to publicly show their support on Facebook, for fear of similar targeting.
How is this not a loss of free speech for these users (overwhelmingly women), resulting from bullying, harassment and misogyny? The people left feeling comfortable at Facebook are rape apologists and those who create content glorifying the debasement of women.
A common retort to all of this is: "This is the internet. It's offensive. If you don't like it, leave." That is correct: speech on the internet can be offensive and the right to be offensive is vital to democracy. But Facebook is not "the internet". Facebook is a company with principles and community standards that create a reasonable expectation in users that it will enforce rules it itself has established in an unbiased manner. Facebook is perilously close to allowing "freedom of speech" to be used as a defense of unjust actions that are clearly intimidating and silencing female users.
If Facebook is already considering these issues, they aren't sharing that fact. Last week, a new page was created on Facebook, Sheryl Sandberg LEAN-In And Remove Misogyny from FB. According to Bates, Facebook is resolute in not responding to the Everyday Sexism campaign. Advertisers are, however, and as the saying goes, money talks. But is it too much to hope for that Facebook's famously pro-woman chief operating officer might do something about Facebook's misogyny problem before she is simply forced to act because it's hurting the company's bottom line?
The Guardian: Facebook's big misogyny problem by Soraya Chemaly