Solving gendered harassment: a feminist framework for creating safety on Wikipedia

By Aishwarya Vardhana

Introduction

My product team, Trust and Safety Tools, has been tasked with building a system for people to safely report harassment on Wikipedia.1 As a woman, my instinct was first to ask: how does harassment affect women on the platform? What about other people who are targeted based on their gender identity? Knowing that the internet is rampant with sexist, homophobic, and transphobic rhetoric, what could our team do to make a small difference in this complex space?

Understanding the problem

I’m a designer and I follow a design process, so my first question is: how well do we understand the problem of gender-based harassment? Who experiences it, and what form does the harassment take? What do targets of gendered harassment do about it? What do these “users” (to use product-speak) need? What are their goals and motivations? If we were to create personas for them, what would they be?

To answer these questions we, of course, need to do user research. However, for this research to be properly contextualized, it needs to be interpreted through a feminist lens. Here we push up against the limitations of the individual designer. To design for complex sociocultural scenarios, I believe a designer needs to be fluent in sociology and the disciplines it spawns, such as the study of race, gender, class, and religion. You see, when analyzed in a vacuum, gendered harassment is easily misunderstood as “bad actors misbehaving”. However, if one understands the mechanics of patriarchy, one can see that gendered harassment is the byproduct of a misogynistic culture. I, as the designer, don’t need to be an expert in sociology, but I do need to know whom to talk to in order to appropriately understand the problem space. As I proceed with this project, I will be doing exactly that!

A disproportionate number of editors who experience harassment on Wikipedia are women and/or identify as LGBTQIA+. They are vulnerable to harassment in the form of name-calling, discrimination, stalking, trolling/flaming, and threats of violence.2

First, we need to understand and accept that we have a cultural problem on our hands, and cultural problems are not easy to fix. They require intervention and prevention, such as inclusion training, protections for the vulnerable, accountability for those who inflict harm, and strong, egalitarian values embodied by leadership. So, before my team begins this work, I want to be clear: no single product can solve gendered harassment.

However, solving gendered harassment can and should be prioritized rather than treated as an afterthought, which is why I’m writing this blog post and would like to drive this work forward.

Why this matters

Solving gendered harassment is not only a moral imperative; it is central to the mission of the Wikimedia Foundation. When editors are harassed, they stop contributing, and we will never reach our vision of free knowledge without equitable participation from folks of all genders.

To quote Professor Jessica Wade, herself a Wikimedian and a voice for diversity in science, from her TEDxWomen London 2018 talk: “…the majority of history has been written by men, about men, for other men … and the majority of content on Wikipedia is written by white men in America.” Professor Wade and many other community members are working to recruit more women to edit Wikipedia, but we must also ask: why aren’t more women organically evangelizing or joining Wikipedia? Once they join, what is their experience like? Do they stay? Do they encourage more women to join? Why or why not? Is the editing community an inviting, welcoming, and friendly space for women?

Simply put, it is not enough to acquire women who edit; we must retain them.

The community’s efforts

For many years, various Wikipedia community members have asked for improvements to reporting systems, specifically to protect women and LGBTQIA+ editors. Daniellagreen proposed a way to end sexism and intolerance on Wikipedia. Wikimedian-in-residence blueraspberry sketched the basics of a centralized harassment reporting and referral service. User beauxlieux centered ‘apologies’ in their suggestion for how to make Wikipedia a kinder and more civil place. Editors QEDK and Djembayz teamed up to propose a whole suite of reporting tools and best practices specifically for addressing gender concerns, such as “have moderators who are trained to understand the reality of women’s safety needs”. One of my favorites is by Wikimedian and sociologist radfordj, who drew inspiration from the U.S. women’s rights movement of the 1960s and 70s and proposed a consciousness-raising-style repository to document and elevate the stories of traditionally marginalized groups, in the hope that this data can be used to understand and mitigate marginalization.

Online gender-based harassment is not a new phenomenon or a simple problem to fix. A few infamous examples include “Harassment via Wikipedia Vandalism” (Feminist Frequency), “Women and Wikipedia: Perspectives on the gender gap, harassment, and the Gamergate controversy”, “Quora’s misogyny problem: A cautionary tale” (ZDNET), and a woman volunteer bringing a harassment issue straight to Jimmy Wales.

Demonstrating the use of a feminist lens

In this blog post, I want to demonstrate how to critically examine the existing ecosystem on Wikipedia through a feminist lens.

English Wikipedia has five foundational principles. 3 One of these principles is about conduct, and the two values that underpin it are “civility” and “assume good faith”.

For the sake of analysis, let us assume a new, woman-identifying editor is harassed by a veteran male editor and searches for resources on Wikipedia. Below are the first instructions she is given; parts of them are, in my view, counterproductive to our shared goal of supporting targets of harassment.

When dealing with incivility:

  1. First of all, consider whether you and the other editor may simply have misunderstood each other. Clarify, and ask for clarification.

  2. Consider the possibility that something you said or did wrongly provoked a defensive, irritated or fed-up response. Be prepared to apologize for anything which you could/should have done better. (If an awful lot of people seem to be getting frustrated with you, the problem may be with you.)

  3. Even if you’re offended, be as calm and reasonable as possible in your response. Until there is clear evidence to the contrary, assume that the offense was unintended.

In a case of ongoing incivility, first decide if anything needs to be done. Confronting someone over a minor incident – particularly if it turns out that you misinterpreted what they meant – may produce more stress and drama than the incident itself. Consider your own behavior, and, if you find you have been uncivil, apologize to them instead. (civility documentation)

Steps 1-3 dismiss the target’s emotions and undermine her judgment. One could go so far as to argue that these instructions gaslight the target by first asking what she did wrong. 4

The second core value is to “assume good faith”. 5 Asking victims of harassment to assume good faith, when trust has been broken, places an unfair burden on them. This can lead to an inequitable distribution of emotional labor and a higher rate of burnout. Other advice we give is to “first and foremost act calmly (even if difficult)”, to “simply ignore it”, and to stay cool when the editing gets hot.

I understand that our current policies are intended to be helpful. Wikipedia is written and run by volunteers with limited time, and these policies attempt to save volunteers time and emotional labor by asking people to resolve conflicts themselves. However, policies that don’t serve victims of gendered harassment ultimately create more work for the responder.

Using the inclusive product development playbook

Changing policies is out of scope for my team, Trust and Safety Tools. We are focused on designing a minimum viable product (MVP) for how someone will report harassment and what the responder will see on the other end. I am advocating that we prioritize solving for gendered harassment, and for reporters who are women or LGBTQIA+. With the rollout of the inclusive product development playbook, we have the tools and support to approach an old, systemic issue in a new way. 6
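
To make that scope a little more concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of structured information a reporting MVP might capture from a reporter and hand to a responder. The names below, such as IncidentReport, HarassmentType, and keep_reporter_private, are illustrative assumptions of mine, not a decided design or our team’s actual data model.

    # Hypothetical sketch only; not an actual data model or API.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from enum import Enum
    from typing import List


    class HarassmentType(Enum):
        # Illustrative categories drawn from the harms named earlier in this post.
        NAME_CALLING = "name-calling"
        DISCRIMINATION = "discrimination"
        STALKING = "stalking"
        TROLLING_FLAMING = "trolling/flaming"
        THREAT_OF_VIOLENCE = "threat of violence"
        OTHER = "other"


    @dataclass
    class IncidentReport:
        # What a reporter might submit and what a responder might see.
        reporter: str                       # username of the person filing the report
        reported_user: str                  # username of the person being reported
        harassment_types: List[HarassmentType]
        description: str                    # the reporter's account, in their own words
        evidence_links: List[str] = field(default_factory=list)   # diffs, talk-page permalinks
        keep_reporter_private: bool = True  # default to protecting the reporter's identity
        created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


    # Example: a newcomer reports ongoing harassment on her talk page.
    report = IncidentReport(
        reporter="NewEditor123",
        reported_user="VeteranEditorX",
        harassment_types=[HarassmentType.NAME_CALLING, HarassmentType.TROLLING_FLAMING],
        description="Repeated belittling comments on my talk page after my first edits.",
        evidence_links=["https://example.org/diff-permalink"],
    )

Even a throwaway sketch like this surfaces the questions a feminist lens makes urgent: is the reporter’s identity private by default, and do the categories reflect the harms women and LGBTQIA+ editors actually experience? Those answers should come from user research, not from a designer’s assumptions.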

Our team is in the first phase of product development, also known as the STRATEGIZE phase. I’ve read through the playbook and grouped together a set of tasks pertaining to our audience:

Establish a baseline of current users and use cases, and opportunities for equity based on your context. Include existing research (internal and external) of the problem space you want to address.

Be intentional and clear about who you will empower and engage with (have a clear why), and what that engagement will look like (target audience). If your answer is everyone, consider thinking about who you will engage with first for learning and then scale.

Establish “Who are we leaving out?” and be clear if you will include those that are being left out in this iteration or in the future based on your baseline understanding of “established” audiences and “growth” audiences.

When establishing objectives and key results, be specific about who you are serving, what assumptions you are making and how you will validate or challenge those assumptions.

Clearly define what barriers and/or knowledge gaps you are aiming to address as well as what new opportunities you are planning to create for your target audience and why your team is uniquely positioned to do so.

Our challenge is to understand the pain points of women and LGBTQIA+ folks, because a Wikipedia that is safer for the most vulnerable is safer for all. Within our target audience, we should focus on those who are the least well-connected and influential within the community, i.e., newcomers. We will need to conduct user interviews to better understand who we are building for, and speak with responders who have helped resolve or have witnessed gendered harassment. Given how complex and expansive this project is, it will be necessary to define who we are not designing for as well.

Establish what partners (internal and external) need to be brought into each phase and give them advance notice of the goal you intend to accomplish. Advance notice should occur as soon as you are aware you will need the team’s support, ideally at least a quarter in advance, if not sooner.

Some potential partners include mission-aligned organizations such as Geek Feminism Wiki, Black Girls Code, Pollicy, Feminist Internet, Ada Initiative, Fundación Karisma, Cyber Civil Rights Initiative, GenderIT, APC, The Bachchao Project, CLEAR, and HeartMob. Internally, a few potential partners include the Global DEI team, the Trust and Safety legal department, and the inclusive product development working group.

Anti-harassment work is inclusion work

It is easy to attribute the root cause of harassment to individual personalities, e.g., “so-and-so is just a bully”. Harassment can indeed be random and isolated, but it is frequently about power. When women or LGBTQIA+ folks are targeted because of their gender identity, it can be with the intention of systematically driving them away. This form of harassment is not interpersonal; it is ideological. If we didn’t view harassment in this way, a feminist way, we would be missing a very important insight about the problem space. 7 According to Valerie Aurora, executive director of the Ada Initiative, anti-harassment work is part of a successful diversity initiative. 8

If we wish to retain and cultivate gender diversity on Wikipedia, we must design and build a reporting system that addresses the nuanced needs of non-male editors. This blog post is an analysis, a call to action, and just the beginning.

Get in touch

Please email me at avardhana@wikimedia.org if you’d like to discuss this work further.

Thanks to Selene Yang, Bethany Schowengerdt, Lucy Blackwell, and Carolyn Li-Madeo for their thoughtful edits.