Problem #3: Systemic Bias

As Wikipedia developed, it became increasingly apparent that the project had an immense problem with systemic bias. While the WikiProject Countering Systemic Bias dates back to 2004, attention to this problem built momentum in Wikipedia's second decade, resulting in increased efforts to analyze and address it. Again, it is worth pausing on this problem for a moment since it, too, is entangled in the editing problems we want to discuss.

As an internal essay on systemic bias documents, the heart of the problem is that the average Wikipedian is not at all representative of the larger population, even within particular language groups. Not everyone has the time, interest, or access needed to edit Wikipedia, so a portion of the bias emerges from simple selection bias. Although no one explanation should or can account for this bias, the fact that a majority of its editors, particularly early in its lifespan, have been white, educated, technically inclined men from developed nations has led to discernible and statistically visible biases in terms of both the amount and the focus of content. Generally, this bias falls into three categories, each of which, as of 2022, has its own page outlining the issues: gender bias, racial bias, and geographical bias. Research on and efforts to address these issues have been increasing rapidly, both within and outside Wikipedia, such that this constitutes the third major point of attention for course correcting Wikipedia.[8]

We want to note here a few developments of interest for this third problem. The first is that identifying and addressing systemic bias is not straightforward and occasionally creates tension with the other two problems. Procedures meant to ensure the quality of content, especially notability, can exacerbate problems with bias, resulting in significant pushback against using a standard like notability to filter out certain articles (Peake, 2015). As the community has embraced the need to address these problems, it has experimented with explicit procedures and policies for ameliorating the bias that policies may create, sometimes with unintended side effects. For instance, one way of addressing geographical bias is to edit from a worldwide perspective. However, this goal sometimes leads to articles organized around how various countries view a topic, which may not be the most coherent way to organize an article (more on this in Part 1).

These issues of systemic bias have also created interest in activist-style editing, in a positive sense: people entering the Wikipedia community with the explicit purpose of addressing these biases.[9] They have also cultivated a new generation of Wikipedia assignments in college courses related to underrepresented topics. These new editors entered the space with the goal of creating and enhancing articles not sufficiently represented on Wikipedia, producing an infusion of interest in Wikipedia editing. While this has been an essential and positive development, it occasionally leads to articles and editing that are justified on the grounds of addressing systemic bias when additional attention needs to be paid to their quality. In the process of teaching and examining stalled articles, we came across a noticeable number of articles of this sort: articles written by classes or through other activist-style editing that need significant improvement. Some of these, especially essay-like articles, need substantial developmental editing to ensure that work addressing systemic bias is of appropriate quality. In the long run, the hope is that attention to these writing issues will make activist-style editing more sustainable by helping these articles become as high in quality as any other.


[8]This issue of bias has been a primary focus of scholarship on Wikipedia in its second decade. For example, see Gruwell (2015), Lemieux et al. (2023), Roy et al. (2022), Nelson and Jacobs (2017), Pratesi et al. (2019), Vetter (2018), and Xing and Vetter (2020).

[9]This was particularly the case among women's studies programs, as reported in The Chronicle of Higher Education (Kerr, 2018). Other edit-a-thons have been organized around racial and global problems, such as those noted by William Beutler (2020). One illustrative example: Alexandria Lockett and Jamila Lyn have organized edit-a-thons and advocated for both of these areas, work documented at their outreach dashboard site (Atlanta Spelman ArtAndFeminism 2017, 2017).