A study published Tuesday in the Journal of Management Information Systems examined gender bias in Wikipedia articles. Comparing Wikipedia’s treatment of 50 female Fortune 1000 CEOs with that of comparable groups of male CEOs, the study found the men were 22 percent less likely than the female executives to have articles on the site, and that the women’s articles were generally of higher quality as well. The researchers concluded this demonstrates activist editors overcorrecting for the site’s reported gender bias, producing a bias towards women, and suggested similar overcorrection could arise in other communities.
Claims of a gender gap in contributors to Wikipedia and a resulting bias in content towards men have prompted numerous efforts by left-wing feminist groups to increase participation on the site and force changes to policy.
In the study “The Gender Bias Tug-of-War in a Co-creation Community: Core-Periphery Tension on Wikipedia,” authored by Amber Young, Ari Wigdor, and Gerald C. Kane and published on December 1, the researchers noted that past studies of gender bias on Wikipedia claimed a bias towards creating more and better articles about men, consistent with a reported gender gap in contributors to the site. While corporate media have praised Wikipedia as a counter to “fake news” online, in keeping with messaging suggested by a public relations firm belonging to the Clinton Foundation’s Head of Communications, the alleged gender gap and resulting bias have prompted significant criticism.
One effect of this external media pressure has been an increase in activist efforts to create and improve content about women on the site, including through “edit-a-thons” run by left-wing feminist organizations. The study’s authors sought to determine how this increased focus affected bias in the site’s content.
For the study, the researchers used the Fortune 1000 list of CEOs and compared how 50 female CEOs were covered on Wikipedia relative to four sample groups of 50 male CEOs each. The sample groups consisted of male CEOs whose firms were ranked one place higher, male CEOs whose firms were ranked one place lower, male CEOs at similarly ranked firms in the same industries, and the male predecessors of the female CEOs. The researchers then pulled the text and metadata of the articles, along with the pages’ historical revisions, using a public Wikipedia tool.
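The article does not name the public tool the researchers used, so as a rough sketch of how such data can be gathered, the standard MediaWiki API exposes both article text and revision metadata. A minimal Python sketch follows; the function names and parameters are hypothetical, not the study’s actual pipeline:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"  # standard MediaWiki API endpoint

def fetch_extract(title):
    """Return the plain-text extract of an article, or None if no page exists."""
    r = requests.get(API, params={
        "action": "query",
        "prop": "extracts",
        "explaintext": 1,
        "titles": title,
        "format": "json",
    })
    page = next(iter(r.json()["query"]["pages"].values()))
    return page.get("extract")  # the key is absent for missing pages

def fetch_revisions(title, limit=500):
    """Return revision metadata (user, timestamp, size) for an article."""
    r = requests.get(API, params={
        "action": "query",
        "prop": "revisions",
        "rvprop": "user|timestamp|size",
        "rvlimit": limit,  # full histories require paging with rvcontinue
        "titles": title,
        "format": "json",
    })
    page = next(iter(r.json()["query"]["pages"].values()))
    return page.get("revisions", [])
```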
In examining potential bias, the researchers used several criteria. First, they examined “selection bias” by determining how many CEOs in each group had Wikipedia pages created about them. Second, they examined “source bias” by determining the number and quality of sources used in the pages that did exist. Third, they examined “influence disparities” by determining how often and how much editors contributed to pages in each sample. Lastly, they examined “content bias” using a research method that scores how subjects are portrayed on four language metrics.
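As a concrete illustration, the first criterion reduces to comparing the share of each sample that has an existing article. A minimal sketch, reusing the hypothetical fetch_extract helper above; the sample lists are placeholders, not the study’s data:

```python
def article_rate(titles):
    """Share of a CEO sample that has an existing Wikipedia article."""
    return sum(fetch_extract(t) is not None for t in titles) / len(titles)

# Hypothetical usage with placeholder lists of 50 names each:
# rate_female = article_rate(female_ceos)
# rate_male = article_rate(male_ceos_rank_above)
# relative gap: men are (1 - rate_male / rate_female) less likely to have a page
```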
Regarding the “selection bias” criterion, the study determined male CEOs in comparable positions to female CEOs were 22 percent less likely to have Wikipedia pages, confirming a bias towards women. On the “source bias” criterion, the study found pages on female CEOs generally had more sources, and of higher quality, again confirming a bias towards women. The study’s findings on the “influence disparities” criterion likewise showed a bias towards women, with female CEO pages receiving more edits, more often, than those of their male counterparts.
When examining the “content bias” criterion, the study found mixed results on two of the four language metrics it used, while the remaining two showed results more biased in favor of the women’s articles. One of these was “clout,” where a higher score indicated the articles on women spoke with “higher expertise and confidence” than articles on their male counterparts. The other was “authenticity,” where a lower score indicated articles on male CEOs were generally more honest and subdued, while the female CEO articles contained more flattering and embellished content, according to the study. Results for every form of bias were consistent with the study’s expectations.
Beyond identifying an increasing bias towards women on Wikipedia, the researchers also sought to determine how these shifts emerged by examining page edit histories. They divided editors into two groups: “core” editors, who had made nine or more edits, and “peripheral” editors, who had made fewer than nine. The researchers then examined the revisions of these groups according to three of the four bias criteria used to examine article bias, excluding “content bias” due to the difficulty of identifying such bias in individual edits.
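The nine-edit threshold makes the core/peripheral split simple to reproduce from revision histories. A minimal sketch using the hypothetical fetch_revisions helper above; it counts edits across all sampled pages, which is an assumption, since the article does not say whether the study applied the threshold per page or corpus-wide:

```python
from collections import Counter

CORE_THRESHOLD = 9  # per the study: nine or more edits marks a "core" editor

def classify_editors(all_revisions):
    """Split editors into core and peripheral sets by total edit count."""
    counts = Counter(rev["user"] for rev in all_revisions if "user" in rev)
    core = {user for user, n in counts.items() if n >= CORE_THRESHOLD}
    peripheral = set(counts) - core
    return core, peripheral
```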
The researchers found that the “core” group of editors was the primary force behind the increasing bias towards women. Reviewing two time periods, 2003 to 2011 and 2011 to 2016, the authors further examined how the “core” group’s influence shifted over time and found it became especially pronounced in the more recent period.
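Measuring that shift amounts to bucketing edits by time period and computing the core group’s share of each bucket. A minimal sketch under the same hypothetical helpers, with the study’s two windows used purely as an illustration:

```python
from datetime import datetime

def core_share(revisions, core, start, end):
    """Core editors' share of edits with timestamps in [start, end)."""
    def ts(rev):
        # MediaWiki timestamps look like "2011-05-01T12:00:00Z"
        return datetime.fromisoformat(rev["timestamp"].rstrip("Z"))
    window = [r for r in revisions if start <= ts(r) < end]
    if not window:
        return 0.0
    return sum(r.get("user") in core for r in window) / len(window)

# Hypothetical usage mirroring the study's two periods:
# early = core_share(revs, core, datetime(2003, 1, 1), datetime(2011, 1, 1))
# late = core_share(revs, core, datetime(2011, 1, 1), datetime(2017, 1, 1))
```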
Discussing the study’s findings, the researchers argued the “peripheral” group’s early predominance contributed to the previously observed bias towards men, but that growing involvement by the “core” group of editors gradually shifted the bias towards women, an overcorrection by activist editors seeking to address the previously identified bias towards men. The authors suggested such a phenomenon could occur in other information-focused communities, such as the open-source community and bloggers, and even in general society and politics. The study makes several recommendations for addressing such issues on Wikipedia, including regular monitoring for systemic bias and automated bias-detection methods to alert users.
Key among the study’s findings is that an increasing bias towards women on Wikipedia goes against traditional expectations that a community’s majority group will tend to bias content towards itself. Notably, the observed shift in bias towards women, attributed to the more active “core” group of editors, came in recent years as significant attention was brought to an alleged gender bias in Wikipedia content. Part of this attention was tied to a false narrative about the dispute over articles on the GamerGate anticorruption movement in gaming, a narrative that painted the movement as misogynists harassing women the movement had tied to corrupt practices in the industry. Aside from seeking to create or expand articles about women, efforts to address gender bias have included changes to site policies.
This year, the Wikimedia Foundation, which owns Wikipedia, announced it was imposing a “universal code of conduct” on the site, citing diversity issues such as the alleged gender gap as a major factor in the decision. Such efforts have also extended to race, with the Foundation endorsing the Black Lives Matter movement earlier this year, declaring there is “no neutral stance” on racial justice, and citing the code of conduct as part of its efforts to address “equity” on the site. When the Foundation revealed its proposal for the code of conduct, it included many provisions advancing left-wing identity politics, such as requiring use of “preferred pronouns” by users.
Previous studies and analyses of Wikipedia have also shown the site to have a general left-wing bias, which the site’s own co-founder criticized after examining several major articles. Expressed concerns about “diversity” on Wikipedia have aggravated this bias: earlier this year, activist editors formed a Black Lives Matter group to help push the movement’s agenda on the site, and editors banned expressions of support for traditional heterosexual marriage on profile pages, causing an outcry from Christian and family organizations. Despite significant evidence of bias, media, academia, and Big Tech still rely on Wikipedia for information.
T. D. Adler edited Wikipedia as The Devil’s Advocate. He was banned after privately reporting conflict of interest editing by one of the site’s administrators. Due to previous witch-hunts led by mainstream Wikipedians against their critics, Adler writes under an alias.