LWL #18 Algorithmic Bias and the Gender Gap


A recent article by Wang et al. studies gender trends in computer science authorship, revealing that, "if current trends continue, parity between the number of male and female authors will not be reached in this century". Although the proportion of female authorship over time captures only one of the many facets of representation in academia, quantifying gender inequality in computer science furthers the debate on whether a discipline that has traditionally been white and male is imbuing its systems and algorithms with the biases of the workforce at large.
How does the lack of gender representation in academia, particularly in computer science and its related disciplines, affect the systems, machines and algorithms designed to represent, measure and study human behavior? Does the lack of gender parity play a role in introducing gender biases into machine learning and AI? In this week's Links We Like we feature recent articles that tackle this continuing debate, introducing the question of gender parity and its relationship to algorithmic bias.

Karen Hao, artificial intelligence reporter for MIT Technology Review, explains how AI technology has "automated the biases of its creators to alarming effect". This comes as no surprise, as she notes that "only 18% of authors at leading AI conferences, 20% of AI professorships, and 15% and 10% of research staff at Facebook and Google, respectively" are women. But fixing the gender gap in the AI industry is not only about improving workplace diversity. Hao interviewed Jessie Daniels, a researcher for Data & Society, a research institute based in New York City. Daniels noted that "the tech industry was fundamentally built on the ethos that technology exists independently of society", and thus neither race nor gender was thought to be relevant in cyberspace. To this day, she adds, the industry has built on the idea that tech products are designed and exist "independently of the sexism, racism, and societal context around them". To read Hao's article and Daniels's interview, click here.

AI as the equalizer?

Catalyst's "Gender Bias in AI" brief homes in on the difference between assisted, augmented, and autonomous intelligence to discuss how hiring and talent management systems relying on AI may have the potential to "move the needle on gender equality in workplaces" by applying more objective criteria than decisions made without these systems. Some examples include hiring tools that assess applicants based on specific data, skills, and abilities; others filter a candidate's appearance and voice during the interview process to reduce gender bias. Yet AI as an equalizer remains an aspirational objective: the "unconscious biases" that individuals hold often translate into "unintentional biases" trained into the machines. To read more on how gender bias is built into AI, click here.

Further Afield

Dive deeper into gender biases and AI:

Interested in learning more?

Sign up for our newsletter here.
