Gender Bias in AI Recruitment Systems: A Sociological-and Data Science-based Case Study


Abstract:

This paper explores the extent to which gender bias is introduced in the deployment of automation for hiring practices. We use an interdisciplinary methodology to test our hypotheses: observing a human-led recruitment panel and building an explainable algorithmic prototype from the ground up, to quantify gender bias. The key findings of this study are threefold: identifying potential sources of human bias from a recruitment panel’s ranking of CVs; identifying sources of bias from a potential algorithmic pipeline which simulates human decision making; and recommending ways to mitigate bias from both aspects. Our research has provided an innovative research design that combines social science and data science to theorise how automation may introduce bias in hiring practices, and also pinpoint where it is introduced. It also furthers the current scholarship on gender bias in hiring practices by providing key empirical inferences on the factors contributing to bias.
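The paper's aim of quantifying gender bias in a panel's ranking of CVs can be illustrated with a simple selection-rate comparison. The sketch below is not the authors' prototype; it is a minimal, hypothetical example of one common bias measure, the disparate-impact ratio (the "four-fifths rule"), applied to shortlisting decisions. All function names and data are assumptions for illustration.

```python
# Hypothetical sketch: quantifying gender bias in shortlisting decisions
# via selection-rate disparity. Not the paper's actual pipeline.

def selection_rates(decisions):
    """decisions: list of (gender, selected) pairs -> selection rate per gender."""
    totals, picked = {}, {}
    for gender, selected in decisions:
        totals[gender] = totals.get(gender, 0) + 1
        picked[gender] = picked.get(gender, 0) + int(selected)
    return {g: picked[g] / totals[g] for g in totals}

def disparate_impact(decisions, protected="F", reference="M"):
    """Ratio of the protected group's selection rate to the reference group's.
    Values below 0.8 are conventionally flagged as adverse impact."""
    rates = selection_rates(decisions)
    return rates[protected] / rates[reference]

# Illustrative data: 2 of 4 women vs 3 of 4 men shortlisted.
panel = [("F", True), ("F", True), ("F", False), ("F", False),
         ("M", True), ("M", True), ("M", True), ("M", False)]
ratio = disparate_impact(panel)  # 0.5 / 0.75, below the 0.8 threshold
```

A metric like this can be applied both to the human panel's rankings and to each stage of an algorithmic pipeline, which is how one can pinpoint where bias is introduced rather than only whether it exists overall.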
Date of Conference: 10-12 November 2022
Date Added to IEEE Xplore: 28 August 2023
Conference Location: Hong Kong, Hong Kong

I. Introduction

Existing scholarship has long identified gender bias in hiring practices. Conscious and unconscious gender bias influences human decision mechanisms and has harmed the representation of women in the labour force [1]–[5]. Over the past two decades, however, computer-based automated decision-making (ADM) has become increasingly prevalent in recruitment. Because of its supposed pragmatism, automation is often assumed to be more impartial, scientific, and mathematical, and thereby to mitigate the very issue of human bias [6]. ADM has thus emerged as an apparent solution to the growing challenges of recruitment [7], [8]. However, the literature has increasingly recognised the vulnerability of ADM to unfair decisions [7], [9], [10], and has attempted to dissect the issue of fairness along intersecting dimensions of race, gender, ability, sexuality, and others [8], [11]–[19].
