Inclusive AI in hiring: How should employers debias hiring algorithms? Experimental evidence on behavioral responses to different debiasing methods
Ip, E
Date: 2025
Type: Article
Journal: Behavioral Science & Policy
Publisher: SAGE Publications / Behavioral Science & Policy Association
Abstract
Hiring algorithms are increasingly popular among employers, but they are highly susceptible to biases in male-dominated environments. Debiasing hiring algorithms represents a vital solution; however, it is unclear how algorithms should be debiased. Using mock labor-market experiments, this study investigates which debiasing method leads to the most desirable gender diversity and inclusion outcomes for employers. We study how decisions to apply for a competitive job differ depending on the algorithm used to evaluate applications. We find that algorithms that ensure an equal chance of success for men and women attract the most female applicants and should be considered by employers wishing to increase diversity in hiring, whereas gender-blind algorithms are perceived to be the fairest and should be considered by employers wishing to improve inclusivity.
Economics
Faculty of Environment, Science and Economy