ARTIFICIAL INTELLIGENCE AND RECRUITMENT
Leicht-Deobald, U., et al., “The Challenges of Algorithm-Based HR Decision-Making for Personal Integrity,” Journal of Business Ethics, vol. 160, pp. 377–392, June 2019, https://doi.org/10.1007/s10551-019-04204-w.
Drawing on O’Neil’s findings, Leicht-Deobald et al. argue that AI algorithms used in human resource processes such as performance appraisal, recruitment, and evaluation are impaired by racial and gender bias. Machine-learning algorithms are trained on data, and when that data reflects historical employment patterns, such as most managers having been male, the resulting model can infer that women are not interested in management positions and, if used to target job advertisements on social media platforms, hide those advertisements from women. Other research shows racial bias. In facial recognition, when the training data is skewed by external factors, the algorithm’s output is skewed as well: such systems recognize white faces most accurately while misclassifying darker-skinned faces, particularly those of Black women, at far higher rates. Algorithms also tend to reflect the backgrounds of their developers; since most algorithms are created by Caucasian men, their outputs lean toward white people. Such biases undermine the objectivity of HR activities such as recruitment.
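The mechanism the authors describe, historical bias flowing through training data into model predictions, can be illustrated with a minimal sketch. The example below is not from the cited paper: all data is synthetic and the setup is hypothetical. It shows a logistic regression trained on past promotion decisions that skew male, which then learns gender itself as a predictive signal.

```python
# Minimal sketch (synthetic data) of historical bias propagating into a model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
is_male = rng.integers(0, 2, n)      # 1 = male, 0 = female
skill = rng.normal(0, 1, n)          # identically distributed across groups
# Historical labels: skill mattered, but men were promoted far more often.
promoted = (skill + 1.5 * is_male + rng.normal(0, 1, n) > 1.0).astype(int)

X = np.column_stack([skill, is_male])
model = LogisticRegression().fit(X, promoted)
print("coefficient on skill:  %.2f" % model.coef_[0][0])
print("coefficient on gender: %.2f" % model.coef_[0][1])
# The large positive gender coefficient means equally skilled women receive
# systematically lower predicted scores -- the pattern that can hide job
# advertisements from women when such scores drive ad targeting.
```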
Elfenbein, D. W., & Sterling, A. D., “(When) Is Hiring Strategic? Human Capital Acquisition in the Age of Algorithms,” Strategy Science, vol. 3, no. 4, pp. 668–682, December 2018, https://doi.org/10.1287/stsc.2018.0072.
Elfenbein and Sterling cite the case of Amazon, which developed an AI recruiting tool that proved biased against women. The computer models were trained on resumes submitted to Amazon over a ten-year period; because the majority of resumes for software developer jobs in that period came from men, the algorithm learned to prefer male candidates over female ones. It took the company time to realize the system was gender-biased: resumes mentioning women’s activities or all-women’s colleges were penalized. Amazon ultimately abandoned the system even after modifying the algorithm to remove the known bias, fearing it would devise other ways to discriminate against candidates in the future.
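The failure mode Elfenbein and Sterling describe can be reconstructed in miniature. The sketch below is hypothetical: the resumes and hiring labels are synthetic toy data, not Amazon’s. It shows a text-based screener learning that a gendered word predicts rejection, and hints at why patching individual words does not fix the problem.

```python
# Hypothetical reconstruction of a biased resume screener (synthetic data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "software engineer python java",              # historical hire
    "software engineer distributed systems",      # historical hire
    "captain women's chess club python",          # historical rejection
    "women's college graduate java developer",    # historical rejection
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print("weight for 'women':", round(weights["women"], 2))
# The negative weight shows the model has turned a gendered word into a
# rejection signal. Deleting that one word would not help: the model can
# fall back on correlated terms, which is why Amazon scrapped the tool
# rather than continuing to patch it.
```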
Wright, J., & Atkinson, D., “The impact of artificial intelligence within the recruitment industry: Defining a new way of recruiting.”
AI algorithms can help hiring decisions rest on facts rather than emotions or gut feeling. However, Wright and Atkinson note that the same algorithms can discriminate. For instance, if historical data show that former football quarterbacks achieved higher sales performance, apparently because of good decision-making and leadership, an algorithm trained on that data will favor quarterbacks and thereby screen women out of recruitment, since women cannot play on American football teams. Without careful analysis, such bias is hard for humans to spot. Automation removes a great deal of human bias from a process, but in AI, social factors embedded in the training data can reintroduce a great deal of prejudice.
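The quarterback example is a proxy problem: even when gender is excluded from the model, a feature structurally closed to one group carries the bias anyway. The sketch below is hypothetical, with synthetic data and invented feature names, but it illustrates why this form of discrimination is hard to spot by inspecting the feature list alone.

```python
# Hypothetical sketch of proxy discrimination (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
is_female = rng.integers(0, 2, n)
# Only men can have played quarterback in American football.
played_qb = np.where(is_female == 1, 0, rng.integers(0, 2, n))
aptitude = rng.normal(0, 1, n)
# Suppose quarterbacks were over-represented among past top sellers, so QB
# status correlates with success in the historical labels.
top_seller = (aptitude + 0.8 * played_qb + rng.normal(0, 1, n) > 0.8).astype(int)

X = np.column_stack([aptitude, played_qb])   # note: gender is NOT a feature
model = LogisticRegression().fit(X, top_seller)
scores = model.predict_proba(X)[:, 1]
print("mean predicted score, men:   %.2f" % scores[is_female == 0].mean())
print("mean predicted score, women: %.2f" % scores[is_female == 1].mean())
# Women score lower on average despite identical aptitude distributions,
# because the quarterback proxy is unavailable to them by construction.
```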
References
Elfenbein, D. W., & Sterling, A. D., “(When) Is Hiring Strategic? Human Capital Acquisition in the Age of Algorithms,” Strategy Science, vol. 3, no. 4, pp. 668–682, December 2018, https://doi.org/10.1287/stsc.2018.0072.
Leicht-Deobald, U., et al., “The Challenges of Algorithm-Based HR Decision-Making for Personal Integrity,” Journal of Business Ethics, vol. 160, pp. 377–392, June 2019, https://doi.org/10.1007/s10551-019-04204-w.
Wright, J., & Atkinson, D., “The impact of artificial intelligence within the recruitment industry: Defining a new way of recruiting.”