Artificial intelligence is a powerful tool that corporate organizations are increasingly leveraging in their search for talent. Thanks to its processing power and impartiality, AI has the capability to overcome some of the most entrenched and significant challenges in hiring today. That includes removing human bias from the hiring process.
As the Harvard Business Review described in a summary of its article on the subject, “AI holds the greatest promise for eliminating bias in hiring for two primary reasons. It can eliminate unconscious human bias, and it can assess the entire pipeline of candidates rather than forcing time-constrained humans to implement biased processes to shrink the pipeline from the start.”
The true power of AI, however, remains in the hands of humans. Machines must be told what to do, programmed to carry out tasks within parameters set by people. If our elemental hiring processes are tainted by bias, and those biases are left unaccounted for at the initial stages, not even state-of-the-art machine learning can correct them. So how do we go about removing human bias in the hiring process?
How Biases Affect the Hiring Process
To begin forming a solution, we first need to understand not only how AI works on the back end of the hiring process, but also how human bias can bend results starting on the front end. Any number of biases can affect human decision-making – some researchers attribute more than a dozen types of bias to how we ultimately make our choices – but two stand out above the others with regard to hiring:
Affinity bias. People often find themselves drawn to others they can relate to based on certain similarities. “We like people who remind us of ourselves or someone we know and like,” according to Diversity Resources. This phenomenon may be relatively innocuous in our day-to-day social interactions, but it slows the progress of diversity, inclusion, and equity in hiring – and often leads to companies missing out on valuable job candidates.
Not every questionable decision made by a recruiter can be chalked up to racism, gender bias or ageism. Many well-intentioned people are simply moved by a human, subconscious instinct to identify or categorize. But this is the nature of bias, and without accounting for it as a first step in the hiring process, every subsequent step will be tilted by it as a result.
Attribution bias. As humans, we tend to attribute our own accomplishments to self-driven or intrinsic factors such as diligence and intelligence. According to Diversity Resources, “when we assess others, we often think the opposite. We believe their successes are due to luck, and their failures are due to poor capabilities or personal errors.”
Affinity bias can often open the door to, or even amplify, the effects of attribution bias. If a hiring manager sees more of themselves in one job candidate than another, the latter prospect is more likely to be doubted or denied due credit for their achievements.
How Human Biases Are Introduced to AI
So how do these human biases leak into what we may think of as the incorruptible logic of artificial intelligence? Consider an example:
If your company uses AI to cull through an initial wave of candidates for a role, the technology is typically directed to evaluate prospects based on the company’s previous hires. Even (and perhaps especially) when the evaluation parameters are tuned to the company’s “best” employees, bias is introduced into the hiring process. The machine learning model will predict which candidate the company will find most desirable based on historical data points that, as we know, may already have been influenced by human bias.
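To make the dynamic concrete, here is a minimal, hypothetical sketch of “learning from past hires.” The data, attribute names, and scoring rule are all invented for illustration – real resume-screening systems are far more complex – but the principle is the same: a model that scores candidates by their resemblance to historical hires will reproduce whatever patterns, including biased ones, exist in that history.

```python
from collections import Counter

# Invented historical data: past hires skew toward one profile.
past_hires = [
    {"school": "Ivy", "hobby": "golf"},
    {"school": "Ivy", "hobby": "golf"},
    {"school": "Ivy", "hobby": "chess"},
    {"school": "State", "hobby": "golf"},
]

# "Train" by counting how often each attribute appears among past hires.
freq = Counter((k, v) for hire in past_hires for k, v in hire.items())

def similarity_score(candidate: dict) -> int:
    """Higher score = more like past hires - which bakes in any past bias."""
    return sum(freq[(key, value)] for key, value in candidate.items())

a = {"school": "Ivy", "hobby": "golf"}     # matches the historical pattern
b = {"school": "State", "hobby": "chess"}  # a different but equally valid profile
print(similarity_score(a), similarity_score(b))  # a outscores b: 6 vs 2
```

Nothing in this toy model evaluates actual ability; it only rewards resemblance to the past, which is precisely how historical bias propagates forward.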
And that bias isn’t always based on social or cultural factors. You might ask, “As long as candidates of all races, genders, and ages are being considered equally, what’s wrong with a company’s 51st hire resembling the first 50 in the door?” The answer: it’s exceptionally difficult to optimize AI to take a nuanced view of candidates. It’s the machine’s job to parse huge numbers of job applicants based on specific parameters designated by humans. But is a candidate with nine and a half years of experience demonstrably less qualified than one with 10? And isn’t a magna cum laude graduate of a slightly less distinguished school every bit as attractive as an average graduate of an Ivy League university?
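The rigid cutoffs described above can be sketched in a few lines. This is a hypothetical screening filter, not any vendor’s actual algorithm; the field names, school list, and thresholds are invented for illustration.

```python
# Hypothetical resume screen with hard cutoffs; all names and thresholds
# are invented for illustration only.
ELITE_SCHOOLS = {"Harvard", "Yale", "Princeton"}  # simplified stand-in list

def passes_screen(candidate: dict) -> bool:
    """Naive filter: 10+ years of experience AND a degree from an elite school."""
    return (
        candidate["years_experience"] >= 10
        and candidate["school"] in ELITE_SCHOOLS
    )

candidates = [
    {"name": "A", "years_experience": 10, "school": "Harvard", "honors": None},
    # 9.5 years of experience: rejected, though hardly less qualified.
    {"name": "B", "years_experience": 9.5, "school": "Harvard", "honors": None},
    # Magna cum laude at a less famous school: rejected on the school cutoff.
    {"name": "C", "years_experience": 12, "school": "State U",
     "honors": "magna cum laude"},
]

shortlist = [c["name"] for c in candidates if passes_screen(c)]
print(shortlist)  # only "A" survives the screen
```

The hard thresholds do exactly what they were told to do – and in doing so discard the nine-and-a-half-year veteran and the magna cum laude graduate the paragraph above describes.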
When properly programmed, AI can help remove human bias from the hiring process. At the same time, humans can account for some of AI’s limitations. But without the recognition that bias exists, the status quo won’t change. Organizational leaders should understand that correcting for bias isn’t only a moral imperative but a business-critical issue. By settling for an outdated hiring approach, you risk letting your next great hire slip right under your radar.
Looking to learn more about how technology can influence your hiring processes? Learn how our talent cloud is the new standard in working with independent contractors.