Breaking Down Bias: Mitigating the Pitfalls of A.I. in Tech Hiring


Since the start of the pandemic, 86% of businesses have moved their interview processes online. The extent of digital hiring transformation varies by company. For many, remote interviews are still challenging. But for others, the transformation has opened the door to introducing more technology to help keep up with the surging demand for software engineers. This includes an increase in artificial intelligence (A.I.) and other automated tools to identify the best job candidates.

While some people believe that using A.I. can reduce hiring bias by eliminating the human element, this is often not the case. Most A.I. tools on the market today are more accurately described as machine learning: algorithms programmed to find patterns in historical hiring data, which then inform future hiring decisions.

The problem is that using historical data to predict the future reinforces the status quo. Defining the “best” candidates as those most similar to past employees is essentially digital redlining.

Beware of pedigree bias

If a company has a history of hiring only a certain type of individual for engineering positions (i.e., White or Asian males), the A.I. tool will learn to prioritize candidates whose profiles resemble the company's current employees. This injects pedigree bias into the hiring process.

Automated pedigree bias begins excluding qualified individuals from nontraditional backgrounds at the earliest stage of the recruiting process—the application screen. According to our data from nearly 100,000 technical interviews, on average, fewer than 10% of direct applicants progress to the interview stage. This process filters out candidates without the education or past employment pedigree recruiters are looking for.

We often hear that when it comes to DEI, "sunlight is the best disinfectant." But layering A.I. on top of today's resume screens will not only exacerbate the pedigree bias problem, it will also create a black box around the vetting process by obscuring the bias. The algorithm will keep selecting candidates based on pedigree proxies, i.e., candidates who attended a top university or have experience at a Big Five tech company. This artificially shrinks your candidate pool and your pipeline diversity.

One way to mitigate pedigree bias is to give more interviews to direct applicants. By analyzing direct applicant success rates from our technical interviews, we discovered that direct applicants are being over-screened compared to other talent sources. This over-screening, which is being exacerbated by algorithms, is bad for two reasons.

First, unless your organization has a robust talent pipeline coming from historically Black colleges and universities (HBCUs) and Hispanic-serving institutions (HSIs), direct applications are almost always the most diverse talent pool available. Over-screening this source based on pedigree will make it much harder to achieve meaningful DEI goals.

Second, you're creating a more expensive pipeline by over-indexing on proactively recruited candidates. Direct applicants close (i.e., accept offers) at a higher rate than their proactively sourced counterparts, are less likely to receive counter-offers, and spend less time in the hiring process. This means that by over-screening direct applicants with A.I., you are not only likely to constrain diversity, you are also making each hire slower and more expensive.

If you have a 10% application-to-tech-screen rate, try pushing it to 15% or even 20%. If you’re still seeing success, keep expanding it to drive better overall hiring efficiency and equity.
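As a rough illustration of how you might monitor that rate, here is a minimal sketch of a funnel audit by talent source. It assumes a per-candidate export from your applicant tracking system; the field names ("source", "reached_tech_screen") and the 15% target are hypothetical placeholders, not any particular vendor's schema.

```python
from collections import defaultdict

def screen_rates(candidates):
    """Return the application-to-tech-screen rate for each talent source."""
    totals, screened = defaultdict(int), defaultdict(int)
    for c in candidates:
        totals[c["source"]] += 1
        screened[c["source"]] += c["reached_tech_screen"]
    return {s: screened[s] / totals[s] for s in totals}

def flag_over_screened(rates, target=0.15):
    """Flag sources passing fewer candidates to the tech screen than the target."""
    return [source for source, rate in rates.items() if rate < target]

# Toy data: 8% of direct applicants vs. 45% of sourced candidates
# reach the tech screen.
candidates = (
    [{"source": "direct", "reached_tech_screen": i < 8} for i in range(100)]
    + [{"source": "sourced", "reached_tech_screen": i < 45} for i in range(100)]
)
rates = screen_rates(candidates)
print(rates)                      # {'direct': 0.08, 'sourced': 0.45}
print(flag_over_screened(rates))  # ['direct']
```

Re-running an audit like this after each screening change shows whether loosening the resume screen is actually moving the rate for your most diverse source.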

Automation without empathy

Another way to break down hiring bias is to make sure there is a human element to your technical assessment.

This is particularly true for automated pass/fail coding tests that require a candidate to produce fully working code. We see about 55% of all offers go to candidates with incomplete technical interview solutions. Human interviewers can tell when a failure comes down to a silly mistake, like a typo, rather than a genuine gap in skill; an automated pass/fail test simply rejects both.

Furthermore, the binary nature of non-human tests disproportionately impacts women and candidates from underrepresented backgrounds, who receive offers for incomplete solutions at higher rates than their White and Asian male counterparts.

Well-trained interviewers can put candidates at ease, clarify what the team is looking for, and reduce false negatives that weed out qualified engineering candidates. Consider designing interviewer training with an emphasis on providing clear and transparent guidance to candidates. This includes, at a minimum, telling candidates which competencies are being assessed; what success looks like; and that it's okay to ask clarifying questions during the interview.

This sets candidates up for success and provides a baseline of transparency, so that well-networked candidates no longer have more insight into the hiring process than everyone else. That levels the playing field.

The human + technology solution

The key to implementing digital transformation in hiring is to strike the right balance between humans and technology. Use technology to lighten the cognitive load of your interviewers by suggesting questions based on the role and competencies being evaluated. Use video recordings to review your interviewers and train them to spot mistakes like ambiguous guidance or preferential treatment.
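For instance, question suggestion can be as simple as a lookup keyed by the competencies a role requires. The sketch below is a minimal illustration; the question bank, competency names, and role mapping are all hypothetical placeholders, and a real system would draw from a maintained, role-calibrated library in your interview tooling.

```python
import random

# Hypothetical question bank keyed by competency.
QUESTION_BANK = {
    "algorithms": [
        "Design a rate limiter for an API endpoint.",
        "Find the k most frequent items in a stream.",
    ],
    "system_design": [
        "Sketch a URL-shortening service and its storage layer.",
    ],
    "debugging": [
        "Walk through how you'd isolate a memory leak in production.",
    ],
}

def suggest_questions(competencies, per_competency=1, seed=None):
    """Suggest interview questions for the competencies a role requires."""
    rng = random.Random(seed)
    suggestions = {}
    for competency in competencies:
        pool = QUESTION_BANK.get(competency, [])
        suggestions[competency] = rng.sample(pool, min(per_competency, len(pool)))
    return suggestions

# e.g., a backend role assessed on algorithms and debugging:
print(suggest_questions(["algorithms", "debugging"], seed=7))
```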

Successfully implementing A.I. in recruiting and hiring is going to take an investment, not just in technology but in creating more inclusive data science teams. This step is critical to ensure we're not codifying today's biases into the next generation of tech.

Using a structured interview rubric helps you get the best of both worlds. It gives hiring managers the structural rigor to make data-driven decisions and removes the black box of a purely A.I. solution. It ensures that candidates are assessed on a consistent, level playing field, while leaving room for appropriate human empathy and judgment.
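To make that concrete, here is a minimal sketch of what a weighted rubric score might look like in code. The competencies, weights, and 1-to-4 scale are hypothetical placeholders; tailor them to the role and validate them against your own hiring outcomes.

```python
# Hypothetical rubric: competency weights must sum to 1.0.
RUBRIC = {
    "problem_solving": 0.35,  # decomposes the problem, handles edge cases
    "coding":          0.30,  # readable, idiomatic code; bugs vs. typos
    "communication":   0.20,  # explains approach, asks clarifying questions
    "verification":    0.15,  # tests and debugs their own solution
}

def score_interview(ratings, rubric=RUBRIC):
    """Combine per-competency ratings (1-4) into one weighted score.

    Unlike a binary pass/fail test, a candidate with an incomplete
    solution can still score well on problem solving and communication.
    """
    assert set(ratings) == set(rubric), "rate every competency"
    return sum(rubric[c] * ratings[c] for c in rubric)

# A candidate whose code never fully ran but who reasoned well:
print(score_interview({
    "problem_solving": 4, "coding": 2, "communication": 4, "verification": 3,
}))  # 3.25 out of 4.0
```

Because every interviewer fills in the same fields, scores are comparable across candidates, and the reasoning behind each decision stays visible instead of disappearing into a model.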

A human + technology approach lets interviewers focus on what's important: building a rapport with candidates, providing clarity, and setting them up to show their best selves in the interview. Giving candidates an interview that is predictive, fair, and ultimately enjoyable will unlock opportunities for employees to thrive and for teams to grow.
