How to Avoid Bias in Hiring

Committing to diversity and inclusion doesn’t automatically reduce bias in the hiring process. Recent research from the University of Pennsylvania found that firms that reported considering gender and racial diversity when making hiring decisions still displayed significant bias against resumes from women, minorities, and applicants from lower socioeconomic backgrounds.

Bias in the hiring process is deeply rooted in how humans interpret the world. We are constantly overloaded with information that our brains must process quickly. Our defense mechanism is to make generalized judgments based on obvious cues (in other words, stereotyping).

The problem is that these snap generalizations are often wrong. When they find their way into the development and training of AI-based hiring tools, we extend the reach of our bias into automated processes that amplify it at massive scale.

If you are looking for ideas on how to avoid bias in hiring, it helps to think about it holistically and look out for its presence across the entire hiring workflow. Here’s how to look for and combat hiring bias at all phases of the candidate journey.

Create Fair Access to Employment Opportunities

Hiring bias begins before the formal hiring process does, starting with the AI that influences who sees job advertisements in the first place. After all, you can’t hire someone who doesn’t even know your job exists.

Automated ad placement tools used by platforms such as Facebook and Google show different ads to different people based on how the system has been trained and reinforced. This may be a powerful, efficient way to drum up interest in consumer products by targeting people based on their other interests, but it places dangerous limits on who is targeted with employment opportunities and on our ability to connect with diverse talent pools.

Digital advertising platforms aren’t the only source of potential bias. Above-the-funnel sourcing and matching tools can also introduce systematic bias into hiring. 

As discussed in our 2021 Talent Assessment Market Report, job matching platforms such as ZipRecruiter and other “recommender systems” tend to reinforce job seekers’ own cognitive biases.

For example, if a woman with only a few years of experience tends to click on lower-level jobs because she doubts she is qualified for more senior positions, over time she may be shown fewer of the higher-paying jobs she is actually qualified for.

Automated sourcing platforms also use the historical preferences of all recruiters using the system to determine which job seekers to present them with. For example, if a recruiter tends to click primarily on the profiles of male software engineers, she’ll be shown more male software engineers — and other recruiters seeking candidates for similar roles may also see more male software engineers. In this way, the tools can actually accelerate gender bias.

If you’re serious about reducing hiring bias, start by auditing the automated systems you are using in your recruitment process and holding the vendors accountable for the disparate impact of their tech. Take direct control of your sourcing efforts through creative programs such as those that seek out diverse talent in new places.
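One common yardstick for auditing disparate impact is the EEOC’s “four-fifths rule”: if any group’s selection rate falls below 80% of the highest group’s rate, the process deserves scrutiny. Here is a minimal sketch of that check in Python, assuming you can export per-group applicant and hire counts from your sourcing or applicant tracking system; the group names and numbers below are hypothetical.

```python
def adverse_impact_ratios(counts):
    """For each group, compute its selection rate (hires / applicants)
    and the ratio of that rate to the highest group's rate, following
    the EEOC four-fifths rule of thumb.

    counts maps group name -> (applicants, hires).
    """
    rates = {g: hires / applicants for g, (applicants, hires) in counts.items()}
    top = max(rates.values())
    return {g: (rate, rate / top) for g, rate in rates.items()}

# Hypothetical funnel counts exported from an automated sourcing tool.
counts = {"group_a": (200, 40), "group_b": (150, 15)}
for group, (rate, ratio) in adverse_impact_ratios(counts).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} ({flag})")
```

A ratio below 0.8 isn’t proof of illegal discrimination, but it is exactly the kind of signal you can take back to a vendor when holding them accountable for their tool’s outcomes.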

Focus on Valid Assessments of Generalizable Abilities

Once candidates opt into your hiring process, they are immediately screened and evaluated by recruiters, automation, or some combination thereof. The most common sources of information used for the initial evaluation of candidates — resumes and formal education — are fraught with bias. 

Resumes and academic credentials are firmly entrenched as the accepted pathway for movement into the workforce. But what about those who don’t have the opportunity for a traditional education?

Recent research has found that over the past two decades, around 50% of high school graduates from low-income households have consistently enrolled in college, compared to 80% from high-income households. Well-designed assessments have the potential to become a great equalizer because they can replace reliance on academic credentials with objective, reliable, and job-relevant signals that better represent an individual’s capabilities.

A reliable, validated assessment from a quality vendor can significantly reduce bias in your hiring process. When using assessment tools of any kind, including AI-based tools, make sure you start with a job analysis and validation study to ensure what you’re measuring is job-related. 

Look for “Culture Add” instead of “Culture Fit”

Culture fit is one of the most talked-about concepts in hiring. While the idea that hiring people who will fit in with the values of an employer makes intuitive sense, in practice, it can introduce a major source of bias. This is especially true in the “last mile” of the hiring process, where hiring managers have the final say in who gets the nod. Structured interviews can help, but even then, these decisions are most often made based on managers’ subjective evaluations of which applicants are “like me” or “fit the mold.” 

When hiring for cultural fit, the company culture must be carefully and objectively defined. Culture is inherently a collective concept, and defining it takes a dedicated effort that involves team members and leaders alike in creating the benchmarks against which candidates will be evaluated.

Even carefully defined and measured culture fit can still be problematic because it leads to the creation and perpetuation of a homogenous workplace. Truly combating bias and achieving your diversity goals requires a shift in mindset from hiring for culture fit to hiring for “culture add.” 

Hiring for “culture add” turns the notion of cloning employees on its ear because it shifts the focus to hiring for valuable and diverse characteristics missing from the team. The mindset of “culture add” empowers leaders to build teams that allow employees to contribute to the culture through their cognitive diversity. 

These “cultural contributions” can directly impact performance across the board, with research demonstrating that cognitively diverse teams significantly outperform homogeneous ones.

If you’re going to use “culture fit” as part of your hiring process, make sure the benchmarks you are using are clearly defined and that hiring managers are held accountable for making objective decisions based on them. Then, consider including a structured process to evaluate candidates based on what their individual background and experience can add to your organization.

We will never be able to completely prevent bias, but with dedication to a holistic mindset that considers the entire hiring process, we can better identify and manage it.