When some recruitment teams hear the words “artificial intelligence,” they immediately lock their doors, shut their windows, and hide. Why? In recent years, AI has become a source of fear for organizational leaders and recruitment teams, especially when it comes to diversity recruiting.

But where did this fear come from? You may have read stories of automated AI tools exacerbating workforce diversity gaps, heard concerns about algorithmic bias against underrepresented candidates, or followed the ongoing policy debates about regulating AI recruiting tools.

The evidence might seem stacked against AI, but is technology to blame for this hysteria? Or is it the misuse of technology by individuals who prioritize convenience over everything else? If we ripped off the monster’s mask, we would actually discover many employers using AI for diversity recruiting in the wrong ways. 

This raises the question: what are the wrong ways to use AI for diversity recruiting? And, most importantly, how can recruiters get it right?

Don’t: Let AI Subjectively Judge Candidates

Let’s be completely transparent: AI is not perfect. However, just how imperfect an AI system is varies drastically with the goal of a particular solution and the algorithms built to reach that goal.

As John Jersin, Vice President of LinkedIn Talent Solutions, said in this article, “I certainly would not trust any AI system today to make a hiring decision on its own. The technology is just not ready yet.”

But should AI ever reach a point where it’s ready to make a hiring decision on its own? Talent acquisition is a profession that relies on human connection, empathy, and, most importantly, context. No two candidates are the same, and a recruiter has the responsibility to understand those differences within a given situation to make the right call. If the goal of AI technology in your diversity hiring efforts is to make these decisions rather than inform them, then you’ve got yourself a problem.

When AI is programmed to measure subjective aspects of a candidate, organizations find themselves face-to-face with decisions that don’t take into account behavioral and psychological nuances that make humans human. For instance, one AI recruitment company created an algorithm to rank candidates based on everything from “word choice to facial movements” during video interviews. Ask yourself this: Do you know anyone with particular speaking habits or expressions that are unique but not indicators of intelligence or competency? 

This technology raised widespread concern that “traditional applicants (white, male)” would rank as more employable than others, such as non-native English speakers or disabled people. If a recruiter isn’t there to vet these situations for additional context on a candidate, this technology will exacerbate diversity gaps rather than address them.

Do: Let AI Save You Time With Objective Assessments and Analytics

When some people think of AI, they picture a robotic system working much like a human replacement — it’s no surprise that the most popular Google searches associated with AI are “Science Fiction” and “Existential Threat.” 

However, AI isn’t all about predicting behavior and employability through the collection of intrusive data. Its ability to process large amounts of information systematically lets teams recognize patterns across online profiles and job descriptions, pointing to the best sources of underrepresented talent and the right way to build engagement with them.

AI-powered recruitment tools like Hiretual and Textio use AI technology in different ways to analyze information that’s publicly available or readily provided by employers and job seekers. 

By using AI to assess candidate profiles and resumes across the open web for pronouns, schools, or affiliations with professional organizations and societies, Hiretual helps hiring teams expand talent pools of underrepresented talent, identify the most effective channels and locations for diversity hiring, and tie those findings together with performance reporting and analytics. Textio, on the other hand, uses AI to help employers increase diverse representation in the applicant pool by identifying non-inclusive and monotonous language in company job descriptions.

In both platforms, AI is used to augment recruiting responsibilities, not automate decision-making. Recruiters using Hiretual leverage the platform’s search results and analytics to inform the candidate selection process, while teams using Textio rely on the tool to guide the writing process rather than replace the people who write.

Don’t: Rush the Selection and Implementation of Diversity Technology

Whether it’s tight deadlines or an eagerness to complete a task, the more you rush something, the higher your chances of making mistakes.

With diversity and inclusion, there are many human perspectives, experiences, and best practices to consider. These include identifying the cognitive biases present on your team, understanding different cultural expectations around workplace etiquette, and communicating appropriately with different groups of people. If your team is serious about improving the diversity of your workforce, you can’t rush the selection, testing, and implementation of the recruitment technology that will help you achieve your goals.

Companies that rush their diversity initiatives and technology implementation usually face two unfortunate outcomes. First, they show consumers, candidates, employees, and their professional peers that their attempts at diversity and inclusion are lazy and thoughtless. Second, they are likely to fall into the common pitfalls of choosing AI tech that subjectively measures candidates, performs tasks using biased data sets, and automates the wrong parts of the recruitment process. 

Do: Take Your Time and Vet Your Tech

Just as recruitment teams shouldn’t look for a quick fix for diversity recruiting, they should be equally patient and deliberate about the growth and ethical performance of their AI recruitment tech. If you’re in the process of selecting tech, talk to your vendors about how your priorities and concerns align with their product. Don’t feel rushed to roll out your tech, and test it with different data sets to mitigate any potential biases.

The best tech is flexible enough in its infrastructure to allow for input and adjustments. Hiretual users hold the platform accountable through in-app feedback on search results, integrated as a compulsory step in any sourcing workflow. Recruiters are also given the freedom to test and cross-reference search results with different AI filters and contextual search criteria before deciding on the best way to move forward.

One customer uses Hiretual’s diversity filters to expand candidate searches for a particular role before comparing the results with the current makeup of their talent pipelines. They explain, “Oftentimes the pipelines you create can be skewed towards one area or another. I create multiple sourcing tasks within a project or one open req, and I’m able to cover those different areas to make sure it’s actually an equal pipeline.”

A tactic like this is useful for de-biasing your data to make sure that candidates from all backgrounds are getting a fair shot at a job opportunity. Most importantly, it signals the right mindset to have by taking the time to test out your technology to ensure it’s helping organizational efforts, not hurting them. 

Don’t: Automate Without Caution

The goal of workplace technology, especially as it pertains to recruiters, is to augment their capabilities and help them do their job better. Recently, there has been significant concern surrounding AI automation in the recruitment process. 

For instance, one e-commerce company attempted to automate part of its hiring process with an “experimental hiring tool” that ranked candidates according to the quality of their resumes. However, the company realized the program was not ranking candidates in a “gender-neutral” way, because quality was derived from a data set of resumes submitted predominantly by men.

In this scenario, automation should not have been used in the first place, since it ranked candidates according to a subjective definition of quality. What is a ‘quality resume’? For most organizations, there’s no single set of criteria for a good resume. In fact, if you ask 10 of your friends to hand over their resumes, you’ll find that each one reflects a hint of its owner’s personality and personal preferences.

It’s important to note that not all types of automation are bad in the recruitment process. As Hiretual’s CEO Steven Jiang explains, “things like data migration, system integrations, list-cleaning and scrolling through databases don’t need human personalization but still take up large chunks of time.” By automating processes like these, recruiters are left with more time for the parts of recruitment that demand a human touch, like screening resumes and analyzing interview results with proper context.

Do: Stay In Control Of Your Recruitment Technology

When discussing the role of AI in recruitment technology, Jiang explains that “the goal of an AI infrastructure is to build an environment where recruiters have full flexibility and control of their business processes.”

At the end of the day, organizations should always be in control of their processes so they can take responsibility for their practices. Whether you’re using a tool to expand your talent pool of underrepresented candidates or improve the inclusive language of your job descriptions, recruiters should always have a systematic understanding of their diversity recruiting technology, organizational goals, and intended outcomes. 

No Need For Fear

Like the road toward racial equality, successful diversity recruiting will have its fair share of obstacles. AI recruitment technology is meant to help you overcome those obstacles, not add even more barriers to inclusion.

If organizational leaders automate with caution, thoroughly vet their technology options, and never let AI subjectively judge candidates, AI recruitment technology will be a powerful ally in their diversity hiring efforts. And if you ever saw AI as nothing more than a horror movie for diversity recruiting, hopefully these guidelines will give that movie a happy ending.

For more recruitment tips, check out Using Health Benefits to Recruit Employees