
Aug 28, 2023

The Promise and Pitfalls When HR Uses AI For Hiring

Brenda R. Smyth, Supervisor of Content Creation

Since 2016, AI has become an increasingly popular tool in the hiring process. But this technology is not a replacement for humans; it depends on oversight and ongoing course correction. While it can help make hiring decisions more objective, it can also introduce bias. While it can easily pluck the most qualified candidates from hundreds of resumes, it can also filter out highly qualified applicants.

“There are long-standing debates about whether AI eliminates or amplifies bias in hiring,” reports an article from Pew Research Center. A survey shows that, in general, the public is optimistic, with 53% saying they think using AI in hiring will improve bias or unfair treatment based on race or ethnicity. Thirteen percent believe hiring bias will get worse because of AI. The same survey shows more skepticism among Black adults that AI will improve fairness.

Using AI tools in job postings and applicant screening

We can all agree that AI makes recruiting more streamlined. But there are two specific areas of hiring where we humans can make the best use of AI tools while avoiding potential hazards: job postings and candidate screening.


  1. Beginning with the job posting, recruiters can get help from ChatGPT. The recruiter and the hiring manager work closely together to create a list of qualifications, and AI can quickly turn that list into an organized post, a big time savings. However, this composition capability also makes it much easier to get carried away with a lengthy list of “ideal” qualifications, one so long that job candidates are turned off or intimidated and don’t even apply. And those who do apply may not describe their experience in such expansive terms on their resumes, which means the more minor qualifications on your list won’t appear there and AI will screen these individuals out (the sketch after this list shows how that kind of keyword matching can misfire).

    Case in point: A young geologist has been working for three years at an engineering firm. Their career is off to a great start, but they’re eager to move on to their next adventure in another city. They’ve sent out close to 50 resumes without so much as a callback. They can check off every qualification listed on some of the job postings and don’t understand why they’re not getting responses. They admit they’re not a great writer and don’t adjust their resume every time they send out a new application. Are engineering firms looking for geologists who can write? Or are they looking for geologists with experience overseeing a drill site?

    Recruiters can adjust for this by carefully narrowing the qualification list for every job. This is not the time to include everything you can think of. Instead, look at people who have been or are currently successful in the position. What do they have in common? Is it a degree? Do they really need five years’ experience, or has anyone succeeded in the job with less? Also consider whether those qualifications introduce bias into hiring. Just because all the previous workers were 35 and outgoing doesn’t mean those traits are necessary for success. Is a degree necessary for your sales pros, or would putting new hires through an accelerated training course better ensure they have the skills you’re looking for?

  2. Using AI to screen applicants can both eliminate and add bias. Machines don’t have busy days, harrowing commutes or energy highs and lows, so they are unerringly objective. But because humans determine the criteria used in the screening process, the machines are following human directions, which can unintentionally add bias. And it’s not just resume and cover letter screening: video screening interviews are becoming more commonplace in hiring, and some organizations use AI tools to review them. A New York City AI hiring law passed in 2021 (with enforcement starting in July 2023) is designed to help keep the technology free of racist or sexist bias. It requires annual “bias audits” and requires that job candidates be notified when AI tools are used in the hiring process. Whether the law works remains to be seen; regardless, the automation used to help you hire could be eliminating people you need.

    Case in point: An experienced programmer completes a video screening interview. They succinctly outline their qualifications and answer the required questions without mincing words, getting right to the point and rarely looking into the camera. The lack of eye contact makes them look unfriendly, and they get screened out. They’re on the autism spectrum and brilliant. Are eye contact and a smile truly important to this job?

    Recruiters can adjust by carefully considering who they’re weeding out and whether they’re eliminating much-needed candidates in today’s highly competitive job market. Ever stopped to take a look at the resumes your AI tools kicked out? If there were qualified candidates in that group, what facets of their resumes caused them to be eliminated? It’s worth checking.
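To make the screening pitfall in item 1 concrete, here is a minimal Python sketch of keyword-style resume filtering. It is not any particular vendor’s algorithm; the qualification list, the sample resume and the exact-match rule are all invented for illustration. The point is simply that a long list of “required” phrases plus literal matching can drop a candidate whose experience is real but worded differently.

```python
# Hypothetical keyword-style resume screen (illustration only, not any real
# vendor's algorithm). A resume passes only if every "required" phrase
# appears verbatim in its text.

REQUIRED_PHRASES = {
    "geology degree",
    "drill site supervision",
    "mud logging",
    "technical writing",   # the "minor" qualification tacked onto the posting
}

def passes_screen(resume_text: str, required: set[str]) -> bool:
    """Return True only when every required phrase appears word for word."""
    text = resume_text.lower()
    return all(phrase in text for phrase in required)

resume = """
Geologist, B.S. in Geology. Three years supervising drill sites and
logging mud samples for an engineering firm; wrote weekly field reports.
"""

print(passes_screen(resume, REQUIRED_PHRASES))
# False: the experience is clearly there, but none of the required phrases
# appear word for word, so a qualified geologist never reaches a human.
```

A human skimming that resume would spot the match in seconds; a literal-match filter never does. That is why narrowing the qualification list and spot-checking the rejects matters.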

AI in recruiting and hiring is a vital tool for HR, speeding up time-to-hire at a time when job applicants are sometimes in short supply and HR pros are short on time. But it requires caution rather than a set-and-forget approach. Taking time to question outcomes and learn how screening tools might be working for or against you will ensure the best “humanly possible” results.
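One concrete way to question outcomes is to compare selection rates across demographic groups, which is roughly the “impact ratio” idea behind the bias audits mentioned above. The sketch below is illustrative only: the group labels and counts are invented, the 0.8 threshold simply echoes the EEOC’s “four-fifths” rule of thumb, and a real audit follows the methodology set by the law and an independent auditor.

```python
# Illustrative selection-rate comparison (made-up numbers and group labels).
# Real bias audits follow the methodology required by law and are performed
# by an independent auditor; this only shows the basic arithmetic.

from collections import namedtuple

GroupStats = namedtuple("GroupStats", "applicants advanced")

# Hypothetical screening results by demographic group.
results = {
    "group_a": GroupStats(applicants=200, advanced=90),
    "group_b": GroupStats(applicants=150, advanced=40),
}

selection_rates = {g: s.advanced / s.applicants for g, s in results.items()}
best_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    impact_ratio = rate / best_rate                    # 1.0 means parity with the top group
    flag = "review" if impact_ratio < 0.8 else "ok"    # 0.8 echoes the "four-fifths" rule of thumb
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

If a group’s ratio falls well below parity, that is a cue to pull the screened-out resumes or videos and ask what the tool keyed on, exactly the kind of spot check described above.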



Brenda R. Smyth

Supervisor of Content Creation

Brenda Smyth is supervisor of content creation at SkillPath. Drawing from 20-plus years of business and management experience, her writings have appeared on Forbes.com, Entrepreneur.com and Training Industry Magazine.
