Combining AI with the Human Touch to Eliminate Bias in Sourcing, Recruiting and Hiring
How AI can bring the “human” back to recruiting
By design, artificial intelligence (AI) algorithms make decisions that would otherwise require human-level expertise. When it comes to employment, AI can help find and hire the best talent.
AI can reduce recruiting workloads by locating qualified candidates, improving recruiter response times, and even helping recruiters anticipate skill gaps and talent shortages. More advanced uses include facial recognition to secure video interviews and chatbots that answer questions, enriching the candidate or associate experience. However, the growing use of AI in sourcing, recruiting, and hiring has sparked a debate about bias and fairness, and in several documented cases AI bias has in fact occurred.
Bias in diversity
For example, facial recognition systems have failed to recognize the faces of iconic women of color, including Oprah Winfrey, Serena Williams, and Michelle Obama. Many suggest this is because people of color are rarely in technology development positions: at LinkedIn, only 1.2% of tech employees are Black; at Facebook, the figure is 1.5%; in Google’s tech workforce, it’s 2.8%.
Amazon, for example, discovered bias in its AI-based applicant system: it favored men for technology jobs. The algorithm had been trained to observe patterns in résumés submitted over a ten-year period, and most of that historical data came from men, reflecting male dominance in the tech industry during that time. As a result, the algorithm penalized résumés containing words such as ‘women’ or the names of women’s colleges.
Remember, AI is only as biased as the human processes it simulates.
AI, at its core, runs on data. If that data comes from a biased source, the system will perpetuate that bias. In other words, AI in and of itself does not have a conscience; it can only be as flawed as the human processes it simulates.
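To see how a biased data set produces a biased model, here is a minimal, purely illustrative sketch in Python. The résumé snippets and the naive word-scoring function are invented for demonstration only; they stand in for the far more complex systems described above, not for Amazon’s actual model.

```python
from collections import Counter

# Toy "historical" hiring data: (résumé text, 1 = hired, 0 = rejected).
# Because past hires skewed male, words associated with women appear
# mostly in rejected résumés -- so a model trained on these outcomes
# "learns" to penalize those words, even though they say nothing
# about a candidate's ability.
historical = [
    ("software engineer men's chess club", 1),
    ("developer led men's robotics team", 1),
    ("engineer women's college graduate", 0),
    ("developer captain women's coding society", 0),
]

def word_weights(data):
    """Score each word by how often it co-occurs with a hire vs. a rejection."""
    hired, rejected = Counter(), Counter()
    for text, label in data:
        (hired if label else rejected).update(text.split())
    words = set(hired) | set(rejected)
    return {w: hired[w] - rejected[w] for w in words}

weights = word_weights(historical)
print(weights["women's"])  # -2: the word itself is penalized
print(weights["men's"])    #  2: the word itself is rewarded
```

The scoring rule here is deliberately crude, but the failure mode is the same one described above: the model never sees gender directly, yet it reproduces the gender skew baked into its training outcomes.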
Those who use AI must therefore ensure the design is sound. The first step is to recognize existing prejudices and identify where an algorithm could either perpetuate or prevent bias in the hiring process. When intentionally incorporated and managed (by humans), AI can help eliminate bias in sourcing, recruiting, and hiring. Here’s how:
Sourcing shapes the talent pool by finding qualified people. AI makes it possible to review every application you receive, instead of the few a human can consider on a tight timeline. However, automated decisions at this stage can cut applicants before they even know the job exists.
So, what can a time-strapped recruiter do?
Consider the wording and adjectives used in the job description.
Look at which channels you’re using to promote the job and how job alerts function on those channels.
Finally, consider the criteria you’re using to screen and the questions you’re asking on the application.
Understanding where bias arises in these areas helps organizations make the best use of AI: optimizing the complete sourcing experience, attracting more diverse candidates to your jobs, and converting more qualified applications.
AI and machine learning are increasingly being used to predict how likely a candidate is to succeed in a given job. Unfortunately, such algorithms often rely on historical data, either from the company itself or from the industry, to project who will be a good hire. The rest get eliminated. But what if the historical data is incomplete or flawed?
For example, what if your historical data focuses on candidates with proven track records of results, 10+ years of experience, and references from senior management? The problem is that this leaves no room to understand whether those achievements were a result of hard work, talent, or privilege. Institutional bias, from housing to healthcare to education, has limited access and opportunity for broad categories of candidates. Therefore, in many fields, only specific candidates would meet the above criteria.
To demonstrate: in technology, men still hold more than 80% of jobs. Historical data would continue to paint a picture of the same guy who’s worked that job for decades, and no one different would ever get hired. Instead of relying on historical data about other people, put AI to work getting to know your actual candidates and their capabilities.
AI can help recruiters and employers evaluate interview responses with video analysis. It can also be applied to work samples or skills assessments to predict performance, productivity, or a candidate’s likelihood of wanting the job.
During the evaluation phase, chatbots can further limit bias by taking questions and answers at face value, upholding the truth that “there are no stupid questions.”
AI can give you data, analysis, and projections about whom to screen, interview, and hire. But only you and your team can connect with the human being behind the application. Once you make that connection, AI can help again: it can estimate how likely a particular candidate is to accept the job and how to configure a competitive offer. And unlike a human trying to save a few bucks, AI won’t err on the side of a salary gap.
AI-powered chatbots can remove social stigma and stress from salary negotiations and open the floor to questions during onboarding. AI can also measure employee engagement and help deliver a better onboarding experience for each new employee, increasing engagement and improving retention with an experience that continues to grow.
AI can bring the “human” back to recruiting
In summary, AI can give staffing and recruiting professionals more time to do the parts of their jobs that benefit from human connection and require decision-making. Recruiters are better equipped than machines to connect, human-to-human. They can learn about candidates’ passions, challenges, triumphs, and fears. When AI becomes an assistant that can help humans deepen these connections with candidates and employees of all backgrounds, genders, and orientations, that’s a win-win for everyone.
This article originally appeared in Recruiting Daily.