In the second of three articles on using technology to enhance recruitment, Jeremy Swinfen Green explores the ways that AI can help – or hinder – the selection of ideal candidates, while ensuring diversity and fairness
AI-driven recruitment platforms can analyse vast amounts of data from various sources, including job boards, social media, events and professional organisations, to identify a diverse set of potential candidates who match the job requirements and, perhaps more importantly, are likely to perform well.
Removing bias from applicant selection
Bias-free CV screening
Often, recruiters can be overwhelmed with applications for a particular vacancy. Traditional methods of winnowing out applications can be random (based perhaps on CV design preferences), inappropriate (favouring academic qualifications over relevant experience) or simply unfair (biased against people with foreign-sounding names).
Blind recruitment processes, which can help avoid unconscious bias, are increasingly being used. AI systems can help to reduce bias (whether conscious or unconscious) in CV selection, for example by automatically stripping out information that might indicate age, gender or ethnicity from CVs before they are examined by recruiters.
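The kind of CV anonymisation described above can be sketched in a few lines of Python. This is a minimal illustration, not a real screening system: the field names are invented, and a production tool would also need to handle identifying clues buried in free text.

```python
# Sketch: redact structured CV fields that could reveal age, gender or
# ethnicity before the CV reaches a human reviewer.
# The field names below are illustrative assumptions.
SENSITIVE_FIELDS = {"name", "date_of_birth", "gender", "nationality", "photo_url"}

def anonymise_cv(cv: dict) -> dict:
    """Return a copy of the CV with potentially identifying fields removed."""
    return {k: v for k, v in cv.items() if k not in SENSITIVE_FIELDS}

cv = {
    "name": "A. Candidate",
    "date_of_birth": "1988-04-02",
    "skills": ["Python", "SQL"],
    "experience_years": 9,
}
print(anonymise_cv(cv))  # only skills and experience survive
```

Note that this only addresses structured fields; names of schools, clubs or locations mentioned in free-text sections can still act as proxies for the very attributes being redacted.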
Another opportunity is to use AI to conduct initial interviews. A video call can be set up with large numbers of candidates being interviewed simultaneously (but individually) by an avatar, or perhaps a deepfake of a real human. “Interview avatars” can be used to screen potential employees by evaluating their soft skills and other competencies to create a comprehensive initial judgement of the candidate’s suitability.
Technology must be used with care though. An AI-powered system set up to screen candidates by focussing solely on their qualifications and professed skills could miss excellent candidates who haven’t had the chance to get a particular qualification, who are diffident about their skills, or who fail to express those skills in a way that the AI system can understand. And employers based in the EU will need to be aware of the constraints placed on AI-supported recruitment outlined in the EU’s AI Act. However, used with proficiency and care, technology can promote fairness and diversity in the selection process.
Bias-free interviewing
Once the number of applicants has been reduced to a manageable number (and, one hopes, unsuccessful applicants have been given a polite “thanks but no thanks” message), the difficult task of selecting the best candidate from a shortlist should be handed to a human.
HR professionals will be expected to be unbiased when selecting candidates. Unfortunately, many, perhaps most, people are biased in some way – not always consciously of course. For example, bias against women is very strong globally.
Technology has an important part to play in reducing bias during the interviewing process. Structured interviews can be used to reduce subjectivity by ensuring consistency and fairness in the interview process. If the questions used in structured interviews are generated by an AI system rather than a human, the more “knowledgeable” AI may well avoid questions with bias baked into them. For example, including a question about hobbies – something that a human might feel is useful – could result in bias if certain hobbies are associated with gender, age or ethnicity.
A well-trained AI can also suggest evaluation criteria that are more bias-free than a human could create. It can in addition be used to analyse the interviews, alongside a human’s analysis, perhaps spotting important clues that the human missed.
Another potential opportunity is to use AI to detect emotion during an interview. This is rather more contentious. At a fundamental level, depending on facial expressions and vocal tones is not always an accurate way of detecting an interviewee’s emotional state. Furthermore, levels of emotion during an interview may not be a good indicator of suitability. Take stress and nervousness: some people are naturally more nervous in a face-to-face interview than others, or they may show higher levels of stress because they are more desperate to be selected for a particular job than competing candidates.
Another key weakness in using “emotion AI” for recruitment is that it is unlikely to be bias-free. AI systems need to be trained on a set of data that reflects the population they are going to be used on. If, for example, the data from only a few South Sea Islanders has been included in the AI system’s training data, then it is quite possible that the AI system will be biased against South Sea Islanders when suggesting candidate selections because it does not sufficiently recognise and take account of their cultural differences.
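The representativeness problem described above can be checked for mechanically before an emotion AI (or any recruitment model) is trained. Below is a hedged sketch: the group labels and the tolerance threshold are illustrative assumptions, and real demographic auditing is considerably more involved.

```python
# Sketch: flag demographic groups that are under-represented in a model's
# training data relative to the population it will be used on.
# Group names and the 0.5 tolerance are illustrative assumptions.
from collections import Counter

def representation_gaps(training_groups, population_shares, tolerance=0.5):
    """Return groups whose share of the training data is less than
    `tolerance` times their share of the target population."""
    counts = Counter(training_groups)
    total = sum(counts.values())
    gaps = []
    for group, pop_share in population_shares.items():
        train_share = counts.get(group, 0) / total
        if train_share < tolerance * pop_share:
            gaps.append(group)
    return gaps

training = ["group_a"] * 80 + ["group_b"] * 19 + ["group_c"] * 1
population = {"group_a": 0.6, "group_b": 0.3, "group_c": 0.1}
print(representation_gaps(training, population))  # → ['group_c']
```

A gap flagged by a check like this is exactly the situation the South Sea Islanders example warns about: the model simply has not seen enough of that group to treat it fairly.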
So should emotion tracking be part of recruitment? One day perhaps, but probably not today.
Effective matching
It is not sufficient to ensure diversity and avoid bias in recruitment. Ultimately, recruiters need to select candidates who will excel in the position they are applying for. Technology can help here in several ways including cultural fit assessment and predictive success analytics.
Cultural fit assessment
HR professionals are often focused (fixated?) on whether a particular candidate will “fit into the team”. Leaving aside the possibility that such an assessment is driven by bias (too old to be managed by a millennial, too badly educated to “behave properly”…), it is of course helpful to ensure that teams are a good balance of personality types with a wide diversity of attitudes and experience.
AI algorithms can analyse candidates’ online profiles and social media activity to assess their cultural fit with the company. By considering factors such as values, interests and communication style, AI helps employers identify candidates who are likely to react positively to an organisation’s culture, leading to better long-term retention and job satisfaction.
This is all true, at least in theory. In practice, the opportunities for poor analysis based on very incomplete and possibly out-of-date information (people do change as they grow) are enormous. And of course, having someone who isn’t a perfect cultural fit may well improve a poor culture, perhaps one dominated by aggression, blame or a deferential attitude.
A focus on cultural fit can be very problematic, especially for diversity, but for those organisations that rate its importance highly, the best way of ensuring cultural fit is probably to avoid technology and simply rely on the opinions of a few team members, as well as the candidate’s ability to sell their own attitudes and values.
Predictive analytics
A better use of technology in recruitment is for predictive analytics. This is a technique increasingly used by recruitment professionals, with a 50 per cent rise in use reported between 2019 and 2022. By using AI, HR professionals can identify characteristics that are predictive of success in the role and the organisation, and use this information to prioritise the candidates most likely to excel in the role and, importantly, to remain in the organisation.
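The prioritisation step can be illustrated with a simple weighted score. To be clear, this is a toy sketch: a real predictive system would learn its weights from historical hiring outcomes, whereas the features and weights here are invented for demonstration.

```python
# Sketch: rank a shortlist by a predictive "success score".
# Features and weights are illustrative assumptions; in practice they
# would be learned from data on past hires' performance and retention.
WEIGHTS = {
    "relevant_experience": 0.4,
    "assessment_score": 0.5,
    "tenure_signal": 0.1,
}

def success_score(candidate: dict) -> float:
    """Weighted sum of normalised (0-1) candidate features."""
    return sum(w * candidate.get(f, 0.0) for f, w in WEIGHTS.items())

candidates = [
    {"id": "c1", "relevant_experience": 0.8, "assessment_score": 0.6, "tenure_signal": 0.9},
    {"id": "c2", "relevant_experience": 0.5, "assessment_score": 0.9, "tenure_signal": 0.4},
]
ranked = sorted(candidates, key=success_score, reverse=True)
print([c["id"] for c in ranked])  # → ['c1', 'c2']
```

Even in this toy form the diversity risk is visible: if “relevant experience” correlates with historic hiring patterns, the score will quietly reproduce them.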
Caution though: this type of AI use may well improve the effectiveness of recruitment in terms of generating successful outcomes but it could also work against increased diversity. As always with AI, there are balances to be struck.
Being seen to be fair
Ultimately, most organisations want to be fair in the way they select new employees. Avoiding bias and ensuring diversity are acknowledged as key drivers of business efficiency. For example, Deloitte has found a 39 per cent increased likelihood of outperformance for those in the top quartile of ethnic representation compared with the bottom quartile.
Recruitment fairness is not just an important driver of profit. According to the CIPD, fairness in selection and promotion “influences candidates’ subsequent behaviour, including how likely people are to accept a job, reapply to an organisation, or recommend it. It even affects how well employees perform in a role if selected.”
Because recruitment fairness is such an important driver of business success, organisations that focus on it should proclaim their successes. And technology can help here. AI can be used to monitor recruitment processes for potential bias and discrimination so that HR professionals can provide proof of fairness while taking corrective action to maintain a fair hiring process if any issues are uncovered. AI can automatically provide HR executives with insights into the diversity of their candidate pools and hiring outcomes, monitoring diversity metrics such as gender, ethnicity, education and age. It can even analyse the feedback collected about recruitment processes from different stakeholders, including candidates, employees and external partners.
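One concrete way of monitoring hiring outcomes for potential bias is to compare selection rates across groups. The sketch below computes per-group selection rates and flags large disparities, borrowing the widely cited “four-fifths” threshold from US employment guidance; the group labels and data are, of course, invented.

```python
# Sketch: monitor selection rates by group and flag disparities.
# A ratio below 0.8 of the best-performing group's rate is the
# "four-fifths rule" heuristic from US adverse-impact guidance.
from collections import defaultdict

def selection_rates(applicants):
    """applicants: iterable of (group, was_hired) pairs -> rate per group."""
    applied = defaultdict(int)
    hired = defaultdict(int)
    for group, was_hired in applicants:
        applied[group] += 1
        hired[group] += int(was_hired)
    return {g: hired[g] / applied[g] for g in applied}

def flag_adverse_impact(rates, threshold=0.8):
    """Return groups whose selection rate falls below `threshold` times
    the highest group's rate."""
    best = max(rates.values())
    return [g for g, r in rates.items() if r / best < threshold]

data = [("group_x", True)] * 5 + [("group_x", False)] * 5 \
     + [("group_y", True)] * 2 + [("group_y", False)] * 8
rates = selection_rates(data)
print(rates)                      # group_x: 0.5, group_y: 0.2
print(flag_adverse_impact(rates))  # → ['group_y']
```

A flag from a check like this is not proof of discrimination, but it gives HR professionals exactly the kind of early warning, and auditable evidence of monitoring, that the paragraph above describes.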
As the business landscape continues to evolve, HR departments must embrace technological innovation to stay ahead of the pack. By harnessing the power of AI throughout the recruitment process, organisations can streamline HR operations, reduce bias and ultimately attract and retain top talent more effectively.
In the first of this series of articles, I looked at how technology can help with locating and attracting job seekers. In the next article, to be published in June 2024, I will consider how the job seeker’s experience with a prospective employer can be improved using technology.
© 2025, Lyonsdown Limited. Business Reporter® is a registered trademark of Lyonsdown Ltd. VAT registration number: 830519543