For companies looking to succeed in an increasingly complex marketplace, a diverse workforce can help boost creativity and innovation. But recruitment bias is still hampering the best intentions. How, asks Alison Coleman, can businesses make sure they are blind to all but the benefits of every candidate?
People like to think that they don’t judge others according to where they are from, their age or their gender, but because human decisions are often the result of unconscious thought processes, it’s inevitable that some of those decisions are influenced by bias. And when it comes to recruiting staff, this implicit bias has an overwhelming impact on the types of people who are offered a job.
Companies are under enormous pressure to make the right hire; the cost of getting it wrong and having to recruit again can be substantial. To minimise the risk, many hiring managers, unconsciously or otherwise, go for the ‘safe’ option, hiring people like themselves, from a similar background and with similar personal experiences. But this doesn’t guarantee they will succeed in the role, and in fact someone from a very different background could bring the new ideas and fresh approach that the organisation needs.
By allowing unconscious bias to influence hiring decisions, organisations miss opportunities to build diversity into their workforce and to attract and retain top talent. Tackling it, however, isn’t easy: the problem, by definition, is unconscious. Because you don’t know what you don’t know, it is harder for individuals and companies to recognise. But a better understanding of unconscious bias brings a genuine desire to address it, as Rob Grimsey, group marketing director at global recruiter Harvey Nash, explains.
“An increasingly common way to begin the journey to uncover and overcome bias is to ensure that shortlists include candidates from under-represented groups,” he says. “In the beginning it was something we encouraged our clients to do, but now it is just as common for them to request it, and it has proved to be a very useful, and sometimes an eye-opening, process.”
Other strategies are also having a positive effect, including blind recruitment, in which personally identifiable information, such as name, gender, age and education, is omitted from applicant CVs. The practice is most prevalent among professional services and financial services firms, but other large organisations are starting to adopt blind recruitment as part of a public commitment to recruiting fairly, demonstrating positive steps towards widening their pool of credible candidates.
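How that redaction is done varies from one organisation to another, but in its simplest form it amounts to stripping identifying fields from an application record before reviewers see it. The sketch below is purely illustrative, with hypothetical field names rather than any particular applicant-tracking system’s schema:

```python
# Illustrative only: strip identifying fields from an application record
# before it reaches reviewers. Field names are hypothetical.
REDACTED_FIELDS = {"name", "gender", "age", "date_of_birth", "education", "school"}

def anonymise(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {key: value for key, value in application.items() if key not in REDACTED_FIELDS}

candidate = {
    "name": "A. Candidate",
    "age": 42,
    "gender": "F",
    "education": "Example University",
    "skills": ["financial modelling", "IFRS", "stakeholder management"],
    "experience_years": 9,
}

print(anonymise(candidate))
# {'skills': [...], 'experience_years': 9} -- reviewers sift on competencies only
```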
Yvonne Smyth, group head of diversity and inclusion at recruitment firm Hays, says: “Generally speaking, the use of blind CVs is a positive step as it does help mitigate the impact of bias; however, it is not a silver bullet. Blind recruitment needs to be used alongside a thorough and consistent approach to diverse and inclusive talent attraction and selection. This requires a series of interventions to keep the pipeline as wide as possible.”

These interventions, she says, include reviewing all the collateral and materials that candidates see throughout the recruitment process and ensuring that they appeal to, and depict, a wide range of social groups and cultures. Limiting the number of essential requirements in each job and person specification and advertisement, avoiding biased language in those documents, and involving a diverse range of stakeholders in reviewing CVs and interviewing candidates will also help to support diversity. Organisations that have implemented blind recruitment policies are already seeing results.
Four years ago, EY introduced a blind CV policy. Historically, some elements of its recruitment system had worked against applicants from state schools. The new system stopped filtering on UCAS points, degree classification and the school an applicant had attended, and candidates also complete aptitude and situational tests. In the first intake after the switch to blind CVs, in 2016, one in five trainees didn’t have a 2:1 or 360 UCAS points.
There are potential downsides to blind recruitment. It only affects the initial assessment stage, leaving room for bias to creep in at interview. And there is an argument that, in order to increase social inclusion, it can be useful for a potential employer to know a candidate’s background and judge their achievements in context, for example against a disadvantaged start in life. Nevertheless, blind selection does at least enable employers to compile a shortlist of the people best suited to a particular job, in terms of competence and ability, without taking into account information such as ethnicity, gender and age that might otherwise bias opinions.
“We support the principle of unconscious bias training,” says Grimsey. “However, while it’s a good place to start, if a company has an existing and established culture of unconscious bias, it is unlikely that a one-day training course will move the needle.”

Recent advances in technology, particularly around AI and machine learning, are being hailed as the most effective way yet of minimising the effect of human bias on the hiring process. There has been a surge in tech start-ups offering AI programmes that both save time and screen candidates, selecting the best interviewees without prejudice. Worksome, for example, is a platform that uses AI to match individuals with projects. When a company posts a job requirement on the platform, the algorithms produce a shortlist of the most relevant profiles matching the requested competencies, requirements, experience level and industry knowledge. These individuals are then invited to look at the project and bid on it if they’re interested.
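Worksome has not published the details of its matching logic, so the following is only a toy sketch of competency-based shortlisting, with assumed field names and illustrative weights. What it shows is that a ranking can be driven entirely by skills and experience, with no access to attributes such as age, gender or ethnicity:

```python
# Toy sketch of competency-based shortlisting (not Worksome's actual algorithm).
# Scores depend only on skills and experience; protected attributes are never read.
def match_score(job: dict, profile: dict) -> float:
    required = set(job["competencies"])
    offered = set(profile["competencies"])
    skill_fit = len(required & offered) / len(required) if required else 0.0
    experience_fit = min(profile["experience_years"] / max(job["min_experience_years"], 1), 1.0)
    return 0.7 * skill_fit + 0.3 * experience_fit  # illustrative weighting

def shortlist(job: dict, profiles: list[dict], top_n: int = 5) -> list[dict]:
    """Rank candidate profiles by match score and keep the strongest few."""
    return sorted(profiles, key=lambda p: match_score(job, p), reverse=True)[:top_n]
```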
“The algorithm always finds the best person for the job, no matter their age, gender, or race,” says Worksome’s co-founder and COO Mathias Linnemann. “By default, the algorithm doesn’t take into account variables that are traditionally associated with biases. It only looks at the person’s competencies. Hence, bias is not even an option.”

VCV is an AI-powered technology that automatically screens job candidates using facial and voice recognition and has been used by blue-chip companies including PwC, L’Oreal, Danone, Mars and Citibank. Candidates record a video using a computer or smartphone, and face and voice recognition technology identifies their nervousness, mood and behaviour patterns to help recruiters assess whether a person is a good cultural fit for the company.
While these AI-based recruitment systems are programmed to avoid the risk of unconscious bias, the technology is not fool-proof. In fact, AI and algorithms in the talent space can have huge issues, as Roger Philby, CEO and founder of The Chemistry Group, which predicts people performance using assessments and data, explains. He says: “Amazon recently had a problem with an automated recruiting tool that showed bias against women. Without any ill intent, it had learned an old hiring pattern mirroring the tech industry, which caused bias issues and resulted in widespread negative media coverage. An algorithm simply cannot spot all the white men. A human can. And therefore the two must work together.”

Philby recalls once being asked, as a member of a speaking panel, whether he would prefer a new employee to be selected by an algorithm or a human being. “I chose the algorithm, but with the assumption that the algorithm is designed appropriately, so it can deliver the best shortlist,” he says. “In other words, the process needs human intervention at the start and at the end. Removing human bias upfront makes the process far more reliable, while later on an algorithm may identify the candidates best suited to a job, so the hiring manager can choose the one they can best work with, the one where there’s a personal connection.”
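One concrete form that human intervention can take is auditing an algorithm’s output for skewed outcomes. The sketch below is not drawn from Amazon or The Chemistry Group; it illustrates a common adverse-impact check, the so-called four-fifths rule, in which demographic labels are used only to audit results and never as inputs to the ranking itself:

```python
# Illustrative adverse-impact check on a shortlisting tool's output, based on
# the "four-fifths rule" used in US selection-rate analysis. Group labels are
# read only to audit outcomes, never by the ranking model.
def selection_rate(candidates: list[dict], group: str) -> float:
    """Proportion of candidates from a given group who were shortlisted."""
    members = [c for c in candidates if c["group"] == group]
    if not members:
        return 0.0
    return sum(1 for c in members if c["shortlisted"]) / len(members)

def adverse_impact_ratio(candidates: list[dict], group_a: str, group_b: str) -> float:
    """Lower selection rate divided by the higher; below 0.8 is a warning sign."""
    low, high = sorted([selection_rate(candidates, group_a), selection_rate(candidates, group_b)])
    return low / high if high else 0.0
```

If the ratio falls below 0.8 for any pair of groups, the shortlisting step warrants exactly the kind of human review Philby describes.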
It is essential to minimise bias in recruitment and keep selection objective and fair by taking steps to understand exactly what a job requires and finding the best ways to assess those attributes, whether through personality questionnaires, practical exercises or structured interviews. But as John Hackston, head of thought leadership at The Myers-Briggs Company, points out, while it is important to trust these results, human intuition and gut feeling should not be ignored. He says: “If your ultimate reaction is that you still don’t like the candidate, you need to take a moment to think about why. If you are going to be working directly with them it could be a valid concern. If you can identify something tangible about why you don’t prefer a candidate then this should be considered; although if it is just that the individual is different, it could be that different is what’s needed right now.”
Case study
The company assesses candidates against behavioural and preference profiles rather than previous experience. “We know that these are far more predictive of success than previous experience,” says head of talent Nick Gallimore. “We know the behavioural and preference profiles that are linked to success in certain types of roles, and we are looking to identify whether or not potential candidates exhibit those behaviours and preferences, and to what extent. We then base our hiring decisions on those factors.” Diversity has increased as a result and, in terms of specific data points, the company’s gender pay gap reduced by 2.6% between the first year of reporting (2017) and the second. “Our snapshot forward-looking data suggests it has narrowed further since then. We believe that our best practice is a direct driving force behind this,” says Gallimore.
The strategy is also applied to promotions, with 65% of non-entry-level roles filled internally, in every case using the same predictive behavioural approach. Gallimore points out that the UK has a culture of ‘accidental managers’, where the best individual performers in specific specialist areas are typically promoted into management, irrespective of whether or not they have the potential to be a great manager.
He says, “We are bucking this trend by identifying potential for future roles in our existing workforce, and ensuring that we are promoting people based not just on performance, but also against tangible data points that point to success in their next role."
Publication information
Originally published in Economia on 6 June 2019.