How to build a diverse and high performing workforce after the Covid-19 pandemic
Applying six key debiasing techniques from behavioural science to revamp your recruitment strategy
For most organisations, the Covid-19 pandemic has brought considerable uncertainty about what the future will look like. An unfortunate consequence of this uncertainty is that recruitment efforts are being put on hold (Barret, 2020). This new reality is challenging for both the affected organisations and potential new hires.
On the other hand, the new circumstances also offer opportunities. Time normally spent on attracting, recruiting and onboarding new colleagues is now available for other activities. Why not use this extra time to revamp your organisation’s recruitment strategy and start building a more diverse workforce? In this article we’ll outline six key debiasing techniques grounded in behavioural science.
It’s cheesy but true: people make (or break) an organisation. Your recruitment strategy is the starting point for the personalities, skillsets and talents that make up your company, which ultimately determine important outcomes such as employee productivity and engagement levels (Akhtar, Boustani, Tsivrikos, & Chamorro-Premuzic, 2015; Newman, 2005; Skyrme, Wilkinson, Abraham, & Morrison Jr, 2005).
No matter how motivated we may be to hire the best person for the job, our decision-making is prone to many behavioural biases that result in suboptimal recruitment decisions. In other words: we’re rejecting the candidates who would actually perform best on the job. At the same time, these suboptimal hiring decisions often lead to a lack of diversity in the workforce, which also hurts business outcomes. From a behavioural science point of view, it’s surprising how many organisations do not yet apply readily available knowledge in order to hire the best people in the least biased way.
By using the extra time available to revamp your recruitment strategy, your organisation will have optimally prepared itself to hire the best new colleagues when the aftershocks of the pandemic have calmed down.
Let’s face it: it is difficult to make an accurate assessment of how someone will perform in a job, based on a few hours of CV reading, assessing and interviewing.
In a large meta-analysis in 1998, Schmidt and Hunter looked at the past 85 years of research on personnel selection and discussed the effectiveness of 19 different selection methods. They found that a traditional, unstructured job interview explained only about 14% of the variance in actual job performance. An alternative selection method is “work sampling”: asking a candidate to perform tasks that resemble typical day-to-day tasks in the job they are applying for. Schmidt and Hunter found that out of all 19 selection methods, work samples were the most accurate predictor of how a candidate would perform on the job: they explained 29% of the variance in performance.
In addition to benefiting the hiring party, work samples are also helpful to the candidate: they give an honest idea of what the job would entail. Because work samples are beneficial to both sides, many consulting firms (including us) ask candidates to complete a case that mimics the types of challenges they would deal with on the job.
So, get creative and think of the best possible way to get your candidates to demonstrate how they would go about solving challenges of the job they are applying for. For example, ask them to conduct a bit of research, design a mock-up, have a client conversation or let them review a set of CVs.
Many interviewers ask the candidate to recall situations from the past where they demonstrated a certain skill or overcame a certain challenge (behavioural questions). While research shows that behavioural questions can be effective for assessing future performance (Barclay, 2001; Gibb & Taylor, 2003), these types of questions also carry some risks: candidates can polish their past stories (or even make them up completely). An alternative is asking situational questions: questions that are not about past experiences, but about how candidates would deal with hypothetical situations in the future.
An often-cited meta-analysis shows that these situational questions are also a valid predictor of future job performance (McDaniel, Morgeson, Finnegan, & Campion, 2001). Integrating these types of questions into your interviews reduces the scope for candidates to polish their answers. The added benefit is that this also moves you away from favouring candidates with lots of experience (and many picture-perfect stories to share) and gives room to talented candidates with less experience.
It’s tempting to turn an interview into a fluid conversation in which a candidate’s answer largely determines the follow-up questions and the subsequent course of the conversation. However, this can cause substantial differences in the topics discussed in each conversation, which does not help you make a fair assessment of how one candidate compares to another.
Research shows that structured interviews, where you stick to a list of predetermined questions, can help you make a more accurate assessment of the candidate’s performance than unstructured interviews (Macan, 2009). We’re not advocating for robotic interviews without any spontaneity. Instead, decide on a set of key metrics that you want to score candidates on before the assessment procedure starts. This allows for an easier and fairer comparison between candidates and will help you in your decision-making process (Uhlmann & Cohen, 2005). Research by behavioural science pioneer Daniel Kahneman indicates that up to six metrics leads to the best results. He also adds an important caveat to using assessment metrics: ‘Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better – try to resist your wish to invent broken legs to change the ranking’ (Kahneman, 2011).
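To make this concrete, the metric-based procedure Kahneman describes can be sketched in a few lines of Python. The metric names, weights and scores below are purely illustrative assumptions, not part of any published method:

```python
# Illustrative sketch of metric-based candidate scoring: decide on a fixed
# set of (at most six) metrics BEFORE the interviews, rate each candidate
# on each metric, and commit to hiring the highest total score.
# Metric names and ratings below are hypothetical examples.

METRICS = ["technical_skill", "communication", "reliability",
           "problem_solving", "teamwork", "motivation"]

def total_score(ratings: dict) -> int:
    """Sum the ratings for the predetermined metrics only,
    ignoring any other impressions recorded along the way."""
    return sum(ratings[m] for m in METRICS)

candidates = {
    "Candidate A": {"technical_skill": 4, "communication": 3, "reliability": 5,
                    "problem_solving": 4, "teamwork": 3, "motivation": 4},
    "Candidate B": {"technical_skill": 5, "communication": 4, "reliability": 4,
                    "problem_solving": 5, "teamwork": 4, "motivation": 3},
}

# Hire the candidate with the highest total score, gut feeling aside.
best = max(candidates, key=lambda name: total_score(candidates[name]))
```

The point of the sketch is the commitment device: because the metrics are fixed up front and the ranking is purely the sum of the ratings, there is no room to ‘invent broken legs’ after meeting a candidate you happen to like.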
Many big corporations such as eBay and Amazon have reported using brainteasers during assessment procedures (Kaplan, 2007). Brainteasers are unconventional questions which have no straightforward answer and therefore require an unorthodox way of thinking. Think of questions like “Why is a tennis ball fuzzy?” or “Why are manhole covers round?”. Recruiters often use these types of questions to assess whether candidates can “think on their feet” (Kaplan, 2007).
However, research shows little evidence that brainteasers actually predict job performance (Bock, 2015; Highhouse, Nye, & Zhang, 2018). Remember the research on work samples that supports using real-life work scenarios to test the capability of candidates: the situations posed in brainteasers are probably not relevant to the work someone will be doing. In addition, there is some evidence that brainteasers might actually hurt the recruitment process. Wright, Sablynski, Manson and Oshiro (2012) found that, compared with “traditional” interview questions, candidates experienced brainteaser questions as less procedurally fair. Candidates also indicated that their answers were less representative of their capabilities (regardless of the type of job they were applying for). And fun fact: Highhouse et al. (2018) also found that people who use brainteaser questions in interviews score higher on character traits such as sadism and narcissism.
So, unless the vacancy you wish to fill involves estimating how many ping pong balls fit in a room, don’t fall for the funny, clever-sounding brainteaser questions.
Even when we try to be as objective as possible until the end of the recruitment process, our minds are likely to form impressions of situations and people at rocket speed. Some studies suggest that within four minutes of an interview, a decision has already been made on whether to hire that candidate or not (Ambady & Rosenthal, 1992; Barrick, Shaffer, & Degrassi, 2009).
One situation where rapid assessment occurs is while reviewing CVs: profile picture, name, area of residence and type of school may trigger negative stereotypes associated with gender, age, ethnicity and socio-economic status (SES), and therefore influence decisions. For instance, one study shows that candidates were invited for an interview 50% more often when the name on their CV was associated with a white skin colour, such as Emily, than when it was associated with a darker skin colour, such as Jamal (Bertrand & Mullainathan, 2004). Another study showed that female applicants were seen as less competent and less hireable than male applicants, even when their applications were identical (Moss-Racusin, Dovidio, Brescoll, Graham, & Handelsman, 2012). How to prevent this? In a lab study, seeing multiple CVs next to each other at the same time decreased the chance that judgment of candidates was influenced by gender stereotypes (Bohnet, Van Geen, & Bazerman, 2016). Additionally, you could ask a colleague to cover up stereotype-prone information (such as name and area of residence) or ask candidates to leave out less relevant personal information (such as a photograph and date of birth).
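As an illustration, the ‘covering up’ step could even be automated before CVs reach reviewers. The sketch below is a minimal example under hypothetical assumptions: the field names are invented for illustration and would differ in any real applicant-tracking system:

```python
# Illustrative sketch: strip stereotype-prone fields from a CV record
# before it is shown to reviewers. All field names are hypothetical.

STEREOTYPE_PRONE_FIELDS = {"name", "photo", "date_of_birth",
                           "area_of_residence", "gender"}

def anonymise_cv(cv: dict) -> dict:
    """Return a copy of the CV with stereotype-prone fields removed,
    keeping only job-relevant information."""
    return {field: value for field, value in cv.items()
            if field not in STEREOTYPE_PRONE_FIELDS}

cv = {"name": "Emily Walsh",
      "photo": "emily.jpg",
      "area_of_residence": "Utrecht",
      "work_experience": "5 years in consulting",
      "education": "MSc Psychology"}

blind_cv = anonymise_cv(cv)  # keeps only work_experience and education
```

Reviewers would then score `blind_cv` records side by side, combining anonymisation with the joint-evaluation effect described above.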
Another situation where rapid assessment occurs is during the candidate interview. Interviews have been criticised as an assessment method because interviewers easily base their decisions on irrelevant information (Pingitore, Dugoni, Tindale, & Spring, 1994). For instance, there is plenty of evidence that we often hire people who are similar to ourselves in terms of hobbies, experiences and presentation styles (Rivera, 2012), even though these factors do not necessarily predict job performance.
As stated before, using a set of key metrics can help increase the fairness of a process. The previously mentioned study on assessment metrics by Uhlmann and Cohen (2005) shows that committing to hiring criteria before seeing any applicant information eliminated discrimination in their experiment. Furthermore, it’s important to constantly remind ourselves and our fellow recruitment colleagues just how strong our own biases can be in order to come to an objective assessment of the candidate.
As stated before, we tend to hire people who are like ourselves: people who fit our organisational culture (Rivera, 2012). Besides not being the most objective form of recruitment, this can also cause issues when it comes to diversity and inclusion. It prevents you from building a workforce that varies in gender, ethnicity, age, sexual orientation and other attributes. A lack of diversity in the workplace has been linked to decreased organisational performance, such as lower innovation and decreased responsiveness to market changes (Herring, 2009).
Instead of just looking for the famous ‘cultural fit’, train yourself to look for ‘cultural add’ as well. Before hiring a new colleague, it could be beneficial to consider what new values, experiences and/or world views they will bring to the company. One example of this approach in practice comes from Google: when looking for new colleagues, Google occasionally goes against its own selection criteria by purposely hiring candidates who ‘don’t fit’ (Bock, 2015).
Yes, it’s important that a new hire sparks joy, brings camaraderie and makes the workday fulfilling for colleagues, but by only hiring people who like golf, you will prevent your teams from reaching the multi-disciplinary, diverse thinking that’s needed to solve today’s and tomorrow’s challenges.
All in all, the evidence-based techniques outlined above offer plenty of potential to optimise your organisation’s recruitment strategy. Implementing these techniques will make sure you are fully prepared to hire the best new colleagues once the Covid-19 pandemic has passed. Better times will come and better hiring is within arm’s reach.
Joanne is a consultant at &samhoud and specialises in applying insights from behavioural science to organisations. Before joining in 2017, she worked at the UK Behavioural Insights Team for 5 years.
Merijn has been working as a consultant at &samhoud since 2019. With a background in behavioural and organisational psychology he looks to optimally understand and sustainably change behaviour.
Akhtar, R., Boustani, L., Tsivrikos, D., & Chamorro-Premuzic, T. (2015). The engageable personality: Personality and trait EI as predictors of work engagement. Personality and Individual Differences, 73: 44-49.
Ambady, N. & Rosenthal, R. (1992) Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis. Psychological Bulletin, 111(2): 256–274.
Barclay, J. M. (2001). Improving selection interviews with structure: organisations’ use of “behavioural” interviews. Personnel Review, 30(1): 81-101.
Barret, S. (2020). Coronavirus jobs survey: 49% of companies considering layoffs, more than one-third freezing new hires. Retrieved from https://www.cnbc.com/
Barrick, M. R., Shaffer, J. A., & Degrassi, S. W. (2009). What you see may not be what you get: relationships among selfpresentation tactics and ratings of interview and job performance. Journal of Applied Psychology, 94(6): 1394–1411.
Bertrand, M. & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. The American Economic Review, 94(4): 991-1013.
Bock, L. (2015). Work rules! Insights from inside Google that will transform how you live and lead. London: John Murray.
Bohnet, I., Van Geen, A., & Bazerman, M. H. (2016). When performance trumps gender bias: joint versus separate evaluation. Management Science, 62(5): 1225-1234.
Frieder, R. E., Van Iddekinge, C. H., & Raymark, P. H. (2015). How quickly do interviewers reach decisions? An examination of interviewers’ decision-making time across applicants. Journal of Occupational and Organizational Psychology, 89(2): 223-248.
Gibb, J. L. & Taylor, P. J. (2003). Past experience versus situational employment: Interview questions in a New Zealand social service agency. Asia Pacific Journal of Human Resources, 41(3):
Herring, C. (2009). Does diversity pay? Race, gender, and the business case for diversity. American Sociological Review, 74(2):208-224.
Highhouse, S., Nye, C. D., & Zhang, D. C. (2018). Dark motives and elective use of brainteaser interview questions. Applied Psychology, 68(2): 1-30.
Kaplan, M. (2007). Want a job at Google? Try these brainteasers first. Retrieved from https://money.cnn.com/
Macan, T. (2009) The employment interview: a review of current studies and directions for future research. Human Resource Management Review, 19(3): 203–218.
McDaniel, M. A., Morgeson, F. P., Finnegan, E. B., & Campion, M. A. (2001). Use of situational judgment tests to predict job performance: a clarification of the literature. Journal of Applied Psychology, 86(4): 730-740.
Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences of the United States of America, 109(41): 16474–16479.
Newman, J. L. (2005). Talents and type IIIs: The effect of the talents unlimited model on creative productivity in gifted youngsters. Roeper Review, 27(2): 84-90.
Pingitore, R., Dugoni, B. L., Tindale, R. S., & Spring, B. (1994). Bias against overweight job applicants in a simulated employment interview. Journal of Applied Psychology, 79(6): 909–917.
Rivera, L. A. (2012). Hiring as cultural matching: the case of elite professional service firms. American Sociological Review, 77(6): 999–1022.
Schmidt, F. L. & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology. Psychological Bulletin, 124(2): 262-274.
Skyrme, P., Wilkinson, L., Abraham, J. D., & Morrison Jr, J. D. (2005). Using Personality to Predict Outbound Call Center Job Performance. Applied H.R.M. Research, 10(2): 89-98.
Uhlmann, E. L. & Cohen, G. L. (2005). Constructed criteria: redefining merit to justify discrimination. Psychological Science 16(6): 474–480.
Wright, C. W., Sablynski, C. J., Manson, T. M., & Oshiro, S. (2012). Why are manhole covers round? A laboratory study of reactions to puzzle interviews. Journal of Applied Social Psychology, 42(11): 2834–2857.