Bookshelf bias – even with AI
People judge candidates by their backgrounds. But so does AI
Bookshelf bias is not just a figment of our imaginations, and it seems technology doesn’t deal with it either
I’ve been banging on about background bias for a while now, and about how what is visible during online interviews influences the interviewer. In another post, “New with remote working: background bias”, I said tongue in cheek: “Not everyone can have dedicated space with full leather-bound bookshelves, especially when combining this with home-schooling kids or co-working with a partner.” It would seem that bookshelf bias is not just a figment of our imaginations, but something quite real. What’s more, it seems that technology doesn’t help deal with it either.
The increasing use of artificial intelligence has prompted a discussion about how well software reduces bias and improves human decision-making. Every area of human interaction is susceptible to bias, including our hiring processes, especially now that significant chunks are carried out remotely. We have already established that background bias influences hiring managers when they see dirty dishes, unmade beds, and kids running amok. It would seem that not only do humans judge candidates by their backgrounds; so does AI.
Many of us clearly felt that a background of books lent our online presence some gravitas. Now we have some scientific evidence to back that up.
Hung Lee included this nugget in his Recruiting Brainfood newsletter. Research carried out by reporters from Bayerischer Rundfunk (German public broadcasting) set out to assess the ability of AI to filter out stereotypes and bias. Using a group of actors to simulate interviews, they examined software created by Retorio, a Munich tech start-up that focuses on video-based behavioural assessment. It seems that AI is swayed not only by the appearance of the candidates themselves, but also by their backgrounds.
The software is intended to analyse tone of voice, language, gestures, and facial expressions to create a behavioural personality profile. Uwe Kanning, Professor of Business Psychology at the University of Osnabrück, suggests that “The software should actually be able to filter out this information in order to be better than the gut feeling of any person who is susceptible to such influences.” But it didn’t.
What they found was that the software, which scores candidates against the OCEAN (Big Five) personality model, was indeed influenced by the candidates’ appearance. They also found the results shifted based on the background behind the candidate. This shift was particularly evident in four of the five components: openness, conscientiousness, extraversion and agreeableness.
Andrew McAfee of MIT said, “If you want the bias out, get the algorithms in.” Biases in human decision-making are well researched, especially in the hiring process. But maybe we need to do more work with AI too and dig into a number of root causes. A common thread is that the training data used to create algorithms often reflects the life and background of the engineers responsible.
Research suggests that as these individuals are frequently white men from high-income countries, the objects they identify will be familiar to them. Studies continue to demonstrate that tech companies “evaluate, define, and shape the world in their own image.” This includes bookshelves.
As a leader, manager or recruiter do you recognise your biases? Learn more with our Unconscious Bias Training Workshops
Found that interesting?
Learn more about our services
Make your dreams a reality with a professional evaluation of your career to date.
The evidence is in. More women in your company can deliver 35% greater financial returns. (Catalyst)
LinkedIn Live on Ageism, Friday 24th September, 2pm BST, with Hung Lee
Join Dorothy Dalton and colleagues – Jo Weech, Head of People (Exemplary Consultants), Jacob Sten Madsen, Talent Acquisition Advisor (Nielsen), Anne-Hermine Nicolas, Head of Executive Recruitment (ex-Deloitte), and Frank Zupan, Director of Talent Management (Associated Materials) – to discuss critical issues in Hung Lee’s Brainfood Live.
Dates for the Diary
September 21st - ENGIE Gender bias in Performance Assessment online
September 24th - LinkedIn Live on Ageism with Hung Lee
October 26th - Banque de Luxembourg, Gender bias in the recruitment process (Préjugés sexistes dans le processus de recrutement)
We have Remote Learning Programs available
Check out our exciting portfolio of offerings to support your business in upskilling and competence building for your teams, addressing the unprecedented challenges that women face in this new, totally digital world.
Download and listen free podcasts
Post-pandemic, the next generation gap, this time around remote working, is here. How can we avoid a generational clash?
Bystander tips for male allies – things need to change! See something. Say something. Do something! Bystanders are complicit.
Organisations will have to take action against ageist practices and policies so that we all benefit from an ageing population.