After more than a year of the covid-19 pandemic, millions of people are looking for jobs in the United States. AI-based interview software claims to help employers sift through applications to find the best people for the job, and companies specializing in this technology have reported a surge in business during the pandemic.
MyInterview and Curious Thing
But as demand for these technologies grows, so do questions about their accuracy and reliability. In the latest episode of MIT Technology Review’s “In Machines We Trust” podcast, we tested software from two companies specializing in AI job interviews, MyInterview and Curious Thing. We found variations in the predictions and job match scores that raise concerns about what exactly these algorithms are evaluating.
Getting to know you
MyInterview measures the traits considered in the Big Five Personality Test, a psychometric assessment often used in the hiring process. These traits include openness, conscientiousness, extroversion, agreeableness, and emotional stability. Curious Thing also measures personality traits, but instead of the Big Five, applicants are assessed on other metrics, like humility and resilience.
The algorithms analyze candidates’ responses to determine these personality traits. MyInterview also compiles a score indicating how well a candidate matches the characteristics that hiring managers identify as ideal for the job.
To conduct our testing, we first set up the software. We posted a fake job listing for an office administrator/researcher on MyInterview and Curious Thing. Then we built our ideal candidate by choosing personality traits as each system required.
On MyInterview, we selected characteristics such as attention to detail and ranked them by level of importance. We also selected interview questions, which are displayed on the screen while the candidate records video responses. On Curious Thing, we selected characteristics such as humility, adaptability, and resilience.
One of us, Hilke, then applied for the position and interviewed for it on both MyInterview and Curious Thing.
Our candidate completed a telephone interview with Curious Thing. She first went through the interview as a regular job applicant and scored 8.5 out of 9 for her English skills. In a second attempt, the automated interviewer asked the same questions, and she answered each by reading aloud the Wikipedia entry for psychometrics in German.
Still, Curious Thing gave her a 6 out of 9 for her English skills. She completed the interview again and received the same score.
Our candidate then turned to MyInterview and repeated the experiment. She read the same Wikipedia entry aloud in German. The algorithm not only returned a personality assessment, but it also predicted that our candidate was a 73% match for the fake job, placing her in the top half of all the candidates we had asked to apply.