Companies Are Outsourcing Job Interviews to AI. What Could Go Wrong?
Source Feed: Walrus
Author: Mihika Agarwal
Publication Date: June 2, 2025 - 06:30

In May 2024, Adam applied for a UX designer role at the US-based grocery chain Kroger. (He asked to be identified only by his first name to protect his identity.) Adam heard back from a company called HireVue. After emailing him a brief set of instructions—“Show passion!!” “NO headphones can be used”—the AI-based HR management platform invited him to do a recorded one-way interview online. The bot asked Adam five questions about his work experience; he was given exactly three minutes to answer each. “I had to prepare, focus and concentrate, repress my anxiety, and force a smile to gain the approval of a machine,” he recalls. The lack of a human on the other side, the need to maintain constant eye contact with his webcam, and the overt analysis of his verbal responses unnerved him; he felt like he “needed to meet or exceed the system’s criteria without knowing what that criteria was.” Toward the end of the interview, Adam was presented with “brain teaser” questions, including matching and memory games. The software gave him a combination of shapes, colours, and symbols and challenged him to memorize the patterns. “None of the brain-teaser games had anything to do with the role,” he says, adding that the entire process felt unfair and a waste of time. Even though he scored 97 percent on the test, he was not invited for a follow-up interview. To date, he does not know why.

Kroger is just one of the 98 percent of Fortune 500 companies that now use algorithms and artificial intelligence in their hiring processes. According to a recent survey by Indeed Canada, 87 percent of HR personnel are currently using AI-powered systems and tools at different stages of the hiring funnel, from generating job descriptions to scanning résumés, conducting background checks, and interviewing candidates.
The statistics are equally telling on the job seekers’ side: about two-thirds of surveyed Americans would not want to apply for a job if AI were used to help make hiring decisions, according to a 2023 report by Pew Research Center. Like Adam, many job seekers have felt demoralized, dehumanized, and discouraged trying to appeal to algorithms, or even to impress or outwit them. “It had flayed me alive and exposed my greatest fears for my career and the future of my industry,” one job seeker, who was met with an automated personality assessment, told the Guardian. Their fear seems valid, given that many employers are now outsourcing high-stakes decisions, like hiring, to bots. The professional futures of millions of human beings are being treated like just another constellation of data points. Critics warn that hiring algorithms can be biased, opaque in their goals, or just plain junk, causing real harm to candidates and preventing qualified people from getting jobs.

At the forefront of the AI hiring revolution is HireVue, a company founded in 2004 that has completed more than 70 million one-way job interviews. HireVue uses an analysis of candidates’ body language, word choice, and voice tone to generate an employability score, which is then passed on to its clients. Other popular AI-driven hiring tools include Plum and Harver, platforms that offer gamified tests to gauge applicants’ problem-solving skills, personality traits, and social intelligence. And then there’s a gamut of intelligence and background screening software promising hiring managers hidden insights into candidates’ social media footprints, their online history, and their predicted future behaviours. A cursory glance at these websites’ marketing materials gives a resounding message: they’re here to remove human bias from hiring, democratize it, and unlock untapped potential. Their audience?
Companies looking to cut labour costs through automation and HR teams overwhelmed by mountains of applications. Yet the lawsuits and criticisms these tools now face signal that they are perpetuating the same issues they promise to solve.

Hilke Schellmann is the author of The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now. “If a company uses a résumé parser to screen all incoming applications—potentially millions of résumés—and that parser has a defect, biased keywords, or isn’t working properly, it could discriminate against thousands or even hundreds of thousands of people,” she says. Schellmann, who tested several AI hiring tools for her book, explains how such algorithms tend to develop meaningless correlations, or proxies for discriminatory variables such as gender and race, which can lead to statistical misfires. “Maybe statistically, a lot of people in the job share the same hobby because you’ve hired people from your old high school baseball team. So now, many people in the job play baseball—but that has nothing to do with being an accountant. It doesn’t matter whether you play baseball or not.”

Though some lawsuits against HR software companies deal with privacy concerns, a lack of informed consent over the collection of biometric data, or deceptive practices, many have focused on discrimination. For example, forty-year-old Derek Mobley, who is Black and lives with depression and anxiety, has taken HR tech giant Workday to court, accusing its AI-driven applicant-screening tool of repeatedly rejecting him for more than a hundred jobs and alleging discrimination based on race, age, and disability. A few years ago, Amazon’s experimental AI recruiting system internalized proxies for gender and discriminated against women; the company eventually abandoned the project.
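The proxy problem Schellmann describes can be made concrete with a toy simulation. The sketch below is entirely hypothetical—synthetic data and an invented scoring rule, not any vendor's actual model—but it shows the mechanism: a screener that rewards resemblance to past hires ends up rewarding an irrelevant hobby that happens to correlate with group membership.

```python
# Hypothetical illustration of proxy discrimination in a naive screener.
# Past hires skew heavily toward group A, and group A members happen to
# play baseball; "skill" is the only feature that actually matters.
import random

random.seed(0)

def make_candidate(group):
    return {
        "group": group,
        "plays_baseball": random.random() < (0.8 if group == "A" else 0.1),
        "skill": random.random(),  # the genuinely job-relevant feature
    }

# Historical hires: 90 from group A, 10 from group B.
past_hires = [make_candidate("A") for _ in range(90)] + \
             [make_candidate("B") for _ in range(10)]

def screen(candidate):
    # Score = skill, plus a bonus for matching the dominant hobby
    # among past hires. The hobby has no causal link to the job,
    # but because most past hires share it, it dominates the score.
    baseball_rate = sum(h["plays_baseball"] for h in past_hires) / len(past_hires)
    score = candidate["skill"]
    if candidate["plays_baseball"]:
        score += baseball_rate
    return score

# 500 applicants from each group; accept the top 100 by score.
applicants = [make_candidate(g) for g in ("A", "B") for _ in range(500)]
accepted = sorted(applicants, key=screen, reverse=True)[:100]
rate_a = sum(c["group"] == "A" for c in accepted) / 500
rate_b = sum(c["group"] == "B" for c in accepted) / 500
print(f"acceptance rate, group A: {rate_a:.2f}; group B: {rate_b:.2f}")
```

Equally skilled applicants from group B are accepted at a much lower rate, purely because fewer of them play baseball—the statistical misfire Schellmann warns about, reproduced in a few dozen lines.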
LinkedIn faced a similar reckoning when its job-matching algorithms subtly favoured men, factoring in details like the number of skills listed on a résumé or engagement levels on the platform.

A 2020 investigation by the Center for Democracy and Technology found that most vendors’ websites fail to acknowledge that their employee selection tests may be inaccessible to certain users, or to mention employers’ legal obligation to provide reasonable accommodations to those unable to take the test. Since many one-way interview tools rely on speech-to-text transcription, individuals with speech impairments—or those with accents, such as international students seeking entry-level jobs—worry that automated systems might unfairly penalize them. In face-to-face interviews, they might have the opportunity to disclose an impairment, but this isn’t an option in one-way AI interviews. Even among the few vendors that do address this requirement, there is little consideration of the full range of accommodations that might be necessary. Using the case study of Pymetrics (which has since been acquired by Harver), Schellmann explains how standard hiring algorithms are engineered to identify statistical patterns among “high-performing” employees. Since individuals with disabilities are underrepresented in many companies due to historical biases, such algorithms often lack training data from these demographics.

Algorithmic bias and a lack of decision-making transparency—or “black box AI”—are the most pressing concerns for researchers and ethicists when it comes to AI hiring tools. Like Adam, many job seekers feel helpless, frustrated, and defeated when coming up against mysterious systems that determine their career trajectories.
According to Matthew Scherer, senior policy counsel at the Center for Democracy and Technology, US federal law typically does not require companies to disclose the AI systems they use, nor does it require AI hiring vendors to reveal how their systems function or what kind of anti-bias testing they have conducted. In some instances, such as when AI is used to evaluate résumés or analyze recorded video interviews, applicants may be completely unaware of its presence. They are left to rely on claims from vendors and client companies that the technology functions properly and makes fair decisions, even though some vendors themselves do not fully understand how their tools arrive at those decisions. “Having the course of your career driven by hidden, inscrutable algorithms would be a dystopian situation even if those algorithms were perfectly fair—and there is plenty of evidence that hiring algorithms tend to be far from fair,” Scherer says.

Algorithmic hiring also tends to reduce candidates to their skill set and past experience rather than recognizing their potential. Jodi Kovitz, chief executive officer of Ontario’s Human Resources Professionals Association, argues that bots lack the emotional intelligence, judgment, and critical thinking needed to gauge nuanced indicators of leadership potential. “Folks need to be open to adaptation, reskilling, and retraining. And while AI can detect patterns, it currently misses these intangibles, which we need humans to assess—like the instinct to evaluate leadership potential, team dynamics, and an individual’s likelihood to succeed,” she says. Schellmann’s book also points to research indicating that many résumé scanners automatically dismiss candidates with a full-time employment gap longer than six months, even though such gaps have no bearing on a candidate’s actual qualifications.
Currently, guardrails and remedies for the ethical, legal, and privacy concerns around algorithmic hiring tools are limited to patchwork legislation, which has been largely ignored by companies, according to Scherer. In March 2024, Ontario passed Bill 149, the Working for Workers Four Act, which obliges employers to disclose the use of AI in hiring decisions. But legal experts have called out its ambiguous terminology and lack of clear definitions around key terms like “artificial intelligence” and “publicly advertised job posting”—loopholes that employers are likely to use to their advantage. New York City’s Local Law 144, 2023 legislation mandating bias testing for AI recruitment tools and requiring companies to disclose their use to applicants, has received similar critique: advocates point to the narrow definitions of AI hiring tools, the lack of enforcement and adherence by lawmakers and companies, and the heavy onus on job seekers to parse the inaccessible fine print in employer disclosures. Similar criticisms have been levelled at the AI bias-mitigation guidelines implemented by the Equal Employment Opportunity Commission in the US, some of which have now been removed under the Donald Trump administration. The excision followed a January 2025 executive order directing federal agencies to roll back AI regulations “that act as barriers to American AI innovation.” With fewer guardrails, AI hiring systems could become even more opaque and discriminatory, leaving applicants with even fewer options for recourse if they feel unfairly rejected.

To level the playing field, job seekers have in turn started using generative AI tools to craft résumés and cover letters, and even tools that apply to thousands of jobs at a time on their behalf. Schellmann condemns the robot circle jerk. “How can hiring be meaningful if everyone is just joining this AI rat race, copying what’s in the job description?
That’s really worrisome, and there isn’t a clear solution on the horizon.” She warns about a future without applications or interviews. “We’re just classified—like, ‘this person should do this, and this one should do that’—and there’s no appeals process.” Hiring could become a closed loop of machines evaluating machines—one where algorithms, not people, determine career paths.

