What began as a hiring experiment in India may now reshape how a global consulting firm decides who makes the cut. Kearney, one of the world's largest consulting firms, is overhauling its hiring system to remove any human bias that creeps in during the selection process.
Piloted from India, the consulting firm, which competes with the likes of Boston Consulting Group (BCG) and McKinsey & Co., has taken on a mammoth mapping task: plotting the career trajectories of everyone who has applied to Kearney over the years. The company, from its India office, will also be among the first consulting firms to use artificial intelligence (AI) tools to interview candidates, another attempt to improve its accuracy.
“Kearney India has piloted a programme, wherein we are looking at thousands of resumes that have come to us over the years. We are identifying candidates and mapping their profiles to their current levels of performance, whether they are in the firm or elsewhere,” Siddharth Jain, managing partner and country head at Kearney India said. “This is helping us understand how accurate our selection and interview processes are/have been in the past in relation to the profiles we have/had selected or left out.”
Kearney's India office has about 800 consultants, but Mint could not ascertain the firm's global headcount across its 40 offices.
The goal: eliminate hiring mistakes
Kearney’s primary goal is to remove what are called type-1 and type-2 errors and “move to a near-perfect method of recruiting best-fit candidates.” A type-1 error occurs when a good candidate who would have fit the profile is not shortlisted; a type-2 error occurs when a candidate who is not fit to join gets selected.
“The AI tools are removing social, gender and concept biases that interviewers have. Some conglomerates are now implementing an AI round, which is the third-fourth round, and are proof-testing whether the interviews conducted so far were devoid of bias,” said Kaushik DasGupta, managing partner-India for executive search firm Odgers. "The tools will also detect if the candidate is being consistent with his/her answers and are beyond the resume-sifting process."
The new direction comes at a time of a fierce talent war in the consulting world. The likes of Bain & Co., Boston Consulting Group, McKinsey and Accenture have recruited students from the batch of 2026, according to the Indian Institutes of Management (IIMs), some of India's leading B-schools. These offers are driven by India’s growing role in complex, AI-driven work, with many global projects led by India teams.
Kearney India is also trying to get the first few rounds of interviews conducted by in-house AI tools. “Candidates will upload their resume and video messages, and the initial screening and evaluation are taken by AI. In this, we have noted the accuracy is higher as it is devoid of human biases,” said Jain. “This pilot was initiated by Kearney India in 2025 and is currently being implemented in a few of the campuses we hire from regularly.”
McKinsey and BCG did not respond to Mint’s queries, but the Financial Times reported in January that McKinsey had asked graduate candidates to use its AI tool Lilli while applying for global positions.
A similar story is brewing back home. In a move to keep up with the times, companies have allowed students at business school campuses to use AI tools as part of the hiring process. Mint wrote last October about how AI tools are now widely accepted in B-schools because case study analysis forms an integral part of their recruitment process. Unlike engineering colleges, where coding exams are conducted or students are asked to build products, management students are required to develop business strategy plans. Recruiters, including consulting firms, wanted to test the thought process behind the prompts candidates used and the conclusions they drew.
However, there is a case for caution in leaving AI to do the hiring. "AI can reduce human bias against characteristics such as age, gender, ethnicity, disability and religion, do a fairer assessment of candidates, and potentially increase diversity," said Rohini Lakshane, technologist and researcher.
"However, if the AI is trained on biased data or if the AI is unaudited for this purpose or use case, then there is a risk that it may perpetuate or even amplify these biases. For example, if a company historically hired predominantly men for a certain role, an AI trained on that data might disproportionately favour male candidates, even if it's not explicitly programmed to do so. The lack of diversity among AI developers can also lead to blind spots in the technology itself. This is a critical concern that requires careful auditing and ethical AI development and deployment," Lakshane said.
While consulting giants may still be testing AI tools for hiring, the practice is already common among startups, where the tools save cost and time by sifting through resumes.
Saumil Tripathi, co-founder of AI-led recruitment firm Grapevine, uses an in-house AI tool to select candidates for startups. “The voice-based AI tool, called Round 1, takes a 5-9 minute interview, and we cater to startups who have raised initial funds but have small recruitment teams. We take the first round, wherein we test the candidate’s aptitude, and then pass on the information to the client for interviews,” Tripathi told Mint. The three-year-old startup conducts 300 interviews a day.
The shift reflects a broader corporate attempt to bring consistency to one of the most subjective business decisions.
