The misguided use of algorithms to fix broken systems

On Friday, India’s National Testing Agency confirmed that the Joint Entrance Exam (JEE) and the National Eligibility cum Entrance Test (NEET) will be conducted on schedule, after the Centre said that the entrance examinations would not be postponed. The JEE (Main) examinations will be held over 1-6 September, while NEET will be conducted on 13 September. A government statement said: “In our opinion, though there is a pandemic situation, ultimately life has to go on and the career of the students cannot be put on peril for long and a full academic year cannot be wasted."

It appears that India’s policy of competitive entrance exams for its elite institutions of higher learning is a smarter choice than the UK’s approach, given the trouble faced by the latter’s aspiring undergraduates, and, by extension, by students at “international" schools in India that follow British boards and examination systems instead of relying on Indian public exams.

To aid college applications to UK universities in normal times, teachers at British schools issue “predictive" grades to students who are about to sit their public A-level (class 12) examinations. Alongside a student’s extra-curricular record, these predictive grades are among the main factors on which UK universities base their admission decisions for incoming Bachelor’s degree students. Such admissions are granted on a “conditional" basis: a university’s final decision rests on the actual scores achieved in the public examinations, which are released just before universities begin their autumn terms.

Given the covid pandemic, the British government directed its Office of Qualifications and Examinations Regulation, or Ofqual, to find an alternative to these school leaving qualifications. Earlier studies by Ofqual had established that teachers’ predictive scores could be biased by gender, ethnicity and age. To move away from such biases, Ofqual decided to use an algorithm to stand in for this year’s cancelled public examinations.

That algorithm should ideally have had two goals. One, to ensure fairness and avoid grade inflation; and two, to ensure that students get assessed as accurately as possible for university admissions. Under government directives, however, Ofqual ended up focusing on just the first goal.

This itself should have sounded a warning bell, given that moving away from teachers’ predictive scores would necessarily mean imposing arbitrary standardization constraints on the algorithm. And sure enough, arbitrary conditions were used. The algorithm corrected not only for a student’s own grade but also for the average performance of the student’s school, essentially by fitting a standardized model that forced the distribution of 2020 exam scores at each school to match that school’s distribution in 2019. This relentless “pursuit of the mean" meant that a student’s score could be ratcheted way down based on how his or her seniors in school had performed the year before, rather than on his or her individual performance.
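The distribution-matching described above can be sketched as a simple quantile-mapping exercise: rank this year’s students and hand each one the grade at the same rank position in the school’s previous-year results, regardless of their own marks. The function and sample data below are a hypothetical illustration of the idea, not Ofqual’s actual model.

```python
# Hypothetical sketch: map 2020 students (ranked by teacher assessment,
# strongest first) onto the school's 2019 grade distribution.

def standardize(ranked_2020_students, grades_2019):
    """Assign each student the grade at their rank in last year's cohort."""
    grades_2019 = sorted(grades_2019)  # "A" sorts first, i.e. best grades first
    n_new, n_old = len(ranked_2020_students), len(grades_2019)
    assigned = {}
    for i, student in enumerate(ranked_2020_students):
        # take the grade at the equivalent percentile of the 2019 cohort
        j = min(int(i * n_old / n_new), n_old - 1)
        assigned[student] = grades_2019[j]
    return assigned

# A school whose 2019 cohort earned mostly middling grades:
last_year = ["A", "B", "B", "C", "C", "C", "D", "D"]
# 2020 students ranked by their teachers, strongest first:
this_year = ["Asha", "Ben", "Chloe", "Dev"]

print(standardize(this_year, last_year))
# → {'Asha': 'A', 'Ben': 'B', 'Chloe': 'C', 'Dev': 'D'}
```

Note that Dev receives a D purely because the school’s 2019 cohort bottomed out at D; his own ability never enters the calculation, which is exactly the injustice the column describes.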

To put this in an Indian perspective, it would be the equivalent of saying that a student from a government or government-aided school could not shine in a public competitive exam simply because past cohorts of that school had performed below average. Discrimination as blatant as this beggars belief. In the UK, it was so widespread that 40% of all students received grades lower than their teachers’ predictions, causing them to lose their conditional spots at various universities.

Little wonder, then, that students and parents protested. On 16 August, hundreds demonstrated before the UK’s Department for Education building in London. Making matters worse was the revelation that Ofqual had engaged Public First, a company with links to Michael Gove and Dominic Cummings, both powerful politicians in the UK’s ruling party. The Good Law Project, a crowd-funded legal organization, has threatened legal action against Ofqual over its choice of Public First. Within 24 hours of the protests, Ofqual reversed its decision. It now says that students will be given the higher of the teachers’ predictive grade and the one issued by the algorithm. This has calmed protesters, but it also means that the containment of grade inflation sought by Ofqual has been blown to smithereens.
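Ofqual’s reversal amounts to a one-line rule: for each student, keep whichever of the two grades is better. A minimal sketch, assuming a standard A-level grade ladder (the names and ordering below are my illustration, not Ofqual’s published code):

```python
# Hypothetical post-reversal rule: each student keeps the higher of the
# teacher's predictive grade and the algorithm's grade.
GRADE_ORDER = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "U": 0}

def final_grade(teacher_grade, algorithm_grade):
    """Return whichever grade ranks higher on the A-level ladder."""
    if GRADE_ORDER[teacher_grade] >= GRADE_ORDER[algorithm_grade]:
        return teacher_grade
    return algorithm_grade

print(final_grade("B", "D"))  # → B: the algorithm's downgrade is discarded
```

Because the rule can only ever raise a student’s result relative to the algorithm’s output, any grade deflation the model imposed vanishes while every inflated teacher prediction survives, which is why the goal of containing grade inflation collapses.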

There is probably no better example of algorithmic discrimination, about which I have written before. If a system is flawed to begin with, trying to fix its flaws with a ham-handed approach like regressing everyone to a mythical mean is only going to make matters worse. Had the second objective of accurately assessing a student’s performance been paramount, it is likely that Ofqual would have found a more nuanced answer to the problem by adopting both qualitative and quantitative methods. This too would have had its share of problems, but would almost certainly not have affected as many as 40% of students.

Organizations the world over are beginning to find that applying over-simplistic artificial intelligence (AI) algorithms to what are highly nuanced and complex problems doesn’t work. Many of these problems are beyond the realm of data science, and we must accept that it will be years before AI algorithms are refined enough to be used as a cure-all for all that ails the world.

Siddharth Pai is founder of Siana Capital, a venture fund management company focused on deep science and tech in India
