Why the government’s recent report on educational outcomes is misleading

Increased government spending on education has failed to improve learning outcomes

Amidst the brouhaha over Delhi University’s undergraduate programme, a recent report on school education by the education ministry has largely escaped attention. The annual flash statistics report released last week by the District Information System for Education (DISE) contains several surprising findings. Assam, which has a literacy rate of 72%, is ranked No. 1 in primary education outcomes, while Kerala, with a literacy rate of 94%, is ranked close to the bottom on the same parameter. That’s not all. States such as Jharkhand and Chhattisgarh, widely perceived to be laggards in education, are among the top-ranked states in primary education outcomes, according to the ministry.

Taken at face value, the ministry’s report suggests an education revolution in the country. A closer look at the data, however, suggests serious flaws in the way outcomes have been measured, which has led to the counter-intuitive rankings. The outcome parameters used by the government are largely input-centric and do not capture actual learning outcomes. The variables the government uses to measure outcomes include the average number of instructional days, average working hours for teachers, gross enrolment ratio, and dropout rates, among others. The only variable that measures actual outcomes is the transition rate from primary to upper-primary school, but even that variable is subject to manipulation.

A Mint analysis that compared the outcome rankings furnished by the government with learning outcomes furnished by the latest Annual Status of Education Report (ASER) published by Pratham shows that the two are negatively correlated: states which score high on government rankings are actually delivering poor quality education. Learning levels in reading and math in most states that rank high in the DISE rankings are below the national average, as the accompanying interactive chart shows. The state-wise rankings were adjusted to eliminate those states included in one report but not in the other. For example, Pratham has no data for Arunachal Pradesh, Daman and Diu, Dadra and Nagar Haveli, and Goa in 2013-14.
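The alignment step described above can be sketched in code. The snippet below is a minimal illustration of the idea, not the actual analysis: the state names are hypothetical placeholders, and it simply drops states absent from either ranking and re-ranks the survivors by their original order.

```python
# Hypothetical sketch of adjusting two rankings to a common set of states.
# State names "A".."E" are illustrative, not the actual DISE/ASER data.
dise = ["A", "B", "C", "D", "E"]   # best to worst in one report
aser = ["C", "E", "A", "B"]        # best to worst in the other

# Keep only states that appear in both reports
common = set(dise) & set(aser)
dise_adj = [s for s in dise if s in common]
aser_adj = [s for s in aser if s in common]

# A state's adjusted rank is its position in the filtered list
print(dise_adj)                    # ["A", "B", "C", "E"] -- "D" dropped
print(dise_adj.index("E") + 1)     # "E" moves up to rank 4
```

Re-ranking after filtering matters: without it, the surviving states would carry gaps in their rank numbers, which would distort any rank-correlation computed later.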

While Assam ranked No. 1 in outcomes in the government’s 2013-14 report, only 46% of Standard III-V students surveyed in the state could read a Standard I text in their native language, and only 30% of Standard III-V students surveyed could do subtraction. Based on Pratham’s findings, Assam ranks 25th in both reading and mathematics.

Several states that rank low according to the government rank very high on learning outcomes according to Pratham. For example, Mizoram ranks last in outcomes according to the government (after removing states not included in ASER 2013-14), but ranks No. 1 in reading and mathematics according to Pratham’s 2013-14 report. This means that Mizoram’s primary school students are learning more than those in other states, despite schools in Mizoram scoring poorly on input variables like enrolment and dropout rates.

The rank-correlation coefficient between the government’s rankings and Pratham’s reading proficiency rankings was -0.42. The corresponding figure for the government’s rankings and Pratham’s mathematical proficiency rankings was -0.35. A negative correlation implies that states placed near the top of one ranking tend to be placed near the bottom of the other, and vice versa.
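A rank-correlation coefficient of this kind can be computed with Spearman’s formula, which compares the squared differences between two sets of ranks. The sketch below uses made-up ranks for five hypothetical states, not the actual report data, and assumes no tied ranks for simplicity.

```python
# Spearman's rank correlation: rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)),
# where d is the difference between a state's two ranks.
def spearman(rank_x, rank_y):
    n = len(rank_x)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_x, rank_y))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical ranks (1 = best) for five states under two rankings
govt_rank = [1, 2, 3, 4, 5]
aser_rank = [4, 5, 1, 3, 2]

print(spearman(govt_rank, aser_rank))  # -0.6: the rankings disagree
```

A value near -1 means the two rankings are close to reversed; a value near 0 means they are unrelated; a value near +1 means they largely agree, which is what one would expect if input rankings tracked learning outcomes.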

This suggests that state governments that are able to fund more schools and are showing improvements in enrolment rates are actually teaching less to the average school-going student. In the rush to fund more schools, the quality of primary education seems to have been compromised.

Highlighting this discrepancy between educational inputs and outcomes is critical, said Rukmini Banerji, co-founder of Pratham. Ideally, states ranked high on educational inputs should rank high on learning outcomes as well, she said.

There are indeed some states, such as Himachal Pradesh and Bihar, for which the government and Pratham rankings agree. Himachal is in the top league in both while Bihar ranks close to the bottom in both. But most states diverge in the two rankings, indicating that much of the additional resources put in by the respective state governments over the past few years are not having the desired impact.

Declining learning levels in India’s schools are well established by now. Studies other than ASER have also reported the abysmal learning levels among Indian school children. Even students in the relatively better performing states of Himachal Pradesh and Tamil Nadu were placed near the bottom of international rankings in a 2009 study by the Programme for International Student Assessment.

Over the past few years, the government has faced criticism for focusing only on input variables rather than on educational outcomes. The latest ministry report compounds the problem by misclassifying input variables as outcome indicators, completely masking the crisis in India’s school education system. Earlier government reports at least included the percentage of children securing 60% or more marks. While this is a better measure of learning outcomes, it too is subject to manipulation. In any case, it ceased to be included in the calculations from 2010 onwards.

The first step towards solving India’s school learning crisis will be to acknowledge the depth of the crisis by using parameters that reflect actual learning outcomes.