Algorithms used in AI systems can deliver unfair outcomes when applied in diverse societies such as India's, research finds
Governments are increasingly using artificial intelligence and machine learning in decision-making. But are the underlying algorithms suitable for countries such as India? A recent study suggests they may not be, since they were designed for Western societies. Such algorithms may, for example, fail to account for religious or caste bias and could treat oppressed minorities unfairly.
The study, by Nithya Sambasivan and others at Google Research, US, is based on interviews with 36 academics from various fields and activists working with marginalized communities.
Many of these algorithms assume that the available data is representative of society. But in Indian datasets, those with internet access, who make up only about half the population, are overrepresented. This means safety apps that invite users to flag unsafe areas in a city may mark Dalit and Muslim neighbourhoods as unsafe, reflecting the prejudices of the apps' middle- and upper-class users.
Using artificial intelligence is aspirational for the Indian government. Because it is seen as futuristic, it is trusted without question. But this trust can be misplaced. The study's authors point out that when police use facial-recognition systems trained on photographs of undertrial prisoners to identify protestors, they could end up disproportionately targeting Dalits and Muslims, since more than half of India's undertrials come from these communities.
The authors make several suggestions to correct these shortcomings. One is to empower marginalized groups with low-cost devices so that they can come online, represent themselves and produce knowledge about their communities, making Indian datasets less distorted and more trustworthy. Another is to educate journalists, activists and lawyers, so that, as in the West, there is an ecosystem of people with the technical training to question the use of AI systems and hold practitioners accountable.