Does India’s rapid adoption of AI in governance risk deepening social inequality?

AI in India's justice system could worsen bias against Dalits, Adivasis, and minorities. Ethical safeguards are needed to ensure inclusion and human rights.

India’s rapid adoption of AI in governance and criminal justice promises efficiency—but without proper safeguards, it risks deepening existing social inequalities.

From the Crime and Criminal Tracking Network and Systems (CCTNS) to the Inter-operable Criminal Justice System (ICJS), tech-led tools are being deployed without a clear legal framework. Meanwhile, the datasets powering AI reflect a narrow and biased view of Indian society, excluding large sections of women, Dalits, Adivasis, Muslims, and rural populations. A 2022 Oxfam India report noted that women in India use the internet 33% less than men, and that only 31% of rural Indians are online, leaving critical voices and experiences missing from AI systems.

This skew in representation has grave implications. In criminal justice, where marginalised groups are already overrepresented among undertrials and prisoners, AI tools trained on historical records may reinforce existing bias. National Crime Records Bureau (NCRB) data from 2018 shows that roughly two-thirds of Indian prisoners are Dalits, Adivasis, or OBCs, the very groups also underrepresented in digital spaces. Decisions built on such historically skewed data risk perpetuating discrimination.

Globally, the risk is well documented. COMPAS, a risk-assessment algorithm used in US bail and sentencing decisions, was found to falsely label Black defendants as high-risk nearly twice as often as White defendants with similar records. In India, a High Court's reported use of ChatGPT while deciding a bail plea, along with evidence of bias in AI hiring tools, raises red flags about AI's unchecked use.

India’s digital and data inequality demands urgent regulatory oversight. Without ethical frameworks, AI tools can replicate caste, religious, and class-based hierarchies in policing, sentencing, and beyond. Technology must be developed with transparency, accountability, and diverse representation. Rather than reinforcing stigma, AI should empower the underrepresented and help correct systemic bias in the criminal justice system.
