
India's Draft Digital Personal Data Protection Rules promise stronger privacy but overlook structural gender inequities embedded in digital systems. Drawing on feminist theory, empirical data, and research on algorithmic accountability, this article demonstrates how gender-blind consent frameworks, male-dominated datasets, and the absence of compulsory gender-disaggregated impact assessments expose women and gender minorities to new layers of discrimination. It critiques particular provisions of the Draft Rules and outlines a five-step, gender-informed reform agenda: mandatory bias audits, context-aware consent, intersectional oversight, heightened safeguards for sensitive contexts, and transparency mechanisms. Substantive equity, not symbolic language, must anchor India’s data regime.
Introduction
In a digitalised society, where personal data governs critical decisions, ensuring robust data privacy and protection standards is not only essential but an ethical imperative. The Draft Digital Personal Data Protection Rules, released at the start of this year, seek to do exactly this. While well-intentioned, the Rules fall short of incorporating a gender-sensitive lens, particularly in ensuring algorithmic fairness, meaningful consent processes, and tailored grievance-redressal mechanisms.
At first glance, the DPDP Rules appear to be designed in a rather progressive manner, as evidenced by their use of inclusive pronouns, such as "she/her," to refer to data principals. However, as Anna Brandusescu (2018) tells us, "Data politics are not neutral. Not on a technological level, nor a socio-economic or socio-cultural level." Such linguistic shifts, though symbolically valuable, are therefore insufficient without explicit gender-responsive mandates.
Rule 12(3) outlines obligations for Significant Data Fiduciaries (SDFs), requiring annual Data Protection Impact Assessments (DPIAs) to ensure that algorithmic software "deployed by it for hosting, display, uploading, modification, publishing, transmission, storage, updating or sharing of personal data processed by it are not likely to pose a risk to the rights of Data Principals". Missing here is a requirement to explicitly identify, assess, or mitigate gender-based bias. This reflects a broader systemic oversight in India's digital governance landscape. Without targeted provisions and interventions, marginalised gender identities are left more acutely vulnerable.
Why is Algorithmic Discrimination a Problem?
Structural inequalities that we see in the physical world are also encoded into digital systems and algorithms. Individuals creating these systems are not immune to biases or oversights themselves, as they are shaped by the same structures and systems that create and sustain these inequalities.
To begin with, there's a systemic divide in how men and women use technology. Indian women are 15% less likely to own a mobile phone and 33% less likely to use mobile internet services than men (ORF Expert Speak, 2021). As per the National Family Health Survey (2019-21), only one in three women in India (33%) have ever used the internet, compared to more than half (57%) of men. The lack of data from women skews the training datasets used for designing AI models and algorithms, potentially leading to biased decision-making processes that disadvantage women in areas such as loan approvals, job applications, and access to AI-powered healthcare services.
The problem extends beyond mere representation in datasets. Only 12% of AI researchers and 6% of professional software developers worldwide are women (UNESCO, 2019). In the long term, this has a compounding effect where algorithms are designed predominantly by men, trained on male-dominated datasets, and deployed in contexts where women have limited digital access and representation.
The need for algorithmic accountability to address such oversights is evidenced by studies, such as the one by Buolamwini & Gebru (2018), which measured accuracy in commercial gender-classification algorithms and found substantial biases. They noted that algorithms consistently performed worst for darker-skinned females and best for lighter-skinned males, highlighting significant intersectional disparities based on gender and race.
In the Indian context, where social and structural barriers already constrain women's economic participation, algorithmic bias compounds these challenges by creating additional layers of discrimination, often invisible to both users and regulators.
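To make the idea of a bias audit concrete, the following is a minimal sketch of the kind of gender-disaggregated accuracy check a DPIA could mandate, in the spirit of Buolamwini and Gebru's subgroup analysis. The function names, the 5% gap threshold, and the toy data are all illustrative assumptions, not part of the Draft Rules or the cited study:

```python
from collections import defaultdict

def subgroup_accuracy(records):
    """Compute prediction accuracy per demographic subgroup.

    `records` is a list of (subgroup, prediction, ground_truth) tuples.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, truth in records:
        total[group] += 1
        correct[group] += int(pred == truth)
    return {g: correct[g] / total[g] for g in total}

def audit_gap(records, max_gap=0.05):
    """Flag the model if the best-to-worst subgroup accuracy gap
    exceeds `max_gap` (threshold chosen for illustration only)."""
    acc = subgroup_accuracy(records)
    gap = max(acc.values()) - min(acc.values())
    return {"accuracy": acc, "gap": gap, "flagged": gap > max_gap}

# Illustrative toy data: (subgroup, model prediction, ground truth).
toy = [
    ("darker-skinned women", 0, 1), ("darker-skinned women", 1, 1),
    ("lighter-skinned men", 1, 1), ("lighter-skinned men", 0, 0),
]
result = audit_gap(toy)
```

On this toy data the audit reports a large accuracy gap between subgroups and flags the model. A regulatory mandate would, of course, specify the protected categories, sample sizes, and thresholds; the point of the sketch is that such disparities are straightforward to measure once disaggregated data is required.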
A Feminist Reading of the Principles of Consent
The Draft Rules’ gender-blindness extends beyond algorithmic design. Despite establishing consent management frameworks through Rules 3, 6, 7, and 13, the Rules fail to account for contextual factors that make meaningful consent challenging for women, including digital literacy gaps, lack of economic autonomy, and social pressures that create dependence and power inequalities.
Moreover, personal data — and consensual access to it (or the absence thereof) — is not something shaped in a vacuum. As Jain and Kovacs (2020) argue in Informed Consent — Said Who? A Feminist Perspective on the Principles of Consent in the Age of Embodied Data, the dominant consent framework, which reduces personal data to inert commodities, neglects its deeply embodied nature. Biometric identifiers, location logs, and even intimate search histories are profoundly tied to individual identities and physical bodies.
This matters for three reasons:
First, it helps us account for the coercive nature of certain forms of consent, where no actual choice exists. This is true for welfare schemes, for instance, where collection of certain types of data — not just for women, but for everyone — is mediated by access to essential services. The quasi-coercive nature of consent in such contexts helps us identify problematic consent mechanisms that compromise individual autonomy and choice.
Second, this perspective recognises that surveillance is the historical tool of patriarchy, used to control and monitor women and marginalised communities, and therefore requires more sophisticated approaches to consent that acknowledge power imbalances and contextual factors.
Third, and perhaps most importantly for India, the idea emphasises that informed, meaningful consent can be the foundation for ensuring the rights to self-determination, autonomy, and freedom only when there’s a situation of power equality. In overlooking this critical dimension in the name of an assumed neutrality, the Draft Rules render consent as perfunctory rather than meaningful, disproportionately affecting women and gender minorities whose autonomy in digital spaces remains precariously balanced.
Articulating Data-Protection Laws Using a Gender Lens
Moving towards substantive equity requires us to recognise how laws around data protection are being designed in the first place. As Yadav et al. (2023) remind us, gender-blind design is often premised upon the inaccurate notion that “designing equally is the same as designing for equality”. The presumed neutrality of gender-blindness forgoes the specific, nuanced efforts needed to ensure equality, effectively “favouring hegemonic values and epistemologies, which counters the purported aim of equality”.
To envision changes in data protection rules informed by gender necessitates the centring of participatory design principles. These principles underscore the participation of affected communities in the design process, recognising different forms of expertise, engaging in meaningful consultation with women and gender minorities to develop consent mechanisms, and prioritising user agency and empowerment.
A gendered approach to data protection policy-making would have five essential elements:
1. Mandatory bias audits: DPIAs under Rule 12(3) should explicitly require the identification, measurement, and mitigation of gender-based bias, with findings disaggregated by gender.
2. Context-aware consent: consent mechanisms designed around digital literacy gaps, economic dependence, and social pressures, rather than an assumed power equality between fiduciary and data principal.
3. Intersectional oversight: meaningful representation of women and gender minorities in regulatory bodies and in the participatory design of data systems.
4. Heightened safeguards for sensitive contexts: stricter standards where consent is quasi-coercive, such as welfare schemes in which data collection mediates access to essential services.
5. Transparency mechanisms: public disclosure of impact-assessment outcomes and accessible, gender-responsive grievance-redressal channels.
These recommendations are not meant to be silver-bullet answers. Instead, they represent the minimum provisions necessary for moving beyond neutrality and towards deliberate equity. As India advances its digital transformation through initiatives like Digital India and the expansion of digital public infrastructure, algorithmic systems deployed today will shape opportunities and outcomes for generations of women and gender minorities. A data protection framework that fails to account for structural inequalities will not simply perpetuate existing disparities but amplify them at scale, creating newer, more pervasive forms of discrimination that are more difficult to tackle.
The choice before policymakers is not between efficiency and equity, or about what measures are the most manageable, but between superficial compliance and meaningful protection.
References
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In S. A. Friedler & C. Wilson (Eds.), Proceedings of the 1st Conference on Fairness, Accountability and Transparency (pp. 77–91). Proceedings of Machine Learning Research, 81. https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
International Institute for Population Sciences (IIPS) & ICF. (2021). National Family Health Survey (NFHS-5), 2019–21: India (Vol. I & II). IIPS. http://rchiips.org/nfhs/NFHS-5Reports/NFHS-5_INDIA_REPORT.pdf
Jain, R., & Kovacs, A. (2020). Informed consent—Said who? A feminist perspective on principles of consent in the age of embodied data [Preprint]. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3788322
Ministry of Electronics and Information Technology. (2025, January 3). Draft Digital Personal Data Protection Rules, 2025. Government of India. https://dpdpa.com/dpdprulesofficial.pdf
ORF Expert Speak. (n.d.). India’s gendered digital divide: How the absence of digital access is slowing women down. Observer Research Foundation. https://www.orfonline.org/expert-speak/indias-gendered-digital-divide
UNESCO. (2019). I’d blush if I could: Closing gender divides in digital skills through education. United Nations Educational, Scientific and Cultural Organisation. https://unesdoc.unesco.org/ark:/48223/pf0000367416
Web Foundation. (2018, February 7). Gender must be central to the data protection conversation, not a side note. https://webfoundation.org/2018/02/gender-must-be-central-to-the-data-protection-conversation-not-a-side-note
Yadav, R., Henriques, S., Agarwal, P., & Pal, J. (2023). Designing for the margins: A gendered perspective on digital consent in India. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. https://doi.org/10.1145/3544549.3582750
