India’s DPDP law puts HR under the microscope—Here’s why that’s a good thing

When India passed the Digital Personal Data Protection (DPDP) Act, it signalled a watershed moment in the country’s data governance regime. Framed in a landscape increasingly dominated by artificial intelligence, remote work, and global information exchange, the Act redefined how personal data must be treated—with dignity, caution, and accountability. Yet in corporate boardrooms and HR departments across India, many still see it as a problem for the legal or IT team to solve. That’s a fundamental misstep.
“The biggest misconception,” says Mini Gupta, Partner, Cybersecurity Consulting at EY India, “is that the DPDP Act is just an IT or legal issue. It’s not. It’s a people issue.”
And herein lies the fault line. While the Act is often interpreted through the lens of compliance or cybersecurity, it actually calls for a deep cultural transformation—one that places responsibility not only on systems and frameworks but also on individuals and teams who handle sensitive information every day.
Let’s dive into why the DPDP Act is a people mandate at its core, the urgent actions HR leaders must take, and how businesses can navigate the Act not just to avoid penalties—but to build trust, strengthen culture, and lead responsibly.
The Myth of Tech-Only Ownership
At first glance, DPDP appears to mirror other data privacy frameworks like GDPR or CCPA. There’s talk of consent, purpose limitation, secure storage, and rights of the data principal (i.e., the individual). But the Indian legislation’s implications ripple far beyond IT configurations or privacy policies.
“Mention data protection, and it often gets handed off to the legal or IT teams,” says Gupta. “But that misses the point. Every team that touches personal data is responsible under this law.”
For HR departments, this shift is seismic. Gupta underscores how HR sits atop a “goldmine” of personal information—addresses, Aadhaar numbers, medical history, performance reviews, family details, even biometric data in some cases. And this isn't limited to employees; applicants and former workers are also in scope.
Too many HR leaders still assume the law doesn’t apply to them because they don’t handle customer data. But the DPDP Act does not discriminate. If it’s personal data, it’s protected—regardless of whose it is.
HR as Data Fiduciaries: The Invisible Centre
What’s striking about Gupta’s framing is her insistence that HR isn’t just a stakeholder in data compliance; it is a data fiduciary in action. That means acting not as a bystander but as a custodian of the data it holds.
So, what does this look like in practice?
- Consent is king: HR must ensure explicit, informed, and specific consent is obtained, particularly for data used beyond employment purposes. No more burying it in contracts or relying on implied consent.
- Purpose limitation is paramount: Data collected for recruitment can’t be repurposed for performance evaluations or training algorithms unless the employee has agreed to it.
- Third-party vigilance is non-negotiable: Payroll vendors, background check agencies, health benefit partners must all be DPDP compliant, and HR is accountable for ensuring they are.
“This law demands that HR leaders ask new questions daily,” Gupta says. “Do we really need this data? Are we storing it securely? Have we told the person why we’re collecting it?”
The takeaway? DPDP is not about box-ticking. It’s about a principled rethink of data handling—by design, not by default.
CHROs: From Policy Enforcers to Culture Builders
One of the more urgent messages from Gupta is a call to CHROs: move beyond policies and become architects of a privacy-first culture.
“They are at the intersection of people, processes, and data,” she notes. “That makes them uniquely positioned to lead this transformation.”
Rather than waiting for IT audits or legal briefs, CHROs must:
- Lead with awareness: Make privacy and ethics part of onboarding, training, and leadership development, not just a compliance module.
- Drive process redesign: HR workflows built in a pre-DPDP world often collect too much data. These must now be rebuilt to align with data minimisation principles.
- Oversee ethical AI use: At a time when AI is used to screen CVs or track productivity, CHROs must evaluate bias risks, ensure transparency, and protect dignity.
- Be the human face of compliance: Explaining data rights in plain language, not legalese, builds trust in ways no policy ever could.
If CHROs lead with purpose, Gupta argues, the shift from “data is an asset” to “data is a responsibility” becomes more than a slogan—it becomes embedded in culture.
Non-Negotiables for Compliance
Gupta lays out a tactical list of actions HR teams must prioritise immediately.
- Get consent right: No vague checkboxes or buried terms. Consent must be informed, explicit, and separate for each processing activity that goes beyond the employment contract.
- Communicate with clarity: What data is collected? Why? For how long? With whom is it shared? This must be explained in human, not legal, language.
- Embrace data minimisation: If it’s not needed, don’t collect it. This applies to emergency contacts, health details, Aadhaar numbers; every piece of data must have a purpose.
- Secure the flow: Sensitive data should not sit in email threads or shared drives. HR must work with IT to define strict access controls.
- Monitor third-party platforms: Vendors used for payroll, wellness, and recruitment must be DPDP compliant. Contracts should explicitly state privacy obligations.
- Educate the frontlines: Training HR staff on the law is not optional; it’s essential. They must know how to respond to access or deletion requests, and how to spot a breach.
This isn’t just about avoiding penalties. Gupta puts it plainly: “It’s about earning and maintaining employee trust.”
Designing Training That Sticks—Not Ticks
If traditional compliance training was often a one-and-done slideshow, the DPDP Act demands something richer and more lasting.
“Checkbox training doesn’t work anymore,” says Gupta. “You can’t change behaviour with a 10-minute video and a quiz.”
Instead, HR must develop training that is:
- Role-specific: A recruiter’s risks differ from a line manager’s. One-size-fits-all won’t cut it.
- Scenario-based: Real-world examples, such as what to do when handling files on a shared drive, bring the law to life.
- Story-driven: Sharing anonymised breaches or internal missteps helps teams internalise consequences.
- Embedded in culture: From onboarding to weekly team meetings, data responsibility should be a recurring theme, not an annual ritual.
Above all, Gupta insists the message must be human. “Frame it around trust and respect,” she urges. “Not just compliance.”
Bridging the HR-Cybersecurity Divide
Perhaps the most practical insight Gupta offers is about collaboration between HR and cybersecurity. Today, these functions often operate in silos—a risky proposition under the DPDP Act.
“HR knows what data is being collected and why. Cyber knows how to secure it. But neither can succeed alone,” she warns.
What’s needed is alignment, not just cooperation. Gupta recommends:
- Joint audits: Start with a mapping exercise of all systems that collect employee data. Identify where consent is missing, where unnecessary data is collected, or where vendors are non-compliant.
- Co-design systems: Involve cyber teams from the start when procuring HR tech, not just at the last moment.
- Fine-tune access controls: HR defines what constitutes appropriate access; cyber implements it. Together, they ensure data is available to the right people, no more and no less.
- Unified communication: When data practices change, HR and cyber should co-own the message. It signals seriousness and fosters employee trust.
This alignment, Gupta argues, ensures not only legal compliance but also a more resilient and responsive organisation.
Navigating AI with Ethics and Transparency
As AI tools increasingly dominate hiring, monitoring, and performance evaluation, HR is at a crossroads.
“There’s a real risk of overreach,” Gupta warns. “Just because we can collect and analyse data doesn’t mean we should.”
Under DPDP, every AI application must be scrutinised through three lenses:
- Necessity: Is this data essential for the intended purpose?
- Transparency: Have employees been clearly informed? Do they have a choice?
- Fairness: Could the system reinforce bias or impact career opportunities unfairly?
Gupta also calls for explainability. “If an algorithm rejects a candidate or flags someone’s performance, they deserve to know why.”
Ultimately, data should empower—not control. That means ensuring human oversight remains central, and that decisions affecting people’s lives are never fully outsourced to machines.
India’s Global Signal: A Ripple Across Borders
DPDP is India’s law—but its consequences are global.
With India housing thousands of global capability centres and outsourcing hubs, DPDP challenges multinationals to look inward. The emphasis so far has been on protecting customer data under global laws like GDPR. But now, internal data practices—especially around employees—are under the scanner.
“DPDP is turning the lens inward,” says Gupta. “If your GCC in India tightens data practices, it won’t make sense to be lax elsewhere.”
In this sense, the Act is setting a new baseline—one that could drive harmonisation of employee data rights across geographies. It also puts the spotlight on cross-border data flows, demanding greater accountability from companies storing data in foreign jurisdictions but collecting it in India.
“This is an opportunity,” Gupta insists. “For companies to lead on privacy—not just react to regulation.”
What the DPDP Act ultimately demands is not fear, but foresight. It’s not a punishment to be dodged—it’s an invitation to lead differently. When businesses reframe the law as a mandate for respect, trust, and ethical leadership, they unlock far more than compliance.
Gupta says it best: “If HR leads the way, it sends a powerful message—personal data isn’t just an asset. It’s a responsibility.”
The question is, who’s ready to lead?