“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” Mark Zuckerberg wrote in a Facebook post, breaking his silence days after the Observer reported that the personal data of about 50 million Americans had been harvested and shared with a political consultancy.
With this data breach, the social networking giant is staring down investigations on multiple fronts. While Zuckerberg has stated that the company has already changed some of the rules that enabled the breach, the episode raises many questions about the security of the personal data that users knowingly place in the hands of today’s technology organizations.
This conversation becomes even more relevant for HR organizations, where employees trustingly share their personal data with the employer. Today, as HR becomes more data-, analytics-, and tech-driven, employers are increasingly relying on HR tech applications for engagement, wellness, talent management, and so forth.
Right from gleaning data from a candidate’s social footprint, to hiring him, keeping him engaged, making numerous benefits available to him, motivating him, and conversing with him, the HR function relies more and more on technology and on the data shared by the employee. And with this data comes the possibility of data breaches, and of facing a situation similar to Facebook’s.
Which brings us to the basic question: where does an employer’s ownership of the data begin and end once the employee shares it? Where does the employer draw the line when it comes to the security of that data?
Ajeet Khurana, angel investor and advisor to many product startups, rightly elucidates the conundrum at hand.
Ajeet says, “Tragically, we seem to have moved to a post-privacy period in human history. The recent Facebook story would once have seemed like an unrealistic, Star Trek-like futuristic dream. Yet technology has made it possible. And it is fueled by twin apathies: users signing away their privacy, and our guardians (governments) themselves violating our privacy.”
In such a scenario, organizations now need to walk the treacherous path of balancing the need for credible data-based insights about employees with keeping that data safe, while also respecting the employee’s desire to safeguard his privacy at work.
First check: collect only data relevant for the business
Ravi Shankar, Senior Advisor, Human Capital Management at Ramco Systems, believes that before looking at the privacy issues, organizations should first look at the type of data they are collecting. They should decide, first and foremost, what data collected on their technology platform could violate privacy norms, rather than looking at it the other way round.
So if, as an organization, I can pinpoint the sensitive data that falls in the realm of privacy, then even if the technology allows that data to be collected, a conscientious HR person should block the technology from collecting it.
Case in point: collecting caste as a data point. While theoretically a company may ask for it, before doing so the company should look at its policies: does it want to collect that as a data point? Is it relevant? The HR person has to decide what he or she wants to bring to the company as a data point.
Ravi points out that 90% of the data collected by HR is either irrelevant or hardly ever used over the employee’s life span. Hence HR has to put a lot of design thinking into data collection, i.e. what data is being collected, and for what purpose. Is the data being collected just for the heck of it? Can the organization work with a minimal set of data as far as employees are concerned?
Secondly, the data an organization collects from the employee has to be part of a business process that in turn improves the employee’s experience. As long as an organization can focus on that, the data collected will serve its purpose of making things better for both the employee and the organization. This will also weed out the collection of unnecessary data that could stray into the realms of sensitivity and privacy.
Ravi explains that an organization might, for instance, collect data about employees’ leave habits to make the leave application process seamless. However, if the same data is used to determine whether an employee has high or low productivity, the purpose of the data gets defeated.
Another instance Ravi cited where organizations could stray into privacy is when they collect information about emergency contacts. While it is fine to ask for the name and number of the contact, the moment an organization asks for the employee’s relationship with the emergency contact, it is venturing into the realm of privacy. Hence, as a first check, organizations need to rethink the purpose behind the kind of data they collect.
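In practice, this "collect only what the business needs" principle can be enforced in software before a record is ever stored. The sketch below is a minimal, hypothetical illustration; the field names and lists are invented for the example, not drawn from any particular HR product.

```python
# Hypothetical sketch of a data-minimization gate: an HR system keeps
# only fields with a documented business purpose and flags attempts
# to collect sensitive ones. Field names are illustrative.
ALLOWED_FIELDS = {
    "name", "email", "department",
    "emergency_contact_name", "emergency_contact_number",
}
SENSITIVE_FIELDS = {
    "caste", "religion", "emergency_contact_relationship",
}

def minimize(record: dict) -> dict:
    """Drop any field not on the allowlist; flag sensitive ones for review."""
    dropped = set(record) - ALLOWED_FIELDS
    flagged = dropped & SENSITIVE_FIELDS
    if flagged:
        # In a real system this would go to an audit log, not stdout.
        print(f"Blocked sensitive fields: {sorted(flagged)}")
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

For example, `minimize({"name": "A. Rao", "caste": "X"})` would store only the name, and flag the caste field, matching Ravi’s point that a conscientious HR team should block such collection even when the technology permits it.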
Treat sensitive data with care
The Indian government is still taking baby steps in the realm of privacy, in the wake of the Supreme Court of India judgement last year that declared the right to privacy to be protected as an intrinsic part of the right to life and personal liberty under Article 21, and as a part of the freedoms guaranteed by Part III of the Constitution. HR tech firms, on their end, need to step up their game to ensure that vulnerabilities in their products do not become an inviting ground for those looking to exploit them.
Speaking to People Matters, Tanmaya Jain, founder and CEO of HR tech firm Infeedo, which offers Amber, a smart A.I. chatbot that talks to employees and proactively finds those who are unhappy or most likely to leave, revealed that research by Insight222 shows that 81% of all People Analytics implementations are dropped over concerns about privacy, ethics, and data security.
Usually, data leaks or breaches happen due to human error or vulnerabilities in the product architecture.
This is why Infeedo takes data security and privacy concerns very seriously. To this end, he revealed that all conversations with Amber are encrypted, with the encryption key unique to each organization. This ensures that even in the case of a data breach, the chats are not viewable outside the organization’s account.
In addition, Vulnerability Assessment and Penetration Testing happen bi-annually. To reduce the risk of human error, there are security protocols in place for handling data shared by clients, whether it is shared manually or through integration with their HRMS.
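Infeedo’s actual implementation is not public, but the per-organization key idea described above can be sketched with standard-library primitives: derive a distinct key for each tenant from a master secret, so ciphertext belonging to one organization cannot be decrypted with another’s key. This is an illustrative assumption, not the vendor’s design; a production system would feed such keys into an authenticated cipher like AES-GCM.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: derive a unique key per organization from a
# master secret (HKDF-style, via HMAC-SHA256). A real deployment would
# keep the master secret in a secrets manager and use the derived key
# with an authenticated cipher; names here are illustrative.
MASTER_SECRET = secrets.token_bytes(32)

def org_key(org_id: str) -> bytes:
    # HMAC(master, org_id) yields a distinct, deterministic key
    # per organization (tenant).
    return hmac.new(MASTER_SECRET, org_id.encode(), hashlib.sha256).digest()
```

The property that matters for breach containment is visible directly: `org_key("acme")` and `org_key("globex")` differ, so data encrypted under one organization’s key is unreadable under another’s.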
Thus, he advocates that HR tech firms make a conscious effort to treat sensitive data and personally identifiable information with care. One way of ensuring this is to have third parties conduct security audits regularly, with the reports transparently shared with all stakeholders.
There is no getting away from the fact that in order to improve people analytics, organizations are capturing more and more data about their employees. Yet in the face of debacles like Facebook’s, it is becoming clear that organizations need to be more cautious that data, once collected, is not rendered vulnerable in any way.
At the same time, an organization’s growing need for data has to be balanced with the need to uphold employee privacy. Right from the point where data is collected, to what data is collected, to where it is deployed, and to how safely it is deployed, organizations need to relook at all aspects of their design thinking in order to avoid becoming the victim of a data debacle like Facebook’s.