How much data privacy are we willing to give up in exchange for insight and impact?

People Analytics promises better outcomes for key HR topics like hiring, promotions, and performance. Will legal and regulatory constraints guide People Analytics in the right direction, or kill it before it can take off?

People Analytics is considered “the next big thing” by many. The ability to use data to better understand and positively influence decisions on recruitment, performance, wellbeing, retention, and promotions holds great allure for a variety of companies. Hilton, for example, uses AI to conduct follow-up interviews of call center job applicants and to schedule the successful candidates’ final offer-extension calls. Especially in the United States, the limits of what’s possible are defined by the limits of technology. Rarely does anyone ask in advance how far the influence of technology should go. Regulation and legislative frameworks in the US are usually created reactively, once a problem has become so apparent that stakeholders demand the intervention of legislators. An example is Amazon’s AI tool for hiring decisions, which eventually had to be scrapped because it exhibited a bias against women.

Europe takes a different approach to regulating technology. The General Data Protection Regulation (GDPR), which came into effect in May 2018, severely restricts employers’ ability to use employee data for people analytics, and does so in a preventive manner, i.e., by anticipating possible data-privacy problems rather than reacting to them.

To understand why regulation is so different between Europe and the US, one needs to look at European history in more depth: all of Eastern Europe was subject to communist influence in the late 20th century. In countries like the Baltic States, Poland, former Yugoslavia, and the German Democratic Republic (today part of the Federal Republic of Germany), authoritarian governments used the information they held on their citizens to spy on them, eliminate political dissidents, and, in the worst cases, deport, torture, and kill people who acted against the government-dictated ideology and regime. Many people living in today’s Germany, Poland, Czech Republic, and elsewhere witnessed these practices themselves and are therefore suspicious of large organizations with authority over them using their data for opaque or unwanted purposes.

GDPR, as a consequence, incorporates seven principles intended to mitigate the risk of misuse of personnel/employee data. Key principles relevant to people analytics include transparency, purpose limitation, data minimization, and storage limitation. All of these principles significantly increase the cost of performing people analytics. Transparency, for example, requires that every individual whose data is used in people analytics be informed of that use and, in most cases, give their explicit consent. Concretely, if a people analytics team wants to use a person’s salary data or performance reviews, they need to ask the individual for explicit, written permission. The data minimization principle states that the data processed must be limited to what is necessary for the purposes of the analysis. Incorporating additional data, such as more individuals or more variables, becomes very difficult under this limitation.
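To make the consent and data minimization principles concrete, here is a minimal sketch of how a people analytics pipeline might enforce them before an analysis runs. This is an illustration, not a compliance implementation: the field names, the `pay_equity_study` purpose, and the `has_consented` lookup are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical employee record; the fields are illustrative only.
@dataclass
class EmployeeRecord:
    employee_id: str
    salary: float
    performance_rating: int
    home_address: str  # present in the source system, but not needed below

# Consent registry: explicit, recorded permission per person *and* purpose
# (purpose limitation means consent for one analysis does not cover another).
CONSENT = {
    ("e1", "pay_equity_study"): True,
    ("e2", "pay_equity_study"): False,
}

def has_consented(employee_id: str, purpose: str) -> bool:
    # Default to False: no recorded consent means the data cannot be used.
    return CONSENT.get((employee_id, purpose), False)

# Data minimization: each purpose declares the only variables it may use.
REQUIRED_FIELDS = {
    "pay_equity_study": ("salary", "performance_rating"),
}

def prepare_dataset(records, purpose):
    minimized = []
    for r in records:
        if not has_consented(r.employee_id, purpose):
            continue  # no explicit consent, so exclude the person entirely
        # Keep only the declared fields; home_address never enters the dataset.
        minimized.append({f: getattr(r, f) for f in REQUIRED_FIELDS[purpose]})
    return minimized

records = [
    EmployeeRecord("e1", 72000.0, 4, "Main St 1"),
    EmployeeRecord("e2", 68000.0, 3, "Oak Ave 2"),
]
print(prepare_dataset(records, "pay_equity_study"))
# Only e1 remains, and only salary and performance_rating survive.
```

The point of the sketch is the cost the article describes: every analysis needs its own consent records and its own field whitelist, which must be maintained before any statistics can be computed.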

This piece of legislation illustrates a key conflict and trade-off that the field of people analytics faces: How much data privacy are we willing to give up in exchange for insight and impact?

The answer to this question cannot be a general one; it will depend on historical context, collective attitudes and emotions, and other regional differences. The field of people analytics can best contribute to working out this trade-off through maximum transparency and documentation: it should strengthen the insightfulness of its analyses while proactively applying the greatest possible care to the treatment of data.

Student comments on How much data privacy are we willing to give up in exchange for insight and impact?

  1. Thanks for this, Michael! Really interesting to get the European context and highly relevant to the discussion on privacy in the video on Humu for class. What concerns me most is that, with the advancements in AI, particularly on untrained datasets, we do not have a firm grasp on all the use cases for the amount of data we now have on people. For this reason, by imposing very strict legislation like GDPR, we may be protecting employees, but we also may be overlooking huge opportunities to advance employees and make them more effective in their roles.

  2. Great post, Michael! I appreciated the opportunity to learn more about GDPR and its impact on people analytics, and I agree with your point that people analytics professionals have a chance to figure out this trade-off. In order to do this, I think leaders will have to find ways to explain the benefits of using this data in analysis so that employees understand how this work can improve their jobs, performance, etc. It may also help to tell employees what companies won’t do with the data, so employees know that there are clear boundaries as well.
