Dealing with people analytics’ ‘creep’ factor
Employers are finding new and different applications for people analytics, technology that can assess workers and their behaviors. Companies are tracking facial expressions during job interviews, keeping tabs on workers' prescription refills, even mining their email and hiring outside firms to monitor everything they do online during their off hours.
People analytics uses AI and other technologies to analyze the massive amounts of data that organizations have on their workers to measure, report and understand employee performance. Not surprisingly, some of the most innovative companies, such as eBay, Electronic Arts, Google and IBM, are using it.
At IBM, for example, the HR department uses people analytics to monitor and evaluate employee communications and interactions in public forums to determine who might be ready for a promotion or career path change.
But not all companies like to talk publicly, or even internally, about the extent to which they're using people analytics. The deployment of these systems has sparked a simmering but still largely muted debate over employee privacy rights and how to take advantage of the latest and greatest technology without crossing legal and ethical boundaries, or creeping out workers.
Dave Weisbeck, chief strategy officer for the people analytics solution provider Visier, says the debate raises tricky questions, “because everyone has a different view of what’s creepy.”
While the law is clear that employers may monitor employee activity on company-issued equipment, technology is "racing way ahead of us," said Anna Tavis, clinical associate professor of human capital management at New York University.
“I envision there will be a big issue around employee privacy and their ability to have an undisclosed life from their employer,” said Tavis, who is also a member of the advisory board of several start-ups in the people analytics field.
For companies like IBM, a leader in developing big data and AI solutions to help companies with recruiting, career enhancement and performance management, the creep factor is a non-issue, says Carrie Altieri, vice president of communications for IBM’s “people and culture” division.
Although the company has a sophisticated “decision support tool” that identifies things like employee burnout and potential new career paths based on employee activity across the company’s many internal social and other platforms, she said, participation is optional, and employee email and other private communications are never mined or monitored.
“You can avoid the creep factor just by establishing up front that you are not going to violate people’s privacy,” she said.
But transparency seems to be more the exception than the norm. And as companies gather more and more data, transparency and the motivations behind data collection are far from black and white.
For instance, in cases like those on Wall Street, monitoring is often related to security and making sure companies are in compliance with regulatory frameworks, Weisbeck said.
And some companies monitor employees’ email and calendars to understand how to make them more successful by understanding things like who they are turning to for help, he said.
“They are trying to figure out internal networks and how they work,” he said. “… [T]he goal is not to stop you from doing something bad but to help you do something better.”
The question, according to Weisbeck, is how much do companies let people know about what they are doing?
“I can’t help but advocate to be forthright,” he said.
But with technology moving so fast and the lines between work and home life blurring, Weisbeck said, transparency is easier to advocate than to practice.
“I’ve seen and participated in online discussions about this,” he said. “There are those who take a very simple view: ‘It’s the company’s resources. If you don’t like it, go work for another company.’ So, you get the extreme voices, even from those who feel like they are being spied on. … Then you get the others who take a very different stance, which is there is no way you can possibly separate work and private lives anymore so we need to put in mechanisms, and a way to be forthright and honest, so we can have both of these worlds.
“I don’t know who’s right,” he said.
Tavis added that technology companies like IBM and others that are used to working in more agile, innovation-focused environments tend to be transparent and committed to best practices.
More traditional companies, however, are more hesitant to share what they are doing.
Very few are telling their employees what data they are collecting or what they are doing with it, she said, and “when they are, they are telling a positive story.”
Tavis said she learns most about what is going on in the people analytics field from vendors that come to show her their products.
For example, she said, there are a lot of companies that specialize in collecting external data on employees and selling it back to the company.
“If someone is becoming particularly active on LinkedIn, posting resumes, things like that, a signal goes back to the employer that this particular employee has increased participation in these job sites.”
She also pointed to new Stanford University research into facial recognition technology that analyzes video job interviews with more sophistication than polygraph tests; it could, for example, infer someone’s sexual orientation.
“That could be creepy,” Tavis said. “…If they are able to identify that and they have a bias, even though it’s illegal, it would be impossible to prove.”
What is her advice on best practices?
“To be transparent,” she said. “To have open conversations about it. To be looking at the pros and cons of technological applications and share best practices as well as failures.”
Companies also need to understand the dark side and how to prevent Equifax-style failures that can expose personal data, she said.
“Behind every one of those technological glitches there are live people making decisions or not making decisions,” she said. “So, it’s kind of ironic and paradoxical at a time when technology is picking up and we are talking about technology to replace humans; in fact, humans are becoming more and more critical in leadership, and information is becoming more and more critical to the function of technology.”