As we in HR work to absorb and apply the possibilities that data and analytics present for the design and administration of employee pay, we often look to our peers in Marketing - always several steps ahead of us in this game - for lessons and insights. With this in mind, it is also appropriate that we heed the cautionary tales that emerge around Marketing's leading edge push into the Age of Algorithms. Specifically, the degree to which Big Data may be creating a new potential and power to discriminate.
We're already seeing this dilemma rear its head in Marketing, as Michael Schrage highlights in his recent HBR blog post Big Data's Dangerous New Era of Discrimination. In his article, he shares a few examples of the segmentation opportunities that marketing analytics have created and raises important questions about how they might be used.
Going more granular, as Big Data does, offers even sharper ethno-geographic insight into customer behavior and influence:
• Single Asian, Hispanic, and African-American women with urban post codes are most likely to complain about product and service quality to the company. Asian and Hispanic complainers happy with resolution/refund tend to be in the top quintile of profitability. African-American women do not.
• Suburban Caucasian mothers are most likely to use social media to share their complaints, followed closely by Asian and Hispanic mothers. But if resolved early, they’ll promote the firm’s responsiveness online.
• Gay urban males receiving special discounts and promotions are the most effective at driving traffic to your sites.
My point here is that these data are explicit, compelling and undeniable. But how should sophisticated marketers and merchandisers use them?
HR systems are home to a host of demographic data about our employees. To what degree do we seek and apply the insights that quite likely exist there in managing and seeking to improve the productivity of our workforces?
Just one case in point. Several years ago, I wrote a post about a study titled "Rewarding a Multigenerational Workforce" that had just been released by WorldatWork. In the summary of findings, the study's authors noted that a majority of organizations responding (56%) did not, at that time, even consider generational differences when designing total rewards programs and gently chided employers for not realizing "the importance of evaluating the needs of each generation uniquely and rewarding them accordingly."
The study's intent was clearly not to encourage discrimination but rather to encourage a more informed and evidence-based approach to reward design. Nonetheless, I think the concerns that the report raised for me and many who commented on this post present us with an early example of the potential dilemmas we may face as the data increasingly available to us brings ever more specific revelations about the preferences for and responses to rewards in different worker segments. Sooner or later, we will find the need to ask ourselves the question Schrage raises (and I paraphrase below):
Where, in our corporate cultures and strategies, does value-added personalization and segmentation end and harmful discrimination begin?
Your thoughts?
Ann Bares is the Founder and Editor of the Compensation Café, Author of Compensation Force and Managing Partner of Altura Consulting Group LLC, where she provides compensation consulting to a range of client organizations. Ann serves as President of the Twin Cities Compensation Network (the most awesome local reward network on the planet) and is a member of the Advisory Board of the Compensation & Benefits Review, the leading journal for those who design, implement, evaluate and communicate total rewards. She earned her M.B.A. at Northwestern University’s Kellogg School, is a foodie and bookhound in her spare time (now reading Katherine Boo's "Behind the Beautiful Forevers: Life, Death and Hope in a Mumbai Undercity"). Follow her on Twitter at @annbares.
Creative Commons image "Puzzling" by jhritz
Of course problems will arise from future uses of "big data." The more you customize, the more you "discriminate," in the broad non-judgmental sense of the term. When you personalize, you can better focus your aim, but that process brings corresponding hazards. Advanced marketing techniques based on "big data" frequently produce harmful glitches like this one in today's news: http://finance.yahoo.com/news/family-tragedy-landed-retailers-mailing-004000856.html. The potential for error and abuse will rise as segmentation leads to so many separate and disparate approaches that some will surely spin out of control.
This is a very important issue because little thought has been given to how organizations should monitor and guide the use of such new powers.
Posted by: E. James (Jim) Brennan | 01/31/2014 at 01:35 PM
This isn't a recent issue. Technology has brought all kinds of questions about its use. The law can't keep up with it.
It's been going on for years with companies selling our data to merchandisers for their marketing campaigns. No one to my knowledge has raised this issue in a court of law. You don't hear people screaming about it as an "invasion of privacy" or discrimination--- NSA yes, merchandisers no.
HR has been very risk averse in the past. Maybe I'm an outlier, but I don't think that should stop us from tailoring programs.
The main issue we should be concerned about is not tailoring programs for everyone ---- but tailoring programs for critical/key jobs that drive company strategy. Sports teams do it all the time. All the players' contracts are tailored. And the most critical ones get the most. Club owners justify their decisions by citing the criticality of certain positions.
So let's get our priorities straight. The rest of the employee population? Nice but not nearly as important. Let's put our focus where it should be and . . . full speed ahead.
Posted by: [email protected] | 01/31/2014 at 01:57 PM
"You don't hear people screaming about it as an "invasion of privacy" or discrimination--- NSA yes, merchandisers no."
I can choose not to do business with Google, Amazon, Facebook, etc. Eric Schmidt, Jeff Bezos and Mark Zuckerberg can't (legally) send people over to my house to kick down the door and kill me if I refuse to get with the program.
This is not a trivial distinction.
Posted by: Tony Bergmann-Porter | 01/31/2014 at 05:44 PM
OK Tony, I hear you. But many people don't know that the Googles, Amazons, etc. are selling their personal information. Most of the time people are not asked if this is OK (sometimes yes).
What about our medical information being looked at and sold? There are so many ways our private information can be stolen and sold. And we don't even know it. And yes it is only going to get worse.
Posted by: [email protected] | 01/31/2014 at 08:18 PM
Jacque and Tony,
Interesting sidebar conversation!
To your point, Jacque - no, it isn't a new issue. But it may be about to rear its head very close to home for many of us, courtesy of the new push into data, analytics and segmentation. As in so many other fields of endeavor, bringing new levels of science and technology to our work will both create new opportunities and raise new questions. My point is simply to call attention to this reality.
Thanks for the comments and discussion!
Posted by: Ann Bares | 02/03/2014 at 09:20 AM
Interesting topic. And two oblique references to the NSA kicking down doors and killing people as a bonus. Swell . . . although I think some folks may have seen the movie Enemy of the State one too many times. . . .
So, I think there's a fine line between "discriminating" and "differentiating", and I think we're actually talking more about the latter. Whether or not it's directed initially at multi-generational group differences, most people acknowledge we're headed down a path toward a total rewards program that is fully-customized to meet the needs of the individual employee - since what could be more (individually) satisfying?
This is less about "big data" and more in the zone of applied neuroscience, but I do wonder how much more sensitive we'll need to be to subtler ethical issues - say, nudging people toward a course or career choice that they may be demographically and/or psychologically predisposed to, but that might restrict or limit them from a choice or direction they might otherwise select (the road not taken).
Posted by: Chris Dobyns | 02/03/2014 at 04:57 PM
Chris:
Right on all counts. There is a fine line between discriminating and differentiating. Just because we mean to do the latter doesn't mean we shouldn't be watchful about an unintended slip across that fine line. As data and science reveal more about the predispositions and tendencies of different "segments" of workers, it is conceivable to me that we will need to be sensitive to the ethical issues surrounding our possible actions in response to these "insights".
Thanks for weighing in! (And yes, can't be too careful about those NSA people....)
Posted by: Ann Bares | 02/03/2014 at 05:20 PM
"...most people acknowledge we're headed down a path toward a total rewards program that is fully-customized to meet the needs of the individual employee - since what could be more (individually) satisfying?"
It would be nothing short of revolutionary to be able to do exactly this. It would offer enormous first mover competitive advantage. But the legal and regulatory barriers and litigation risks (particularly in the US), and the inherent adverse selection/moral hazard problems it would invite make it highly unlikely that anyone will attempt it.
Posted by: Tony Bergmann-Porter | 02/04/2014 at 10:01 PM
Personalized remuneration packages are routine for executives, movie stars, sports professionals and many others. Perquisites vary within employee groups without legal disputes. No one has declared cafeteria benefits programs illegal, either, to my knowledge, and they have been around for quite a while too. Some nations like Italy almost compel personalized pay, where specific allowances are required based on the unique KSAs and specific circumstances of the worker.
Suspect that choices can be controlled and constrained without a lot of risk. Personalized compensation has been around for a long time, so it may be expanded here in the U.S. sooner than we all think. Time will tell.
Posted by: E. James (Jim) Brennan | 02/06/2014 at 03:02 PM
What can you do if "Big Data" seems off?
I am a manager at a large corporation that uses Mercer surveys to set our "market median" pay. The problem is that we are bleeding talent. Employees are accepting offers that are 25-50% above our "market median". Even with a compa-ratio of 1.2, which is as high as we can pay for a given job, we are below current offers.
How can I argue that the Mercer data is wrong?
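To put rough numbers on the gap (the figures below are hypothetical, chosen only to illustrate the compa-ratio arithmetic, not drawn from any survey):

```python
# Hypothetical figures to illustrate the compa-ratio math (not actual
# Mercer data). compa-ratio = actual pay / survey market median.

market_median = 100_000.0             # survey "market median" for the job
max_compa_ratio = 1.2                 # highest compa-ratio the company allows
range_maximum = market_median * max_compa_ratio

competing_offer = market_median * 1.40  # an offer 40% above the survey median

shortfall = competing_offer - range_maximum
print(f"Range maximum at 1.2 compa-ratio: {range_maximum:,.0f}")
print(f"Competing offer:                  {competing_offer:,.0f}")
print(f"Shortfall:                        {shortfall:,.0f}")
```

Under those assumed numbers, even paying at the very top of the range leaves the company roughly $20K behind the offers its people are accepting, which is why the survey choice, not the pay policy, seems like the thing to question first.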
Posted by: Bill Roberts | 02/08/2014 at 06:49 AM
Hey Bill,
This is not, at least in my experience, an uncommon problem. There clearly appears to be a disconnect between the labor market you are capturing with the Mercer data you have chosen and the labor market in which you are actually competing for talent. The challenge is not so much arguing that the Mercer data is wrong, but putting on your detective cap and figuring out why it is so clearly wrong for you.
Great idea for a post - so I may elaborate more. Thanks for raising an increasingly important question!
Posted by: Ann Bares | 02/08/2014 at 10:52 AM