



Of course problems will arise from future uses of "big data." The more you customize, the more you "discriminate," in the broad non-judgmental sense of the term. When you personalize, you can better focus your aim, but that process brings corresponding hazards. Advanced marketing techniques based on "big data" frequently produce harmful glitches like this one in today's news: http://finance.yahoo.com/news/family-tragedy-landed-retailers-mailing-004000856.html. The potential for error and abuse will rise as segmentation leads to so many separate and disparate approaches that some will surely spin out of control.

This is a very important issue because little thought has been given to how organizations should monitor and guide the use of such new powers.

This isn't a recent issue. Technology has brought all kinds of questions about its use. The law can't keep up with it.

It's been going on for years with companies selling our data to merchandisers for their marketing campaigns. No one to my knowledge has raised this issue in a court of law. You don't hear people screaming about it as an "invasion of privacy" or discrimination--- NSA yes, merchandisers no.

HR has been very risk averse in the past. Maybe I'm an outlier, but I don't think that should stop us from tailoring programs.

The main issue we should be concerned about is not tailoring programs for everyone --- but tailoring programs for the critical/key jobs that drive company strategy. Sports teams do it all the time. All the players' contracts are tailored, and the most critical players get the most. Club owners justify their decisions by citing the criticality of certain positions.

So let's get our priorities straight. The rest of the employee population? Nice but not nearly as important. Let's put our focus where it should be and . . . full speed ahead.

"You don't hear people screaming about it as an "invasion of privacy" or discrimination--- NSA yes, merchandisers no."

I can choose not to do business with Google, Amazon, Facebook, etc. Eric Schmidt, Jeff Bezos and Mark Zuckerberg can't (legally) send people over to my house to kick down the door and kill me if I refuse to get with the program.

This is not a trivial distinction.

OK Tony, I hear you. But many people don't know that the Googles, Amazons, etc. are selling their personal information. Most of the time people are not asked whether this is OK (sometimes they are).

What about our medical information being looked at and sold? There are so many ways our private information can be stolen and sold. And we don't even know it. And yes it is only going to get worse.

Jacque and Tony,

Interesting sidebar conversation!

To your point, Jacque - no, it isn't a new issue. But it may be about to rear its head very close to home for many of us, courtesy of the new push into data, analytics and segmentation. As it has in so many other fields of endeavor, bringing new levels of science and technology to our work will both create new opportunities and raise new questions. My point is simply to call attention to this reality.

Thanks for the comments and discussion!

Interesting topic. And two oblique references to the NSA kicking down doors and killing people as a bonus. Swell . . . although I think some folks may have seen the movie Enemy of the State one too many times. . . .

So, I think there's a fine line between "discriminating" and "differentiating", and I think we're actually talking more about the latter. Whether or not it's directed at multi-generational group differences initially, most people acknowledge we're headed down a path toward a total rewards program that is fully customized to meet the needs of the individual employee - since what could be more (individually) satisfying?

In this area it's less about "big data" and more in the zone of applied neuroscience, but I do wonder how much more sensitive we'll need to be to subtler ethical issues - say, nudging people toward a course or career choice that they may be demographically and/or psychologically predisposed to, but that might restrict or limit them from a choice or direction they might otherwise select (the road not taken).


Right on all counts. There is a fine line between discriminating and differentiating. Just because we mean to do the latter doesn't mean we shouldn't be watchful about an unintended slip across that fine line. As data and science reveal more about the predispositions and tendencies of different "segments" of workers, it is conceivable to me that we will need to be sensitive to the ethical issues surrounding our possible actions in response to these "insights".

Thanks for weighing in! (And yes, can't be too careful about those NSA people....)

"...most people acknowledge we're headed down a path toward a total rewards program that is fully-customized to meet the needs of the individual employee - since what could be more (individually) satisfying?"

It would be nothing short of revolutionary to be able to do exactly this. It would offer enormous first mover competitive advantage. But the legal and regulatory barriers and litigation risks (particularly in the US), and the inherent adverse selection/moral hazard problems it would invite make it highly unlikely that anyone will attempt it.

Personalized remuneration packages are routine for executives, movie stars, sports professionals and many others. Perquisites vary within employee groups without legal disputes. No one has declared cafeteria benefits programs illegal, either, to my knowledge, and they have been around for quite a while too. Some nations, like Italy, come close to compelling personalized pay, requiring specific allowances based on the unique KSAs and particular circumstances of the worker.

I suspect that choices can be controlled and constrained without a lot of risk. Personalized compensation has been around for a long time, so it may be expanded here in the U.S. sooner than we all think. Time will tell.

What can you do if "Big Data" seems off?

I am a manager at a large corporation that uses Mercer surveys to set our "market median" pay. The problem is that we are bleeding talent. Employees are accepting offers that are 25-50% above our "market median". Even at a compa-ratio of 1.2, which is as high as we can pay for a given job, we are below current offers.
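The gap described above can be sketched with hypothetical numbers (the $100,000 median is assumed for illustration; only the 1.2 compa-ratio ceiling and the 25-50% offer premium come from the comment):

```python
# Hypothetical illustration of the pay gap described above.
market_median = 100_000        # assumed survey "market median" for the job
max_compa_ratio = 1.2          # ceiling of the internal pay range

# Highest pay the range allows: 1.2 x market median
max_pay = market_median * max_compa_ratio  # 120,000

# Competing offers reported at 25-50% above the survey median
offers = [market_median * (1 + pct) for pct in (0.25, 0.50)]  # 125,000 to 150,000

# Even the range maximum falls short of every competing offer
shortfall = [offer - max_pay for offer in offers]  # 5,000 to 30,000 short
print(f"max pay: {max_pay:,.0f}; shortfall vs. offers: {shortfall}")
```

The point the numbers make: when the market is genuinely 25-50% above the survey median, no position within the range can close the gap, which suggests the survey is capturing a different labor market than the one the company actually competes in.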

How can I argue that the Mercer data is wrong?

Hey Bill,

This is not, at least in my experience, an uncommon problem. There clearly appears to be a disconnect between the labor market you are capturing with the Mercer data you have chosen and the labor market in which you are actually competing for talent. The challenge is not so much arguing that the Mercer data is wrong, but putting on your detective cap and figuring out why it is so clearly wrong for you.

Great idea for a post - so I may elaborate more. Thanks for raising an increasingly important question!
