People & Strategy Summer 2017 Vol. 40 Issue 3 - 37
Driving Results in Talent Analytics
There are seven steps to fully achieve business results from
talent analytics projects. Perhaps unsurprisingly, the actual
analytic work is situated in the middle of the cycle. The seven
key steps are:
1. Asking the right question
2. Identifying the right method to answer that question
3. Locating or generating the data to answer the question
4. Effectively and appropriately analyzing those data
5. Developing insight based on the analyses
6. Taking action based on that insight
7. Measuring results to determine whether your action was effective
Many practitioners address steps two through four, and
partially or entirely neglect the other four steps. Each step will
be outlined with practical examples and guidance. One core
example will be used to illustrate how a single research project
progresses through the steps. This use case will be supplemented at each step with additional examples to highlight key points.
Step 1: Asking the Right Question
Posing an effective question is a surprisingly challenging task.
While it is relatively simple to embark on the straightforward
question of what is happening, uncovering why something is
happening and the potential causal factors involved is much harder.
Fortunately, good talent analytics research questions share
a few consistent attributes. First, they address a business need
or inform a business decision. This connection to business
decisions is where talent analytics often diverges from HR
metrics. In many cases, measurement is simply monitoring
ongoing processes, rather than informing new investments
or decisions. For example, querying how many new hires
came from which prior employers does not drive a business
decision. However, examining the prior histories of the most
effective employees and comparing those to average or below
average employees to hone future recruitment practices is
an example of talent analytics, tied to a decision or future investment.
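As a minimal sketch of the recruitment example above, one might tally the prior employers of top performers against everyone else and look for sources that over-index among the top group. The employer names, performance labels, and data below are entirely hypothetical:

```python
# Hypothetical sketch: compare the prior employers of top performers
# against everyone else to inform future recruiting focus.
# All names and ratings below are invented for illustration.
from collections import Counter

employees = [
    {"prior_employer": "Acme Corp", "performance": "top"},
    {"prior_employer": "Acme Corp", "performance": "average"},
    {"prior_employer": "Beta LLC", "performance": "top"},
    {"prior_employer": "Beta LLC", "performance": "top"},
    {"prior_employer": "Gamma Inc", "performance": "average"},
    {"prior_employer": "Gamma Inc", "performance": "below average"},
]

top_sources = Counter(e["prior_employer"] for e in employees
                      if e["performance"] == "top")
other_sources = Counter(e["prior_employer"] for e in employees
                        if e["performance"] != "top")

# Employers that appear far more often among top performers than
# among the rest suggest where to focus recruiting effort.
for employer, count in top_sources.most_common():
    print(employer, count, other_sources.get(employer, 0))
```

In practice the counts would come from an HRIS extract and would need normalizing by overall hiring volume from each source, but the comparison logic is the same.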
The second consistent attribute of a good research question is that it will cover the domain in question. That is, the
research project is complete enough that you are unlikely
to miss any major factors. In a way, good research questions
can be thought of as having good content validity (i.e., they
are broad enough that it is unlikely that results will lead you to
a poor decision).
Finally, good research questions are specific enough to inform action. It may seem as though this is in direct contradiction with the item above. In fact, that seeming contradiction
may drive a good number of poorly formulated questions.
However, the content validity point above is about ensuring
that a research question has essentially covered all the bases.
This point on specificity exists within that broad content domain.
For example, let's examine the question of bias in hiring
decisions. Simply saying, "I want to study bias in hiring deci-
sions" is not a useful talent analytics question. It is too broad
and does not move the organization to action. Instead, let's
apply each of the three criteria above.
First, we must frame the question of bias in hiring in the
context of a decision or strategy. In this example, we may
narrow the question to determining if blinding candidate
resumes would result in more diverse candidates being interviewed. The decision here is whether or not to blind resumes
as part of the hiring process.
Second, we must cover the content domain. In the example above, covering the content domain may include multiple
types of diversity (e.g., gender and ethnicity), as well as multiple areas within the company, such as both staff and line roles.
Finally, having determined that we are looking for bias
at a particular decision point, and across multiple domains,
we must identify how exactly we will determine if such bias is
present. This final criterion is about specificity.
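One concrete way to operationalize that specificity, not prescribed by the article but a common choice, is a chi-square test on a 2x2 table comparing the share of diverse candidates interviewed under blinded versus unblinded screening. The counts below are invented for illustration:

```python
# Illustrative sketch (counts are invented): testing whether blinding
# resumes changes the proportion of diverse candidates interviewed.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]].
    Rows: blinded / unblinded screening.
    Columns: diverse / non-diverse interviewees."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Blinded: 40 of 100 interviewees diverse; unblinded: 25 of 100.
stat = chi_square_2x2(40, 60, 25, 75)
print(round(stat, 2))  # → 5.13, above the 3.84 critical value (df=1, alpha=.05)
```

Because the statistic exceeds the critical value, this hypothetical result would suggest blinding meaningfully changes who gets interviewed, which directly informs the blind-or-not decision framed above.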
Getting to the final business-relevant, content valid, and
specific question should be a collaboration between the researcher and the eventual person or organization responsible
for taking action on the findings. Securing partnership at the
beginning stages also helps facilitate implementation; after all,
people tend to be invested in things that they create.
It is important to note that this approach to defining
a question is significantly different from data mining or
dustbowl empiricism. Most organizations are swimming in
data, and there is an understandable urge to make use of
it. However, simply mining your existing data is often less
useful than many imagine. Much of the data in the typical
human resource information system is transactional, rather
than particularly content rich. Thus, the data that are conveniently lying around are typically good for answering "what"
and "when" questions, but generally less useful at answering
"why" questions. Unfortunately, most good business decisions
require being able to answer that "why" question.
Step 2: Identifying the Right Method to Answer That Question
Once the question is clarified, it should be a relatively short
leap to the method or methods. Here, we consider how to
answer the question or questions developed in Step 1. This
step is essentially your research design. For those whose entry
point to talent analytics is via the statistics or math itself,
there is particular risk in skipping this step.
The simplest and most straightforward design is a descriptive one. Descriptive designs simply seek to illuminate basic
patterns in data. While in many cases descriptive analyses
could not be considered talent analytics due to the absence of
a business question or problem, there are some cases where a
descriptive design is appropriate. For example, organizations
considering where to locate a new facility might compare
labor market characteristics in the candidate locations as part
of their decision-making processes. In this case, the business
question would be where to place the facility and a descriptive
method would be sufficient to inform that decision.
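A descriptive design of this kind can be as simple as laying the candidate locations' labor-market characteristics side by side. The cities and figures below are invented purely to illustrate the shape of such a comparison:

```python
# Hypothetical descriptive comparison of candidate facility locations.
# City names and all figures are invented for illustration.
markets = {
    "City A": {"labor_pool": 120_000, "median_wage": 52_000, "unemployment_pct": 4.1},
    "City B": {"labor_pool": 80_000, "median_wage": 47_500, "unemployment_pct": 5.6},
}

# A simple side-by-side summary is often all a descriptive design needs.
for city, stats in markets.items():
    print(f"{city}: labor pool {stats['labor_pool']:,}, "
          f"median wage ${stats['median_wage']:,}, "
          f"unemployment {stats['unemployment_pct']}%")
```

No inferential statistics are required here; the business question (where to place the facility) is answered by inspecting the described differences directly.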
Probably the most common type of design in talent analytics is a correlational study. Here we are using the broadest
sense of the term, and including any type of study that exam-