Seismic shifts are afoot in the talent economy, the kind that James Watt brought to the Industrial Revolution, Henry Ford to the assembly line and Google to the internet search engine. These developments have spawned tools and disciplines that help business leaders forge new insights on everything from customer lifetime value and conversion rates to future sales and earnings.
Machine learning, broadly defined as the subfield of computer science that automates learning to enable pattern recognition and predictions, is one such field that is slowly transforming how leaders evaluate their businesses and make decisions — and it is increasingly being applied to assess human behavior and labor economics.
Machine learning is fundamentally about automated systems learning to identify patterns and relationships in data. In some cases, these machines learn to predict a specific outcome such as who will buy a product or quit a job. In other instances, machine learning can help leaders determine how to effectively group people and their behaviors using thousands of variables, including demographics, job performance and online activity.
Many of these core analytic techniques have been around for years, while others are relatively new. The real difference for business leaders in assessing the risks associated with human capital is not the rise of new algorithms but the sudden availability of massive data and the accompanying processing power. What is emerging is a novel assemblage of opportunities and insights, bringing the expertise and power of computer science and machine learning together with human capital and labor allocation to ask, answer and act on new questions.
According to New York University economist Paul Romer, economic growth arises when people rearrange resources in ways that make them more valuable. The rearrangement of human capital in the context of machine learning, in recruiting it, developing it and retaining it, exemplifies this desired growth. And while machine learning has already been used in many other industries and disciplines, its use in evaluating and managing talent is still relatively young.
To understand the impact of machine learning, it’s first helpful to understand the current state of human capital and talent analytics. Briefly, human capital and HR analytics can be divided into three levels: descriptive, predictive and prescriptive. Descriptive analytics answers questions about the history and state of talent within the organization. In terms of traditional human resources, this usually comes down to reporting what already happened in terms of annual turnover, internal hire rate, number of promotions, and location of top talent. These descriptive metrics are where most human resources analytics functions typically focus their efforts.
Predictive analytics takes a step beyond the descriptive and helps leaders answer questions about what is likely to happen in the future. Predicting future events such as employee turnover can help guide retention, promotion and recruiting efforts. But predictive analytics is more than just predicting who will stay or leave. It can also be used to predict who is likely to succeed in a role; how changes in compensation strategies will impact the overall workforce; which employees are likely to be injured on the job; or even how a reorganization is likely to impact employee movement and communication.
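To make the turnover case concrete, here is a deliberately simplified sketch of what such a prediction looks like under the hood: a logistic model that scores each employee's 12-month turnover risk. The features, weights and employee records are all invented for illustration; a real model would estimate its weights from historical HR data rather than setting them by hand.

```python
import math

# Hand-set weights for illustration only; in practice these would be
# learned from historical data (e.g. via logistic regression).
WEIGHTS = {"tenure_years": -0.4, "commute_minutes": 0.03, "recent_promotion": -0.8}
BIAS = 0.2

def turnover_probability(employee):
    """Score an employee's 12-month turnover risk with a logistic model."""
    z = BIAS + sum(WEIGHTS[k] * employee[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

staff = [
    {"id": "A", "tenure_years": 6, "commute_minutes": 15, "recent_promotion": 1},
    {"id": "B", "tenure_years": 1, "commute_minutes": 55, "recent_promotion": 0},
]

# Flag anyone whose predicted risk crosses a chosen threshold.
at_risk = [e["id"] for e in staff if turnover_probability(e) > 0.5]
```

The output of a model like this is a ranked risk list that can guide retention conversations; the same scaffolding applies to predicting injury risk or likely success in a role, with different features and outcomes.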
The third tier is prescriptive analytics. Prescriptive analytics tells executives what they should do. In some industries, the line between the predictive and the prescriptive is clear. In the airline industry, predictions about regular and discounted seat sales directly feed optimization models that prescribe how fares should be changed and marketed to optimize profit per flight.
‘Machine learning is fundamentally about automated systems learning to identify patterns and relationships in data using thousands of variables.’
But this line is not always so clear when it comes to talent. For example, if a predictive model says that a subset of applicants are likely to quit within the first month, this may suggest a clear course of action. Without the aid of any formal optimization model, those applicants probably shouldn’t be hired. The underlying challenge for leaders is that people and organizations are messy. Optimizing airline ticket sales is one thing; changing behavior at the level of individuals, teams and organizations to achieve optimal performance is another.
Given this messiness, it makes sense to define prescriptive talent analytics in practice as a combination of the descriptive and the predictive, of bringing subject matter expertise and analytics together to develop and rigorously test ideas. Said differently, prescriptive talent analytics is less about formal optimization and more about systematic experimentation and honest assessment. Try it, measure it and see what happened to figure out what works best.
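In code, “try it, measure it and see what happened” can be as simple as comparing outcomes between a pilot group that received an intervention and a comparable control group. The sketch below runs a standard two-proportion z-test on invented retention counts; the mentoring-program scenario and all numbers are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical pilot: a retention intervention (say, a mentoring program)
# rolled out to one group, with a comparable control group.
treated_retained, treated_n = 80, 100
control_retained, control_n = 70, 100

p_treated = treated_retained / treated_n
p_control = control_retained / control_n
pooled = (treated_retained + control_retained) / (treated_n + control_n)

# Two-proportion z-test: did the intervention actually move retention?
se = sqrt(pooled * (1 - pooled) * (1 / treated_n + 1 / control_n))
z = (p_treated - p_control) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
```

Here a 10-point retention gap in a sample this small is not statistically conclusive, which is exactly the kind of honest assessment the prescriptive approach demands before scaling an intervention.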
Knowing what executives using this data should do means figuring out the impact of decisions and actions within an ongoing, dynamic process — not hunting for a predetermined, permanent endpoint on a linear path. In many ways, successful leaders have always been using prescriptive analytics. The only change now is the availability of better data and new techniques to understand it.
So what about machine learning? If almost all traditional human resources functions are still focused on reporting yesterday’s events (descriptive analytics), how can data-driven leaders move further toward more predictive models?
To maximally leverage the potential of machine learning, leaders need to do three things: capture context, collect dynamic data and obtain the talent capable of uncovering insights. If these conditions are met, machine learning can deliver a measurable competitive advantage and a new way to approach the talent economy both inside and outside the organization.
Talent analytics often begins and ends with the individual employee. On the surface this seems reasonable, because performance is often seen as occurring at the individual level. It’s also easy because most human resources data files are broken down into rows according to individuals.
The problem is that organizations are not just collections of isolated rows of “people units.” They are made up of people who are nested within a team within a company within a local economy. Organizations employ living, breathing people with leaders, direct reports, colleagues, mentors and, believe it or not, competitors and rivals. Overlooking the contexts that shape, constrain and unleash individual actions will lead to poor analytics outcomes that obscure reality.
Suppose an executive wants to understand the impact of education level on talent retention. When we look at our data we might see that having an MBA does not, on average, impact longevity with the company. Yet digging deeper might reveal something quite different, such as low turnover for MBAs in small labor markets like Fargo, North Dakota, but colossally high turnover for those in New York, Chicago or Atlanta. Unsurprisingly, the opportunities that an MBA affords depend on context.
Our metrics need to reflect this. Larger companies can subscribe to services providing detailed labor market data. For smaller companies on a tight budget, even a few additional data points on regional unemployment rates from the U.S. Department of Labor can prove meaningful. In either case, these data can be integrated into established reporting processes that then feed into predictive and prescriptive efforts.
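A toy example makes the point. Using invented records, the sketch below shows how an effect that vanishes in the aggregate can reappear once the data is segmented by labor market; the markets and turnover counts are fabricated for illustration.

```python
# Synthetic records for illustration: (has_mba, market, left_within_two_years)
records = [
    (True,  "Fargo",   0), (True,  "Fargo",   0),
    (True,  "Fargo",   0), (True,  "Fargo",   1),
    (True,  "Chicago", 1), (True,  "Chicago", 1),
    (True,  "Chicago", 1), (True,  "Chicago", 0),
    (False, "Fargo",   0), (False, "Fargo",   1),
    (False, "Fargo",   0), (False, "Fargo",   1),
    (False, "Chicago", 1), (False, "Chicago", 0),
    (False, "Chicago", 1), (False, "Chicago", 0),
]

def turnover_rate(rows):
    return sum(left for _, _, left in rows) / len(rows)

mba = [r for r in records if r[0]]
non_mba = [r for r in records if not r[0]]

# Averaged over everyone, an MBA appears to make no difference at all...
overall = (turnover_rate(mba), turnover_rate(non_mba))

# ...but segmenting by labor market tells a very different story.
mba_by_market = {
    m: turnover_rate([r for r in mba if r[1] == m])
    for m in ("Fargo", "Chicago")
}
```

In this fabricated sample, MBA and non-MBA turnover are identical overall, yet MBA turnover is three times higher in Chicago than in Fargo, which is precisely the kind of contextual signal that gets averaged away without market-level data.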
To further understand the potential role of context, let’s consider a second example centered on company tenure. Let’s suppose that our data show that those with more than five years of company tenure are less likely to leave their jobs. That might sound reassuring, but team-level context might qualify this highly generalized observation. For example, is company tenure still a protective factor when leaders or co-workers bolt for greener pastures? Alternatively, maybe a disproportionate number of those in the five-plus years group are actually poor performers in a part of the company with a strong tolerance for weak performance. In that case, they are sticking around for all the wrong reasons.
‘The final piece of the puzzle for bringing talent analytics and machine learning together is finding the right analytical minds to do this work.’
These examples tell us that talent analytics professionals need to be more creative and aggressive in constructing contextual data and folding it into their descriptive and predictive analytics practices. Massively powerful machine learning algorithms for prediction are nice, but they won’t help much if the data is not sufficiently rich.
Even if leaders don’t yet have the talent to implement machine learning processes, simply analyzing data with an eye toward nested contexts will enliven leader discussions and open up channels for leveraging more of what a talent analytics practice can truly offer.
The second critical piece for bringing machine learning to bear on talent analytics is understanding that behavior and decision-making unfold and emerge over time. To see the contrast between the standard, static view of talent and a truly dynamic one, let’s imagine that our retail segment is currently undergoing a massive reorganization. Some top leaders will go, reporting relationships will change and roles will shift. Intuitively, we know the internal dialogues would change for virtually all of our employees, and many of their behaviors would follow suit.
But what do our standard, static human resources measures like age, tenure, role and education look like in the face of this uncertainty? Exactly the same. That’s a problem.
Sudden changes in context change what people think about their jobs today and how they behave today — not seven months from now when their annual job satisfaction survey rolls around. The impact of our machine learning efforts will be blunted if our measures can’t detect these shifts. Fundamentally, adopting a dynamic mindset means shifting our gaze away from static demographics and categories and toward data that changes as finer-grained attitudes and behaviors change.
Where should leaders look for dynamic data sources? This will certainly vary by industry, but oftentimes the data is there waiting to be tapped. One avenue is to look at the flow of talent within an organization. As a first step, leaders should quantify the historical flow of talent between different levels and different segments of an organization for the past two or three years. This can include new hires and terminations but also the movement of talent up through the hierarchy or lateral movements. Having established a baseline, leaders can then compare more recent movement patterns to that baseline. This will allow them to detect meaningful, if subtle, changes. This kind of analysis can not only provide critical data about the culture of talent movement and development, but also yield early, measurable indicators that employee and organizational behaviors are changing in response to shifts in internal or external signals.
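A minimal sketch of that baseline comparison, using an invented movement log, might look like the following; the roles, years and threshold are assumptions chosen purely to illustrate the mechanics.

```python
from collections import Counter

# Synthetic movement log for illustration: (year, from_role, to_role);
# "external" stands in for both hires and departures.
moves = [
    (2014, "analyst", "manager"), (2014, "analyst", "manager"),
    (2014, "analyst", "external"), (2014, "manager", "director"),
    (2015, "analyst", "manager"), (2015, "analyst", "manager"),
    (2015, "analyst", "external"), (2015, "manager", "director"),
    (2016, "analyst", "manager"), (2016, "analyst", "external"),
    (2016, "analyst", "external"), (2016, "analyst", "external"),
]

def transition_shares(rows):
    """Each transition's share of total movement in the period."""
    counts = Counter((f, t) for _, f, t in rows)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

baseline = transition_shares([m for m in moves if m[0] < 2016])
recent = transition_shares([m for m in moves if m[0] == 2016])

# Flag transitions whose share of total movement shifted sharply.
flagged = {
    k for k in set(baseline) | set(recent)
    if abs(recent.get(k, 0) - baseline.get(k, 0)) > 0.3
}
```

In this fabricated log, analyst departures jump from a quarter of all movement to three-quarters, and the comparison surfaces exactly that transition for leader attention.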
Dynamic data is not just about putting out fires either. Some roles inherently require more fine-grained performance tracking that can be more effectively used to guide decisions and development.
Call center metrics, for instance, can provide an incredibly rich set of dynamic insights that are often boiled away when reducing these data to a single measure like mean call handle time. Dynamically sensitive performance insights might be gained by instead asking what our best performers looked like early in their roles, how their performance metrics changed over time, and who among our new hires shows similar or different performance profiles over time. Sudden changes in individual or team-level performance relative to historical patterns might reflect issues that demand leader attention or intervention. It is also worth highlighting here that machine learning clustering algorithms are particularly well suited to identifying clusters of performance using such behaviorally rich data.
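As an illustration of that last point, the sketch below runs a bare-bones k-means pass (k = 2) over invented monthly handle-time trajectories and separates a “fast improver” group from a “flat” group. Agent names, numbers and starting centroids are all made up; a production analysis would use a proper library implementation on real data.

```python
# Monthly mean call-handle times (minutes) per agent, synthetic data.
profiles = {
    "agent_1": [9.0, 7.5, 6.0, 5.5],   # improves quickly
    "agent_2": [9.5, 8.0, 6.5, 6.0],
    "agent_3": [9.0, 9.0, 8.5, 9.0],   # stays flat
    "agent_4": [8.5, 9.0, 9.0, 8.5],
}

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Seed the two centroids from arbitrary profiles, then iterate:
# assign each agent to the nearest centroid, recompute centroids.
centroids = [profiles["agent_1"], profiles["agent_3"]]
for _ in range(10):
    groups = [[], []]
    for name, traj in profiles.items():
        nearest = min((0, 1), key=lambda i: sq_dist(traj, centroids[i]))
        groups[nearest].append(name)
    # Recompute each centroid as the per-month mean of its members
    # (this toy version assumes neither cluster empties out).
    centroids = [
        [sum(col) / len(col) for col in zip(*(profiles[n] for n in g))]
        for g in groups if g
    ]
```

On this toy data the clusters stabilize immediately, grouping the two improving agents together and the two flat performers together, the kind of trajectory-level pattern a single mean handle-time figure would erase.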
Learning functions represent yet another largely untapped source of dynamic data, especially for larger organizations. Instead of looking at the number of courses completed or the number enrolled, leaders should instead ask who is signing up for online learning opportunities and actually completing the course. Is there measurable evidence of performance boosts after this learning? Are these learning-oriented employees looking to build new skills in preparation for movement to a new position inside or outside the organization? Are leaders actively supporting their career development, or do our most ambitious learners leave when they realize no one cares about their development efforts?
Perhaps the most basic piece in all of this is simply identifying who in an organization even has access to this data. If leaders are unsure, start some conversations, identify those responsible for these process streams and learn about their analytics needs and concerns. Often the insights gained from analyzing dynamic data are of value to both leaders and the process owners themselves. Bringing machine learning together with these dynamic data sources can dramatically improve what leaders can reasonably predict about the workforce. More important, it can also reveal something vital about the people, processes and the current state of a firm’s collective outcomes.
The final piece of the puzzle for bringing talent analytics and machine learning together is finding the right analytical minds to do this work. While the popular press is right to suggest that there is currently a shortage of data scientists in the market, the situation should self-correct over time as more students shift their studies toward the field.
Still, analytical talent is out there. Where should leaders look? First, identify the sources of analytics talent already in the organization. Areas with a long-standing focus on analytics, such as marketing and any research function, stand out as likely candidates. Even if leaders don’t yet have an open data science role for their organization, it pays to start conversations with those who currently apply their data science craft within the company. For small companies without any analytics talent, look to local data science network meet-ups or nearby universities with recently established analytics programs.
Second, if there are cultural or organizational barriers to moving in-house analytics into the talent sphere, look for opportunities to strategically develop existing talent. If there isn’t bandwidth or budget to develop an intensive training program, start small. Find a handful of employees with backgrounds in programming and statistics and invite them to help grow a people analytics enterprise. This can be done in part through any of a number of free, online courses.
Beyond coursework, leaders should also seek out a knowledgeable, willing analytics mentor in the organization who can provide practical learning guidance. This will help mentees quickly overcome the frustrations that come with learning a complex set of new skills. Leaders should work with this mentor to identify a small, manageable project that will have a clear, practical application when completed. This will provide a tangible goal to keep the learner motivated and a concrete outcome that leaders can highlight as evidence of a growing analytics capability.
When that project is completed, identify a new, slightly more challenging project that again pushes trainees without overwhelming them. Over time, their competence and confidence will grow and the capacity of the talent analytics capability will too.
Finally, if finding or developing in-house talent is simply not possible, look externally. When it comes to job postings, be sure to get the titles right. When looking for a “data scientist,” be sure that is the actual title of the position. This matters both for how applicants search for jobs and for how they view the role in the broader context of their careers.
One might also do well to think broadly about the true possibilities of the role and use that as a selling point. The kind of people leaders will want to hire will be looking to use, develop and apply a broad range of skills, not just focus on a single, narrow need. Moreover, unlike some more established analytics functions, the questions in need of answers and the methods used in human capital analytics are not yet determined. This can be a major recruiting strength if properly emphasized.
Finally, executives can also use the interview to broaden their understanding of what the role might become. It’s important to be honest about the likely growing pains inherent in these new analytics-intensive roles, but the right candidate may very well have a vision that goes beyond initial considerations.
The age of machine learning has arrived, but its contribution is lagging in human capital. While many companies are stuck at the descriptive level, a move to embrace what machine learning offers doesn’t mean abandoning regular reporting. Indeed, more complex machine learning methods depend critically on a clear question, solid data and accurate, descriptive analyses. Master these ground-level activities before moving on to the complex techniques of the predictive and prescriptive levels.
Once those are covered, there are three necessary conditions for a successful move up the analytics hierarchy: the addition of context-based data, dynamic data, and the talent to ask good questions and leverage machine learning to get actionable answers. Talent typically gets the most attention, but the quality of the question and the data is where it all begins.
John Lipinski is a human capital analytics data scientist at life insurer Humana Inc. He is also the co-founder of HR analytics101.com, a venture that aims to help HR professionals build capability in people analytics.
This story originally appeared in the Fall 2016 issue of Talent Economy Quarterly.