Special Report: Metrics and Measurement

Some say sophisticated learning measurements are critical to ensure key organizational stakeholders understand learning’s impact on the bottom line. Others say “we just know it works.” The truth may lie somewhere in between.

By now, virtually all learning leaders know they should provide programs that affect their organization’s ultimate goals: boosting revenue, streamlining expenses, increasing customer satisfaction, improving safety or making sure there’s enough talent to take the reins when the time comes. But how should learning leaders measure how well they do those things?

The top challenge for learning professionals is aligning their activities with their organization’s business goals, according to a 2014 survey by the Human Capital Media Advisory Group, the research arm of Chief Learning Officer magazine. But a group of practitioners and industry experts believe they have found a way to crack that nut by devising a formal reporting process based on generally accepted accounting principles, or GAAP. In a 2010 white paper, they named their framework talent development reporting principles, or TDRp, and created template reports and statements geared for both learning leaders and their senior-level peers across the enterprise.

TDRp
In addition to the reporting documents, the group devised a process for learning professionals to talk to their business line and functional leaders to find consensus on what constitutes a desired outcome, how learning can help achieve it, and how the effectiveness and efficiency of those efforts should be measured.

A key component of these reports is a set of columns for forecast metrics, which give everyone an idea of what the organization could achieve if learning comes through on its agreed-upon promises. Then, if actual metrics fall short, future discussions can focus on how to tweak learning programs to produce better results.
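To make the idea concrete, here is a minimal sketch of a plan-versus-actual comparison; the metric names and figures are hypothetical and are not drawn from TDRp’s actual templates:

```python
# Illustrative sketch of a TDRp-style plan-versus-actual comparison.
# Metric names and figures are hypothetical, not taken from TDRp templates.

metrics = {
    "Employees trained": {"plan": 5000, "actual": 4600},
    "Courses completed": {"plan": 12000, "actual": 12750},
    "Avg. participant rating": {"plan": 4.2, "actual": 4.0},
}

print(f"{'Metric':<26}{'Plan':>8}{'Actual':>8}{'% of plan':>11}")
for name, m in metrics.items():
    pct = 100 * m["actual"] / m["plan"]
    print(f"{name:<26}{m['plan']:>8}{m['actual']:>8}{pct:>10.0f}%")
```

Laid out this way, a shortfall such as the hypothetical training count above becomes an agenda item for the next conversation between learning and business leaders rather than a surprise at year-end.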

“HR leaders can get 90 percent of the value of TDRp by at least putting the process in place to have the right discussions with the right people, to get agreement upfront on what they want to accomplish for the year, and then actively manage throughout the year to make sure they deliver what they promised,” said David Vance, executive director at the Center for Talent Reporting in Windsor, Colorado, a nonprofit created in 2012 to better disseminate TDRp concepts.

The center, which has more than 250 members and 13 sponsors, offers step-by-step guidance on its website for implementing TDRp across all HR processes. It also conducts workshops and held its first conference, where practitioners shared how the implementation process has been going for them.

Those outside this endeavor see value in many TDRp concepts, but use alternative methods that they contend achieve similar results without requiring such a formal process.

Does TDRp Work?
Leslie Stein, director of learning strategy and planning at Bridgepoint Education Inc. in San Diego, said the TDRp process has helped identify which learning activities each of the company’s educational institutions should emphasize, depending in part on how well each activity can be measured. Working with the institutions’ subject-matter experts, her shared-services team uses industry-recognized measures and calculation formulas obtained from the center’s online library. If quantitative metrics can’t be developed, a “low, medium or high” contribution is reported.

“TDRp has also changed the questions we ask during our training needs assessment,” Stein said. “Our questions are much more targeted to help us identify the crux of the performance issue to ensure that our planned intervention will indeed be impactful in a measurable way.”

The process also revealed how underutilized Bridgepoint’s learning management system was, and now more complete data is gathered from the field. The institutions’ leaders are also more engaged.

“A while ago, we were considered order-takers, but now we’re seen as true business partners who improve employee productivity,” Stein said.

Carrie Beckstrom, vice president of learning and performance at Automatic Data Processing in Pleasanton, California, and a center board director, said the TDRp implementation has been going so well for two of ADP’s business units that it is being adopted across the enterprise. The key has been a gradual implementation — namely, omitting the forecasting columns in the first year.

The TDRp annual strategic planning process has helped the company’s learning function allocate resources to help achieve fiscal goals. Further, since TDRp demonstrated past success in meeting business outcomes, learning received additional funding to hire contractors to help develop a learning program for a new product.

Beckstrom said the process makes learning less susceptible to budget cuts, and she has been granted more autonomy to reallocate resources, such as hiring a data scientist well-versed in predictive analytics, “enabling us to determine possible outcomes of a particular action and make more-informed decisions.”

It would have been better if the learning function had developed a comprehensive communication plan at the outset, as many business line leaders don’t align learning with business outcomes, she said. “When it doesn’t exist, it makes it more challenging to implement, but it also provides an opportunity for us to serve in a consulting capacity to facilitate discussion.”

Karen Kocher, chief learning officer for global health services company Cigna Corp., said TDRp has helped to improve the company’s learning programs. In mid-2013, the learning function rated lower than desired on certain business outcomes, including productivity and management support. The Cigna learning team identified hundreds of programs with dated content, or that employees rated poorly, and either eliminated or replaced them.

By the end of the first quarter of this year, learning effectiveness and efficiency metrics had improved, increasing another four percentage points in the second quarter. “TDRp doesn’t solve all problems with aligning the learning function with business performance, but we’ve come much closer,” Kocher said.

Alternatives to Formalization
Implementing a formal process like TDRp might be tough if learning organizations don’t have the resources, in-house expertise or centralized operations, said Alicia Shevetone, vice president of strategic operations at Clarity Consultants. However, she said adopting a set of standard definitions and metrics that both learning and other leaders can use makes sense.

For example, leaders can agree to measure the time it takes employees to perform certain tasks using a new software application, compared with the time those tasks took before implementation. “But there are so many things that could affect that metric, including project management of that particular software installation,” she said. “That would make it harder for learning professionals to demonstrate some sort of ROI back to the company, how they were better for that training.”
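As a rough illustration of how such a before-and-after comparison could be turned into a number, the sketch below applies the common training ROI formula, net benefit divided by cost, to invented figures. It deliberately credits all of the time savings to training, which, per Shevetone’s caveat, a real analysis could not safely do:

```python
# Hypothetical before/after task-time comparison converted into a training ROI.
# All figures are invented for illustration; a real analysis would isolate
# training's contribution from other factors, such as the software rollout itself.

minutes_before = 30          # avg. minutes per task before the new software
minutes_after = 22           # avg. minutes per task after training
tasks_per_year = 40_000      # annual task volume across the trained workforce
hourly_cost = 35.00          # fully loaded hourly labor cost, in dollars
training_cost = 60_000.00    # total cost of the training program

hours_saved = (minutes_before - minutes_after) / 60 * tasks_per_year
benefit = hours_saved * hourly_cost
roi_pct = (benefit - training_cost) / training_cost * 100

print(f"Hours saved per year: {hours_saved:,.0f}")
print(f"Annual labor savings: ${benefit:,.0f}")
print(f"Training ROI: {roi_pct:.0f}%")
```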

Further, many leaders prefer to hear more qualitative than quantitative feedback on whether learning programs are affecting business outcomes, said Susan Dunn, a partner at Mercer in Portland, Oregon.

Leah Minthorn, acting director of North American learning operations at Boston-based Iron Mountain Inc., chose not to rely on an existing 15-page formal document full of metrics when she pitched an expanded peer coaching program to senior leaders in July. “I thought to myself, would they really read all of this? I didn’t think they’d want to be bogged down in minutiae.”

Instead, she presented leaders with a one-page executive summary briefly outlining what she needed and why, and had the supplemental material ready in case anyone wanted more details. Minthorn knew the C-suite already “got it” because the initial peer coaching program for front-line staff resulted in fewer document scanning errors and fewer worker injuries. The learning function subsequently wanted to expand that program to managers, and only a summary of updated metrics was necessary. Leaders would balk at monthly or quarterly reports: “It’s only when we need more money that they really care to see these metrics again.”

For traditional metrics about employee participation in learning activities, senior leaders can review weekly progress reports on the company’s intranet site if they want to. “But I don’t think it’s something we have to send in a formal document,” Minthorn said.

Rob Lauber, vice president at Yum Brands Inc. in Louisville, Kentucky, said he “can’t really get that excited about the idea of standardized statements” and only provides business outcomes metrics to stakeholders who ask for them. For example, no one asked him to measure the business outcomes of an e-learning course on the leadership principles outlined in Yum chairman and CEO David Novak’s book, “Taking People With You.” Instead, his team collected informal anecdotes of manager success stories.

However, Lauber said he might work with a business line leader to develop a metric that gauged an increase in sales after employees participated in a new learning method, compared with employees who learned using an older method. “In that instance, the TDRp process may be worthwhile, but I wouldn’t want to have a charted statement of accounts approach, showing income statements, L&D summary and programs,” he said.
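A minimal sketch of the comparison Lauber describes might look like the following; the cohort figures are invented, and a real analysis would control for territory, tenure and other confounders before crediting the new learning method:

```python
# Hypothetical sales-growth comparison between two training cohorts.
# All figures are invented; confounders are ignored for simplicity.

new_method = [8.2, 6.5, 9.1, 7.4, 8.8]   # % sales growth per rep, new method
old_method = [5.9, 6.1, 4.8, 6.4, 5.5]   # % sales growth per rep, old method

mean_new = sum(new_method) / len(new_method)
mean_old = sum(old_method) / len(old_method)

print(f"New method avg. sales growth: {mean_new:.1f}%")
print(f"Old method avg. sales growth: {mean_old:.1f}%")
print(f"Estimated lift: {mean_new - mean_old:.1f} percentage points")
```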

Steven Rath Morgan, who is responsible for Xerox’s global learning process and innovation in Washington, D.C., said only the most compelling metrics should be presented to senior leaders.

For example, certain metrics within the company’s competency development impact study that demonstrated the effectiveness of sales training opened the door for conversation with the C-suite on additional, incremental investments.

“In a large company, not every data point needs to be elevated to business leaders, but it is important for the learning organization to monitor learning’s alignment with the business and surface opportunities for improvement,” Morgan said.

It’s also important to make sure learning and development metrics and benchmarks use comparable measurements. “At the end of the day, business leaders are not as concerned about the ratios in and of themselves, but rather whether or not you are comparable to industry benchmarks and delivering against business needs,” he said.

TDRp Might Not Suit Informal Learning
Using the TDRp process to gauge business outcomes might be easier for standardized learning programs than it is for more informal learning, said Xerox’s Steven Rath Morgan.

For Xerox’s XstreamVideo and other user-generated content that employees use to share knowledge, learning and development reviews utilization, stickiness, and content quantity and quality. “But how do you demonstrate the business impact of informal learning and knowledge sharing?” Morgan said. “You can do it anecdotally and it can pass the sniff test, but how do you demonstrate proven results to report back up to the C-suite?”

Many learning and development professionals find it hard to reconcile informal learning metrics with the metrics within their learning management systems, said Todd Tauber, vice president of learning and development research at Bersin by Deloitte in New York.

“Companies at the front end of this process are looking for ways to connect those systems, or patch together ‘good enough’ systems to capture some of the impact — such as through simple, qualitative employee surveys and interviews to understand how and what they are getting out of it,” Tauber said.

Learning leaders should strive to develop metrics to demonstrate improved performance after training, he said. By looking at outcomes rather than inputs — which is what most learning management systems track — the data captured can be more complete and useful.

“I’m not really sure TDRp is the whole answer for reporting on those metrics,” Tauber said. “However, it does give L&D people some clear, simple-to-follow guidance on how to reframe the results for the business audience.”

Cigna’s Karen Kocher said her team has been able to incorporate informal learning metrics into its TDRp business outcomes reports. The team gathers employee ratings for learning assets, such as videos, that are built into the company’s learning systems, and then includes those ratings alongside other learning metrics. At the same time, it records whether the company performed well, based on income statements and productivity metrics.

“Granted, it’s a little intuitive whether or not the two connect together, but we think it does,” Kocher said. “If the learning activities were not rated very highly, that would not be the only piece causing the company not to perform well or the productivity scores to be low, but it certainly would be one indicator that our programs need to be given some serious reconsideration.”