In response to rapid change, L&D measurement is transforming. To stay ahead of the curve, learning practitioners must familiarize themselves with current trends.
by Site Staff
October 25, 2009
<p>In today’s volatile economy, the learning and development function has experienced its share of transformational change. Measurement in this area is no exception. To get a clear picture of where things are headed, let’s take a look at the emerging issues around learning and development measurement.</p> <p><strong>Making the Business Case</strong></p> <p>Organizations need a way to validate a formal measurement process. To achieve this, they must focus on three core areas.</p> <p>1. Performance gain: Look at the increase in human capital performance that is attributable to learning. Conservatively, a $1 billion organization can see $2.5 million in performance gain from basic training alone.</p> <p>2. Waste avoided: Companies today carry a great deal of training waste and unrealized value. If measurement can identify the causes, such as poor instructors or lack of manager support, it can reduce this waste significantly. In a $1 billion organization, proper measurement can readily recover at least $250,000 in scrap learning.</p> <p>3. Administrative savings: The non-value-added elements of measurement range from data collection to reporting. In a $1 billion company, it is conservative to expect more than $100,000 in savings from formalized measurement versus ad hoc measurement.</p> <p>A recent study by KnowledgeAdvisors and Bassi Investments found that a group of companies with high learning and development measurement acumen outperformed the Standard & Poor’s 500 Index in share price appreciation by more than 15 percent. </p> <p><strong>Dealing With Economic Uncertainty</strong></p> <p>It’s a given that organizations use training to become more competitive. But they can also use measurement to prove competitive advantage and to better manage an increasingly limited resource. 
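</p> <p>The back-of-envelope business case above can be sketched as a simple calculation. This is a minimal sketch: the three rates are illustrative assumptions chosen only to reproduce the article’s conservative figures for a $1 billion organization, not numbers from the study.</p>

```python
# Back-of-envelope business case for formal L&D measurement.
# The three rates below are illustrative assumptions, chosen only to
# reproduce the article's conservative figures for a $1B organization.

def measurement_business_case(revenue,
                              performance_gain_rate=0.0025,   # assumed
                              waste_recovery_rate=0.00025,    # assumed
                              admin_savings_rate=0.0001):     # assumed
    """Estimate the three value components of measurement, in dollars."""
    return {
        "performance_gain": round(revenue * performance_gain_rate),
        "waste_avoided": round(revenue * waste_recovery_rate),
        "admin_savings": round(revenue * admin_savings_rate),
    }

case = measurement_business_case(1_000_000_000)
# performance_gain: 2,500,000; waste_avoided: 250,000; admin_savings: 100,000
```

<p>The specific percentages matter less than the structure: plug in an organization’s own revenue and training data to scale the case.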
</p> <p>For example, basic measurement done consistently can identify underutilized e-learning resources, poorly performing instructors or curricula not supported by managers and vendors. Further, measurement sends a strong message to senior executives that the learning resource is being managed well. </p> <p>Most executives today realize their people are a strategic asset. As a result, training is being called upon to cultivate people to become more productive and competitive. For example, one company found that the economic downturn meant its junior personnel weren’t getting the same level of experiential learning on the job simply because there was less work. So learning staff created simulation programs to keep junior personnel engaged and sharp in lieu of on-the-job experiences.</p> <p><strong>Maximizing Informal Learning</strong></p> <p>Various factors have led to the growth of informal learning — and therefore the measurement of it as well. Informal learning includes high-touch tools, such as coaching or communities of practice, as well as high-tech tools, such as virtual knowledge sharing and performance support systems. </p> <p>Measurement is vital to the continuation and growth of such programs. After all, organizations that measure a tool’s quality and user satisfaction, along with its outcomes for the individual and the community, secure higher funding for informal learning than those that don’t. </p> <p>Quick yet effective measurement techniques for informal learning include surveys or evaluations conducted periodically on these tools, as well as Web analytics. Case studies that show how a part of an organization benefited from the tool are also helpful. For example, a sales organization that used a community of practice might want to articulate how it helped sales professionals find information to respond to proposals quickly. 
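</p> <p>These quick techniques can be combined into a single periodic report. A minimal sketch follows, pairing hypothetical pulse-survey responses for a community of practice with simple Web-analytics counts; every field name and number here is invented for illustration.</p>

```python
# Hypothetical periodic pulse-survey responses for an informal
# learning tool (e.g., a community of practice), plus monthly
# page-view counts from Web analytics. All data is illustrative.
from statistics import mean

survey_responses = [
    {"quality": 4, "satisfaction": 5, "applied_on_job": True},
    {"quality": 5, "satisfaction": 4, "applied_on_job": True},
    {"quality": 3, "satisfaction": 3, "applied_on_job": False},
]
monthly_page_views = [120, 150, 180]

report = {
    # Average perceived quality and satisfaction (1-to-5 scale assumed)
    "avg_quality": mean(r["quality"] for r in survey_responses),
    "avg_satisfaction": mean(r["satisfaction"] for r in survey_responses),
    # Share of respondents who applied what they learned
    "application_rate": sum(r["applied_on_job"] for r in survey_responses)
                        / len(survey_responses),
    # Crude usage trend from Web analytics
    "usage_trending_up": monthly_page_views[-1] > monthly_page_views[0],
}
```

<p>Even a report this simple, produced periodically, supplies the quality, satisfaction and outcome evidence that keeps informal learning funded.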
</p> <p><strong>Talent Development Integration</strong></p> <p>Learning and development is becoming more aligned with the holistic talent development function. One might argue that there are six core talent development processes: recruiting, learning, competency or talent management, leadership, engagement and performance management. As a result, measurement is taking a broader view, too. Dashboards and measurement strategies are no longer limited to learning, but are being framed in this broader human capital or talent development view.</p> <p>Awareness of broader human capital metrics is important to practitioners’ role in creating a high-performing workforce. For example, imagine that a learning team conducts a competency review and finds that financial acumen is the largest skill gap in the workforce. This data is shared with the learning and recruiting functions. The learning department could build a course to help employees read financial statements and do basic budgets, while recruiting could use the data to look for candidates with financial skill sets. </p> <p>Another example might be a talent manager in charge of leadership training who needs to identify high performers for the program. The manager must view performance evaluation data, competency data and transcripts to determine the best candidates. Therefore, metrics should go beyond learning metrics and take in the other talent development processes as well.</p> <p><strong>Increasing Manager Support</strong></p> <p>Measurement data typically indicates that 60 percent or more of training is never applied on the job. The root cause is less likely poor training and more likely a lack of manager support. Proper measurement in this area prevents unnecessary changes to the training itself, or dropping of the training altogether. 
Instead, it focuses attention on the barrier to impact and cultivates collaboration between training and the business on increasing manager support.</p> <p>For example, consider a telecom company that realizes the major cause of low impact is lack of manager support. It automates a pre-training survey that goes to the managers of employees who register for a costly leadership program. The survey asks the managers whether they have met with the learners to set expectations for the training, whether they have identified a project or program for applying it and whether they will allocate resources to support it on the job. If the responses indicate these conditions are not met, the learning function suggests the learner not attend the training at that time. </p> <p><strong>Turning Smile Sheets Into Smart Sheets</strong></p> <p>Learning and development organizations are realizing that transformational change in measurement is not always necessary or appropriate. Simply replacing a smile sheet with an evaluation that captures predictive indicators of quality and satisfaction, as well as effectiveness, impact, results, alignment, manager support and value, can produce a quick win. When benchmarks are applied to the smart sheet indicators, they become reasonable, credible, timely and powerful points for decision making. All the while, no significant resource changes are needed.</p> <p>Imagine an enterprise resource planning (ERP) technology company challenged to show actual business results from training its customers. The company deploys a smart sheet that collects self-evaluation scores for productivity, quality and customer satisfaction. The scores are then isolated to the training and adjusted for self-reported bias. Findings reveal a 20 percent increase in productivity due to training. This statistic can then be shared with the company’s customer base as a constructive data point to help customers understand the value training can provide. 
While not a precise hard number, it is a reasonable estimate that can be used to make more informed decisions.</p> <p><strong>Customizing Dashboards</strong></p> <p>A learning and development dashboard can serve as a single place managers go for concise summary data. Four core elements of a dashboard are: financial data, such as learning budget versus actual expenditure; operational data, such as completion percentage; performance data, such as training linked to business results; and cultural data, such as manager support. </p> <p>These four areas represent a balanced approach to dashboard models. Further, programs exist to integrate source systems with dashboard technologies to automate the compilation of this data so it can be effectively maintained.</p> <p>Consider the following example. At a major insurance corporation, a custom dashboard was developed to track financial, quality and cycle-time indicators. The performance metrics were based on what the learning function needed to manage the operation, as well as alignment with strategic initiatives. Once the dashboard was built, the organization piloted it for three months by manually entering actual results. After the pilot worked out a consistent and reliable set of data, the dashboard was ready to be automated, with the originating systems auto-populating it.</p> <p><strong>Keeping Integrity Through Integration</strong></p> <p>It is important that systems such as a human resource information system (HRIS) and an LMS feed the learning measurement systems in such a way that the integrity of the source data is maintained. Learning functions are using standardized XML schemas to extract data from the source system and import it into the measurement system. This can initiate an evaluation, import completed evaluation answer data, or import financial, operational or demographic data. 
Integrations require an upfront investment, but in the long run they produce significant efficiency gains while mitigating the risk of data integrity issues.</p> <p>At one international organization with three LMSs, each was integrated into a single learning measurement system and process. So although there are unique systems to register learners, all learners can be evaluated with a consistent, comparable evaluation and process. This greatly reduced the administrative burden of the three separate processes that would otherwise have existed.</p> <p><strong>Creating Actionable Metrics</strong></p> <p>A primary objective of measurement is to create actionable data. It’s not enough simply to report data; practitioners must instill a performance management process within the learning function to make the data actionable. There are four main ways to do this.</p> <ol><li>View the data in trends. This helps the user understand the general direction of performance. </li><li>View the data against challenging yet attainable goals. This helps the user understand proximity to performance expectations. </li><li>View the data against internal benchmarks. This helps identify best-practice areas and improvement opportunities within the function. </li><li>View the data against external benchmarks. This helps the user motivate a team by external example and provides a relevant point of reference for decision making.</li></ol> <p><strong>Taking a Six Sigma Approach</strong></p> <p>Six Sigma is a widely recognized technique for improving managerial processes. While it began in manufacturing operations, it quickly spread to back-office operations and has historically yielded significant results. </p> <p>Six Sigma follows a model called DMAIC: Define, Measure, Analyze, Improve, Control. 
A learning practitioner applying Six Sigma would first define a process defect, such as a poorly performing instructor. Then the practitioner would measure to quantify the defect, perhaps through an evaluation of the instructor’s performance. Next comes analysis; for example, he or she might compare instructors against internal and external benchmarks, trends and goals to understand the severity of the defect. The practitioner would then implement process changes in the form of improvements, such as coaching the poorly performing instructor to use more examples when teaching. Finally, controls are put in place. These are likely to be regular metrics that validate that the improvements are leading to positive change; in this example, a monthly review of the trend in evaluation scores. </p> <p>Unlike practitioners’ historic tendency to "prove," Six Sigma takes a self-critical approach to "improve." Improving over time can have significant benefits in transforming unrealized value into realized value.</p>
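<p>The Control step in the instructor example can be sketched as a simple monthly check. This is a minimal sketch: the scores and the goal are hypothetical, on an assumed 1-to-5 evaluation scale.</p>

```python
# DMAIC "Control" sketch: validate that coaching a poorly performing
# instructor is leading to positive change. Scores and goal are
# hypothetical, on an assumed 1-to-5 evaluation scale.

def control_check(monthly_scores, goal):
    """Report whether the latest score meets goal and the trend is improving."""
    scores = list(monthly_scores.values())
    return {
        "meets_goal": scores[-1] >= goal,
        "improving": all(a <= b for a, b in zip(scores, scores[1:])),
    }

evaluation_trend = {"Jan": 3.1, "Feb": 3.4, "Mar": 3.9, "Apr": 4.2}
status = control_check(evaluation_trend, goal=4.0)
# status: {"meets_goal": True, "improving": True}
```

<p>If either flag turns false, the practitioner loops back to the Analyze and Improve steps rather than letting the defect recur.</p>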