Learning analytics technology is an enabling tool that can help learning organizations better understand how to educate and develop employees, partners and customers. The building blocks of learning analytics technologies are the ways analytic data is collected, stored, processed and reported. Mapping out the points of automation for each of these is helpful when designing a learning analytics tool.
Data collection is how the organization obtains original data for storage, processing and reporting in the analytics tool. Online and e-mail-based data-collection devices can be used to improve reporting timeliness and decrease costs, though response rates may drop. For example, Eaton Corp.’s Eaton University uses e-mail to collect post-class, follow-up and manager evaluations, according to Joyce Gilman, operations manager. “This saves us significant time and cost while continuing to provide Eaton University a significant amount of data to analyze for continuous improvement opportunities,” she explained.
What if your organization cannot collect the data electronically? Scanning technologies can be leveraged to import raw data into analytics technologies. “Although we collect our learning evaluations in a paper-based manner, we have systematized the processing of the data via scanning technologies,” said Robin Killeen, a project manager for Discover Card. “The raw data is batch-imported into an analytics tool that processes and reports the data. As a result, our learning organization focuses on the results and quality of the training, not compiling numbers.”
It is important to be mindful of security, as well as data privacy and protection. Physical security controls should be in place to prevent unauthorized access, and network security is equally important. Data privacy and protection measures give respondents the option to submit data anonymously and tell them how their data will be used. Conforming to frameworks such as the U.S. Department of Commerce Safe Harbor program helps assure users that the data-collection site meets recognized privacy and security standards.
It is also important to ensure the credibility and integrity of the data. Hibernia Bank’s learning organization knows exactly the format and source of the real business results produced by its various systems and technologies, according to Robert Baer, design team leader. “We use this data to accurately and credibly link the results to training programs,” he said. “This makes the analysis well received by our senior management.”
Configurations and settings are a final consideration. Establish minimum configurations for commonly used browsers and operating systems. For example, supporting only Windows XP when the majority of the workforce still uses Windows 2000 is not a workable minimum configuration.
Data storage describes the environment used to maintain an organization’s learning analytics data. Data structure and use are significant technological considerations. GTECH, a provider of gaming and technology solutions, has a detailed outline of how its training courses fit into hierarchical curricula and programs. “Knowing that our analytics technologies can support how we organize our curricula in its data structure is very important,” said Margaret Lamb, a GTECH instructional designer. “This means that we can measure segments of our course offerings, as well as the total program. This strategy really helps us understand and manage our learning programs.”
Storing all collected data centrally in a single data warehouse is critical to the processing and reporting elements of analytics. Large IT hardware and software vendors often train high volumes of end customers through third-party partners around the world. Many leverage a third-party learning analytics tool that acts as a clearinghouse, gathering data from multiple decentralized locations and then storing it in a single database. This allows the vendors to access data easily and in a timely manner to better manage quality, performance and value aspects of training.
Analytics tools rely on large volumes of data to allow users to slice and dice in multiple ways. Each data point must be stored in its raw format in the database. Basic tools, such as a spreadsheet application, may have size limitations. Therefore, it makes sense to store the data using more powerful tools, such as online analytical processing (OLAP) systems. For example, New Horizons Computer Learning Centers has collected more than 1.5 million evaluations in the past few years. When responding to a recent request for proposal (RFP), the company was able to query the OLAP database and quickly find out that the prospective customer had received training from New Horizons more than 100 times and in 30 locations around the world. This made for a more impactful response to the RFP.
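As a rough illustration of this kind of lookup, the sketch below (in Python, with a hypothetical table and sample values, not New Horizons’ actual schema) stores each evaluation as a raw row and answers an RFP-style question with a single aggregate query:

```python
import sqlite3

# Hypothetical raw-evaluation store: one row per collected response, kept unaggregated.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE evaluations (
        eval_id    INTEGER PRIMARY KEY,
        customer   TEXT,
        course     TEXT,
        location   TEXT,
        class_date TEXT,
        overall    REAL
    )
""")
conn.executemany(
    "INSERT INTO evaluations (customer, course, location, class_date, overall) "
    "VALUES (?, ?, ?, ?, ?)",
    [
        ("Acme Corp", "Excel Basics", "Chicago",   "2004-03-01", 4.5),
        ("Acme Corp", "SQL Intro",    "London",    "2004-05-12", 4.8),
        ("Acme Corp", "Excel Basics", "Singapore", "2004-06-20", 4.2),
    ],
)

# RFP-style question: how often, and in how many locations, has this
# prospect already trained with us, and how did they rate the experience?
classes, locations, avg_rating = conn.execute(
    """
    SELECT COUNT(*), COUNT(DISTINCT location), ROUND(AVG(overall), 2)
    FROM evaluations
    WHERE customer = ?
    """,
    ("Acme Corp",),
).fetchone()
print(classes, locations, avg_rating)  # 3 3 4.5
```

Because every response is kept in raw form, the same store can answer location, course or date questions later without restructuring the data.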
Data retention coincides with storage. Over time, large amounts of training data can slow down even the most powerful databases. Establish a data-retention policy early in the analytics process and communicate it to users of the analytics technology.
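Once agreed, the policy can be enforced with a routine as simple as the following sketch, which assumes the hypothetical evaluations table from the example above and a three-year window chosen purely for illustration:

```python
import sqlite3
from datetime import date, timedelta

RETENTION_YEARS = 3  # illustrative policy value, agreed with users up front

def apply_retention(conn: sqlite3.Connection) -> int:
    """Move evaluations older than the retention window into an archive table."""
    cutoff = (date.today() - timedelta(days=365 * RETENTION_YEARS)).isoformat()
    # Create an empty archive table with the same columns, if it does not exist yet.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS evaluations_archive AS SELECT * FROM evaluations WHERE 0"
    )
    conn.execute(
        "INSERT INTO evaluations_archive SELECT * FROM evaluations WHERE class_date < ?",
        (cutoff,),
    )
    purged = conn.execute(
        "DELETE FROM evaluations WHERE class_date < ?", (cutoff,)
    ).rowcount
    conn.commit()
    return purged  # rows moved out of the live reporting tables
```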
Finally, technology comes with the risk of data loss or corruption. To mitigate this risk, backup procedures should be established: an hourly transactional backup of data is important, and nightly full backups are recommended. Ensuring that backups cannot be overwritten is a further precaution, and storing the backup files at off-site locations adds a layer of physical security.
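The nightly full backup might look something like the sketch below; it uses SQLite’s online-backup API purely as an illustration and assumes a hypothetical off-site directory. In a production RDBMS, hourly transaction-log backups would complement this full copy.

```python
import sqlite3
from datetime import datetime
from pathlib import Path

BACKUP_DIR = Path("/mnt/offsite_backups")  # assumption: a directory replicated off-site

def nightly_full_backup(live_db_path: str) -> Path:
    """Copy the whole analytics database to a timestamped file that is never overwritten."""
    target = BACKUP_DIR / f"analytics-full-{datetime.now():%Y%m%d-%H%M%S}.db"
    src = sqlite3.connect(live_db_path)
    dst = sqlite3.connect(str(target))
    src.backup(dst)  # consistent online copy, even while the database is in use
    dst.close()
    src.close()
    return target
```

Because each file name carries a timestamp, earlier backups are never overwritten, and the off-site directory satisfies the physical-security precaution.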
Data processing describes the actions taken on stored data to transform it into business intelligence for analysis. Data-formatting issues are a key consideration here. Understand the native format and origin of data before including it in an analytics engine. This will help in determining the right option for exporting and manipulating the data. For example, data in a relational database management system (RDBMS) is easy to de-normalize and analyze in other RDBMSs, whereas data imported for processing from a static Excel spreadsheet may be more challenging to upload and analyze.
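The difference shows up quickly in practice. The sketch below (using pandas, with hypothetical column names tied to the illustrative table above) pulls the same records from an RDBMS query and from a static spreadsheet; the spreadsheet path needs noticeably more defensive normalization before the data is fit for the analytics engine:

```python
import sqlite3
import pandas as pd

EXPECTED_COLUMNS = ["customer", "course", "location", "class_date", "overall"]

def load_from_rdbms(conn: sqlite3.Connection) -> pd.DataFrame:
    # An RDBMS source can be queried (and de-normalized) directly.
    return pd.read_sql_query(
        "SELECT customer, course, location, class_date, overall FROM evaluations", conn
    )

def load_from_spreadsheet(path: str) -> pd.DataFrame:
    # A static spreadsheet needs defensive handling: headers, types and missing
    # values must all be normalized before the data is fit for analysis.
    df = pd.read_excel(path)
    df.columns = [str(c).strip().lower() for c in df.columns]
    missing = [c for c in EXPECTED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Spreadsheet is missing expected columns: {missing}")
    df["class_date"] = pd.to_datetime(df["class_date"], errors="coerce")
    return df[EXPECTED_COLUMNS]
```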
Car manufacturer Audi worked closely with technologists to appropriately format data derived from its various source systems to provide a common file feed for the analytics technology. “This file feed is now being customized for import into an analytics tool. Once done, Audi will be able to analyze their data in a single platform with multiple filter options and make the analytics tool accessible to their instructors and designers for quality analysis,” said Annette Eagle-Dull, formerly of the Audi Academy.
Using sophisticated OLAP tools to process data increases the power of analysis. There are myriad OLAP tools that range in price and functionality. The key element is to build the right structures within the OLAP tool to allow a functional user to easily and self-sufficiently manipulate data. This often requires front-end interfaces built on top of these tools, which eliminates the need for IT departments to write the reports each time a user needs them. Nonetheless, OLAP tools are important in providing the ability to “slice and dice” data by key learning and organizational elements. Examples of filters in the learning space include: instructor, course, class, curricula, program, location, learning delivery, start and end dates, type of evaluation, question or question category, business unit and customer or client.
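Those filters map naturally onto grouping dimensions. The sketch below (hypothetical column names and values, not any particular vendor’s tool) shows the same flattened evaluation data sliced by instructor and course, then diced by delivery method:

```python
import pandas as pd

# Hypothetical flattened fact table: one row per evaluation response.
data = pd.DataFrame({
    "instructor": ["Lee", "Lee", "Ortiz", "Ortiz", "Ortiz"],
    "course":     ["Excel Basics", "SQL Intro", "Excel Basics", "Excel Basics", "SQL Intro"],
    "location":   ["Chicago", "Chicago", "London", "London", "Singapore"],
    "delivery":   ["classroom", "online", "classroom", "classroom", "online"],
    "overall":    [4.5, 4.8, 4.0, 4.4, 4.9],
})

# Slice: average score and response count by instructor and course.
by_instructor_course = (
    data.groupby(["instructor", "course"])["overall"].agg(["mean", "count"]).round(2)
)

# Dice: the same measure cut by instructor against delivery method.
by_delivery = data.pivot_table(values="overall", index="instructor",
                               columns="delivery", aggfunc="mean")

print(by_instructor_course)
print(by_delivery)
```

A front-end interface would expose the remaining filters listed above (location, dates, business unit and so on) as additional grouping or filtering dimensions on the same fact table, so functional users never need IT to write a new report.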
One leading professional services firm’s training operation views learning metrics data through an Internet-based OLAP tool that can compare performance across multiple attributes of the learning and business operations. According to a director of the firm’s global learning, queries can be run by filtering course, instructor and delivery components. They also can be filtered by line of service, subgroup and industry type, which helps the learning organization understand the impact education has not only on training attributes but also on key components of the business.
Nextel Communications Inc. also applies processing consistency in its analytics tools. “All learning metric data is processed from a single point of origin,” said Danny Brown, manager of evaluation and metrics. “This is made possible by the analytics technology’s preconfigured queries that reach back to the stored raw data to perform calculations, not from previously calculated data, which can skew results.”
Brown explained the importance of relying on analytics tools that conform to processing consistencies. “You need to be comfortable that the technology will always run the same query in the same way no matter what the data set or time frame is going to be,” he said. “Ensuring the queries are preconfigured across the technical architecture of the analytics tool to be consistent is important to system integrity.”
Technologists will need to understand the query languages and syntax of the various tools used to store and process the data (Access, SQL, Excel, SPSS, SAS, etc.). They must also be knowledgeable about extract, transform and load (ETL) processes. ETL is a data-integration function that extracts data from outside sources, transforms it to fit business needs and loads it into a data warehouse. This is where data is converted from one native format to another so that it can be processed and analyzed in an analytics tool. If technologists are not well versed in ETL and syntax, there is a higher risk of misinterpretation and misuse of data types in the analytics tools.
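A minimal sketch of the ETL pattern follows, assuming a hypothetical CSV export from a feeder system and the illustrative warehouse table used in the earlier examples:

```python
import csv
import sqlite3
from datetime import datetime

def extract(csv_path: str) -> list[dict]:
    """Extract: read raw rows from an outside source (here, a CSV export)."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize names, dates and score types to fit the warehouse schema."""
    cleaned = []
    for r in rows:
        cleaned.append((
            r["Customer"].strip(),
            r["Course"].strip(),
            r["Location"].strip(),
            datetime.strptime(r["ClassDate"], "%m/%d/%Y").date().isoformat(),
            float(r["OverallScore"]),
        ))
    return cleaned

def load(conn: sqlite3.Connection, records: list[tuple]) -> None:
    """Load: insert the cleaned records into the analytics warehouse."""
    conn.executemany(
        "INSERT INTO evaluations (customer, course, location, class_date, overall) "
        "VALUES (?, ?, ?, ?, ?)",
        records,
    )
    conn.commit()
```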
There are various feeder systems to a learning analytics tool, such as human resource information systems (HRIS) and learning management systems (LMS). Leveraging a standardized XML schema allows the automated extraction of data from one system and the import and storage of that data in another. For example, Caterpillar Inc.’s learning organization used a standard schema to export learner demographics and class attributes housed in its LMS to an analytics tool. “The IT organization was provided a standard XML schema by the analytics vendor to extract details from the LMS (registration, completion and demographic data),” said Veronica Schmeilski, the Caterpillar project manager on the integration. “Nightly, this data is sent via FTP to the analytics vendor, where it is automatically picked up and imported into the analytics tool that then initiates evaluation collection, storage, processing and reporting protocols.” The schema has been modified over time, as the business and its learning programs evolve, to further customize it to Caterpillar’s needs.
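A rough sketch of what such a nightly hand-off could look like follows; the element names, schema and FTP details are illustrative assumptions, not Caterpillar’s or the vendor’s actual specification:

```python
import io
import ftplib
import xml.etree.ElementTree as ET
from datetime import date

def build_export(records: list[dict]) -> bytes:
    """Serialize registration/completion records to an agreed XML schema."""
    root = ET.Element("lmsExport", {"generated": date.today().isoformat()})
    for r in records:
        learner = ET.SubElement(root, "learner", {"id": r["learner_id"]})
        ET.SubElement(learner, "course").text = r["course"]
        ET.SubElement(learner, "status").text = r["status"]        # e.g. registered, completed
        ET.SubElement(learner, "businessUnit").text = r["business_unit"]
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def send_nightly(payload: bytes, host: str, user: str, password: str) -> None:
    """Drop the export on the analytics vendor's FTP site for automated pickup."""
    with ftplib.FTP(host, user, password) as ftp:
        ftp.storbinary(f"STOR lms-export-{date.today():%Y%m%d}.xml", io.BytesIO(payload))
```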
Data reporting describes the formal presentation of the results of a processed request. Ease of use is a major consideration. This may seem functional rather than technical, but ensuring that users see results in an easy-to-interpret manner is important. Using tables and charts with appropriate language surrounding the results mitigates the risk of misuse. Charting software or other packages can appropriately display results. Defense Acquisition University (DAU) uses easy-to-read charts in red, yellow and green, which any user can quickly understand, to identify strengths and areas of opportunity. Mark Whiteside, DAU director of performance and resource management, ensures that easy-to-interpret reports can be run across multiple areas of DAU training. “We want to maintain our position as a world-class learning organization,” he said. “Ensuring we have data that can be quickly interpreted on the key elements of our quality process is critical. The format of our stoplight-type reports makes it easy to evaluate performance in an efficient and effective manner.”
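A stoplight report reduces to a simple banding rule. The sketch below uses made-up thresholds for illustration, not DAU’s actual cut points:

```python
def stoplight(score: float, yellow_at: float = 3.5, green_at: float = 4.2) -> str:
    """Map an average evaluation score (on a 1-5 scale) to a stoplight color.

    The thresholds are illustrative; each organization sets its own cut points.
    """
    if score >= green_at:
        return "green"
    if score >= yellow_at:
        return "yellow"
    return "red"

# Flag each course's average so reviewers can scan quickly for trouble spots.
course_averages = {"Excel Basics": 4.3, "SQL Intro": 3.7, "Project Mgmt": 3.1}
report = {course: (avg, stoplight(avg)) for course, avg in course_averages.items()}
print(report)  # {'Excel Basics': (4.3, 'green'), 'SQL Intro': (3.7, 'yellow'), 'Project Mgmt': (3.1, 'red')}
```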
Quality in the learning analytics technology ensures credibility. Changes to the reporting environment should first be made in a safe test site, then migrated to the production site, the official reporting site. This is done to ensure functionality and security are not adversely affected. Regression-test all new code releases prior to implementation. This means prior functionality should be tested, even if it’s not expected to be affected. This minimizes the negative impacts new functionality can have on existing functionality.
Aligned with the quality process is communication of maintenance windows and escalation paths for customer support. If the reporting application needs planned maintenance, communicate the scheduled downtime to users and make sure the change takes place during the least obtrusive time. Support the reporting application with technical help, and establish escalation paths that define the levels of support. For example, first-tier support could cover basic technical glitches that do not affect all users and that have workarounds; a more significant escalation could be an application outage that affects all users. The owners of the tool should establish a support process so users of the learning analytics technology can report problems, receive feedback and know the appropriate points of contact depending on the severity of the technical problem.
Learning analytics has emerged as a powerful business decision-making tool for learning operations managers. If built with the correct technological considerations, these tools can offer timely and accurate business intelligence to help answer questions about learning activities and performance.
The key with any analytics technology is to ensure it can automate as much of the data collection, storage, processing and reporting as possible. Doing the automation in a way that makes learning analytics practical, reasonable, cost-effective and repeatable is the benchmark for successful implementations of this technology.
Jeffrey Berk is vice president, products and strategy, KnowledgeAdvisors, and is responsible for designing and implementing the company’s suite of products and services. Scott Magee is vice president, technology, KnowledgeAdvisors, and has developed numerous systems and managed implementations in the corporate learning environment. They can be reached at email@example.com.