Airline travel is the safest transportation system in the world, and an airplane is one of the safest places you can be. You are a hundred times more likely to be struck by lightning on a golf course than killed in an airplane crash.
How did it get this way? It happened through the concept of black box thinking. Each airplane carries two black boxes: a flight data recorder and a cockpit voice recorder. When a plane crashes, these boxes usually identify the cause of the crash, and the analysis of their data drives changes in design, procedures, processes or training. Black box analysis is required by regulations and standards and is enforced globally.
Matthew Syed, in his book “Black Box Thinking: Why Most People Never Learn From Their Mistakes — But Some Do,” argues that black box thinking should permeate the health care industry. With dramatic examples, Syed describes the dire state of today’s health care system, where errors kill thousands of people every month. His book has prompted many discussions and inspired some action, but, due to resistance, little progress has been made in health care.
Scott Shackelford’s June 5 article in The Wall Street Journal, “We Need an NTSB for Cyberattacks,” suggested that black box thinking should also be brought into the cybersecurity industry. Shackelford asserts that every cybersecurity breach should be investigated, and that those investigations should lead to legal and regulatory changes, dramatically reducing the number of cyberattacks. Currently, when an attack occurs, the affected organization will investigate and adjust, but rarely do these events prompt industrywide changes.
The innovation field has adopted a different approach to mistakes. Failures are celebrated, and innovators are encouraged to make mistakes. While the intention is good — don’t let failure discourage innovation — some would argue the wrong message has been sent. Recent reports suggest that even venture capitalists think it is ill-advised to celebrate mistakes. A February Harvard Business Review article by G.P. Pisano, “The Hard Truth About Innovative Cultures,” addressed the issue, suggesting that mistakes should not be celebrated. Instead, they should be tolerated, and we should learn from them. The key issue is learning from mistakes.
Black box thinking is not a new concept. It is process improvement, and it has been practiced for years. Unfortunately, people don’t focus on process improvement. When a mistake occurs, it is usually ignored, hidden or diminished for various reasons.
We have our own track record of mistakes in the talent development field. It is not unusual to find multiple programs implemented for the wrong reasons. Consequently, these programs fail to deliver value. It is not surprising that these programs are not always supported by the management team.
That’s why it’s important to begin with the end in mind, with very clear business measures: select the right solution; set clear objectives, including impact objectives; collect data to make sure a program is relevant to the individuals involved, and adjust if it is not; ensure that learning is applied and delivers an impact in the workplace; and if it doesn’t, improve it.
For too long, learning and talent development teams have been slow to change their habits.
We need to design for success at the beginning and measure success along the journey to ensure it is delivered. Remember, we are evaluating the program, not the individuals. If it’s not working, there is something wrong with the program, not necessarily with the participants. When this becomes routine, we are optimizing the return on investment in talent development, and that makes a great case for more budget allocation.
In short, black box thinking is needed throughout the learning and talent development cycle.