Coaching, whether it’s life, executive, performance, skills or career coaching, seems inherently personal and humane. After all, it involves opening oneself up to feedback, conversation and, ultimately, connection. But what if your career or executive coach wasn’t actually a coach at all — or even human?
Artificial intelligence made its way into learning and development with the start of the Fourth Industrial Revolution, and more recently, it’s been showing up in the coaching space.
Traditionally, coaches are hired from the outside to coach within a company or are even part of a company’s internal human resources office. They may be tasked with retaining and developing top performers or asked to act as a sounding board for a strategic business matter. On a personal level and in the workplace, coaching is a useful tool that allows an individual to practice new skills, work toward achieving goals or simply boost their confidence.
AI-enabled coaching aims to do essentially the same thing, though it comes with its own set of benefits and challenges.
One of the first AI-enabled “coaches” — a chatbot named ELIZA — was created by Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory in the mid-1960s and programmed to simulate a session with a psychotherapist using basic natural language processing. When ELIZA was first introduced, some users thought they were actually chatting with a real human.
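ELIZA’s “intelligence” amounted to pattern matching and pronoun reflection rather than any real understanding. A minimal sketch of that technique (illustrative rules only, not Weizenbaum’s actual DOCTOR script) looks like this:

```python
import re

# Swap first-person words for second-person ones so the reply reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Each rule pairs a pattern with a response template; the captured fragment
# is reflected and slotted into the reply.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Replace each reflectable word, leaving everything else untouched."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    """Return the reply for the first rule that matches, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."
```

Even these few rules produce surprisingly conversational replies, which helps explain why early users mistook ELIZA for a person.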
AI-enabled coaching apps are increasingly common in the L&D space. Learning leaders who are interested in using an AI-enabled coach can choose from several, including LeaderAmp, Orai, Butterfly.ai, QStream, GiantOtter and, most recently, LEADx’s Coach Amanda. Coaching apps like BetterUp and Torch.io are mobile platforms that use AI to connect users with human coaches.
Learning leaders have a lot of questions to consider before adding an AI-enabled coach to their L&D repertoire. However, many of these questions should be the same ones they would ask about a learning tool that didn’t use AI, said Matt Barney, an industrial organizational psychologist and founder and CEO of LeaderAmp.
“What’s the business case? Where’s your organization trying to go?” Barney said. “Where are the gaps, and what part of that comes from people? The same sort of organizational diagnosis needs assessment that a mature, responsible CLO would do is front and center first, before they ever get to AI, because you could solve the wrong problem to the third decimal place with AI, and what’s the point of that?”
The Case for AI
Historically, coaching doesn’t scale well because it tends to be expensive (the average human coach can run $500 an hour, with the high end nearing $3,500 an hour) and most people don’t have enough time in their schedules to fit it in. As a result, coaching opportunities are often limited within organizations or only offered to high-level and C-suite executives.
But when rooted in data and science, AI can be a useful tool to address these issues, Barney said. After facing these problems himself as the chief learning officer at Sutter Health, and later as the VP and director of the Infosys Leadership Institute in India, he decided to launch LeaderAmp as a tech-based solution.
“LeaderAmp is a high-touch, high-tech combination approach to making coaching both more effective, because it’s more embedded in people’s daily lives, and more scalable, so it’s responsibly lowering the cost by relying on some new kinds of artificial intelligence,” he said.
Barney said two types of AI have been widely used to address the coaching industry’s problems: expert systems and deep learning “flight simulators.” The former functions like an assessment tool, allowing users to schedule reminders to reflect on a lesson or practice learned behaviors and skills. For example, LEADx’s Coach Amanda sends users daily notifications based on their completed personal assessment. The user is reminded to interact with Coach Amanda at least once a day, and perhaps pick a new goal to work toward.
Deep learning allows users to practice a skill or lesson in a low-risk environment, on their own schedule, Barney said. Last year, his team won an award for an instant persuasion coach, programmed to hold a back-and-forth conversation in which users practice being persuasive in any given situation, drawing on the research of psychologist Robert Cialdini. The coach can identify how well the user applied Cialdini’s persuasion principles and will even suggest improvements, so the user can practice persuasion before having to use it for real.
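To make the idea concrete, the simplest version of such a checker is a keyword heuristic over Cialdini’s six principles. This is a toy illustration with assumed keyword lists, not how LeaderAmp’s award-winning coach actually works:

```python
# Toy heuristic: flag which of Cialdini's persuasion principles a draft
# message appears to use, via simple substring matching. The keyword lists
# here are illustrative assumptions, not a validated instrument.
PRINCIPLE_KEYWORDS = {
    "reciprocity": ["in return", "favor", "owe"],
    "scarcity": ["limited", "deadline", "running out"],
    "authority": ["expert", "research", "certified"],
    "consistency": ["you said", "as agreed", "commitment"],
    "liking": ["we both", "in common", "appreciate"],
    "social proof": ["most people", "everyone", "your peers"],
}

def principles_used(message: str) -> list[str]:
    """Return the principles whose keywords appear in the message."""
    text = message.lower()
    return [principle for principle, words in PRINCIPLE_KEYWORDS.items()
            if any(w in text for w in words)]
```

A production system would replace the keyword lists with a trained language model, but the feedback loop is the same: detect which principles a draft uses, then suggest the ones it is missing.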
Last July, LEADx partnered with SurveyMonkey to ask U.S. managers questions about using an AI-enabled coach. The results revealed that 53 percent of managers didn’t want one, citing technophobia and a lack of confidence in the AI-enabled coach’s ability to teach lessons on soft skills and emotional intelligence the way a human coach can as reasons. However, most changed their tune after they had the chance to work with one.
“Any time you roll out an e-learning tool or an AI platform, you’re never going to get 100 percent who want it, just because they don’t want anything,” said Kevin Kruse, founder and CEO of LEADx. “I think the problem here is our understanding, our beliefs about AI and robots. We’re not all technical, we’re not all scientists, so it comes from the movies. Of course, traditionally in most movies the AI or robot is evil, mean. It’s killing us, chasing us or taking over the world.”
But despite AI lacking human elements such as emotions, Kruse said it can actually be very good at teaching soft skills and EQ because of its ability to send daily reminders about practicing a soft skill or following through on a behavioral action plan. Human coaches, like those working through BetterUp and Torch.io, can also use AI to regularly track and flag changes in a user’s behavior because of how it collects that data.
“As long as it’s grounded in the science, which suggests why a soft skill works and how to learn it, then it’s completely appropriate to use with AI because even a very expensive coach cannot be with a person they’re coaching all the time,” Barney said. “But the AI can. We know the more a person practices, the better, whether it’s a soft skill or some other skill.”
Where Human Coaches Shine
There are many ways, however, in which human coaches continue to excel beyond the capabilities of AI-enabled ones.
For starters, building out AI requires a lot of data. When there’s enough, AI can be very good at mimicking the coaching process, which is why a human and an AI-enabled coach can appear to be doing similar jobs. But the “conversation” between an AI and a human user is still only surface-level. Kruse said no form of AI created yet can simulate a real, deep, back-and-forth coaching conversation.
Soulaima Gourani, career development expert and co-founder of Women Reignite, said AI-enabled coaches won’t be capable of listening, understanding and asking questions like a human coach for at least 10-15 years.
This means that personalizing the AI experience is critical to its effectiveness, Kruse said. When setting up the LEADx app, Coach Amanda users take personality and strength assessments that help ensure the coaching is tailored to them as much as possible.
However, when the AI doesn’t have enough of the right type of data to learn from, its coaching attempt can completely miss the mark. Barney said collecting data is a long and crucial process that can also be very expensive, and the industry needs a better approach that doesn’t require building AI using so much data.
“Not to disparage coaches, but sometimes coaching doesn’t work, and other times it actually backfires and makes people worse,” Barney said. “So, we don’t want to scale with AI something that’s not grounded in something that we really know works.”
Additionally, machine learning can produce biased AI. Because AI is subject to what it learns, it can pick up bias from skewed data, Barney said, referencing instances reported in the media in which AI-enabled chatbots from large tech companies such as Google or Microsoft had to be shut down after “learning” racist and inflammatory language from tweets and GroupMe and Kik messages.
Another issue inherent in AI-enabled coaching is finding what Barney called the “Goldilocks Zone.” In other words, even when based in science, the coaching still needs to be appropriate for each individual — “Not too hard, not too easy, but just right for each person on a mass, scalable level,” he said.
“Take swimming as an example. It would be crazy for an AI to suggest to a non-swimmer to jump into the deep end. And conversely, it would be almost as bad for an AI to tell any Olympic gold medalist swimmer to go blow bubbles in the children’s pool.”
AI and the Future Coaching Space
In a 2019 survey of 305 global executives titled “Human AI Is Here,” conducted by Forbes Insights in collaboration with Accenture, Avanade and Microsoft, companies using AI reported more significant growth than companies that did not.
Gourani said she expects a huge rise in popularity of AI-enabled coaches in the near future, as many people are already looking for good, unbiased coaches — adding that a robot only has bias if it has been coded incorrectly.
“Robots can also actually be very nice,” she said. “A lot of robots out there already are actually programmed to educate kids with learning disabilities.”
Kruse said he believes AI will be able to begin replicating a real coaching conversation on specific topics within the next five years. Still, he would recommend a human coach to a manager in most circumstances, provided one can be afforded.
As AI makes itself a permanent fixture in the coaching space, human coaches needn’t worry about the industry becoming fully automated anytime soon. In fact, human coaches should prepare instead for further collaboration with AI, Barney said.
“The human coach experience is still important, but the AI can complement that by making sure that the goals are appropriate and that the suggestions are embedded into their daily lives,” Barney said.