Posted by Caveo Learning ● January 22, 2019

Are you answering the question, “Did training work?”

This is part of our ongoing series, Interviews with Learning Leaders.

Kevin M. Yates, Learning Technology Manager at McDonald's, uses his expertise to answer the question, “Did training work?” with facts, evidence, and data. Kevin's work spans industries around the globe, and he is a sought-after subject matter expert and international speaker. His experience reaches across training, learning, and talent development, having served in roles for facilitation, instructional design, learning solutions, learning technology, program management, operations, strategy, measurement, and analytics.

Kevin's guiding principle is, “Find one thing about a person's behavior or performance you can attribute to training or learning, and let that lead to the facts about impact.”

How did measurement and metrics become a focus for you?

I was at a crossroads in my career a few years ago. After being in L&D for nearly 20 years, I asked myself the question, “What difference have you made in your profession?” The programs I designed and led received favorable feedback. I was recognized as being a high-impact L&D leader, but I questioned the tangible impact of my work on people’s performance and business goals.

Around the same time I was reflecting on my career, I met Dick Handshaw and Patti Phillips. Dick Handshaw is a world-renowned expert on human performance and shaped my thinking about being an L&D practitioner who uses L&D solutions to impact human performance. Patti Phillips, who founded the ROI Institute along with her husband Jack Phillips, showed me that there is a credible, reliable way to measure the impact of training.

This was a pivotal moment in my career. I literally shifted my entire focus to answering the question, “Did training work?” and using measurement to collect facts, evidence, and data that answers the question. I am now laser focused on investigating training and learning’s impact on human performance and business goals with measurement, metrics, data, and analytics.

How would you describe your work?

When I engage with senior leadership and stakeholders, I don’t talk about “training.” I avoid using the “T” word altogether, and I don’t talk about topics like adult learning theory, the four levels, learning styles, and the “stuff” executives really don’t care about.

I ask questions about business goals. I have a conversation that connects me with the strategy the business has set forth to win. I’m asking what skills, behaviors, and capabilities people need to help the business win. And most important for me, I’m asking how the business is measuring what it means to win.

It’s a different kind of conversation than what L&D is used to. We’re used to taking the order for training where it sounds something like, “I’ll take one web-based training, two eLearnings, and a side order of classroom training. And make that to go please.” I’m changing the conversation, so I can focus on discovering how to align learning strategy with business goals for measurable outcomes.

You say, “Find one thing about a person’s behavior or performance you can attribute to training or learning, and let that lead to the facts about impact.” Tell us more.

L&D fulfills its highest purpose when it helps people perform in ways that help them and the organizations they serve win. This is what I believe to be inherently true about the work we do. If we aren’t doing this, I question what we’re doing.

For me, it’s as simple as being clear about performance requirements that help the business achieve a goal. When we’re clear about performance requirements, we can measure what that looks like. If we’ve purposefully created training and learning solutions aligned to performance outcomes for achieving business goals, we can measure that.

Here’s my path to successfully finding one thing about a person’s behavior or performance that you can measure and attribute to training and learning:

  1. Identify and be clear about the business goal.
  2. Identify the skills and behaviors that help people achieve the goal.
  3. Create a plan for collecting facts, evidence, and data that shows the extent to which people are using skills and behaviors that are critical for achieving goals.
  4. Design learning experiences and solutions that build the skills and behaviors that help people achieve business goals.

Did you notice that I started with the business goal as step one? That’s where I believe most L&D teams miss the opportunity for measurable outcomes. They start with training first.

We have to start with business goals first, link that to performance expectations for achieving business goals, and let that inform decisions about training and learning solutions. If I can find at least one thing about a person’s behavior or performance that comes from training and learning purposefully designed for a performance outcome tied to business goals, I can show impact for training and learning.

What’s the difference between learning data and business data, or are they the same?

I love this question! This is where I believe we have an opportunity to be clearer about what we call learning data versus what we call business data. Not all data are created equal.

For me, learning data is what we get from our post-program learning evaluations, LMS, LRS, and other learning technology and platforms. This is where we get data that answers questions about operational efficiency and effectiveness for L&D.  

Operational data for L&D tells one part of the story. Business performance data tells another story and is the one stakeholders and senior leaders are most interested in hearing.

For me, the story is in the combination of business performance data, people performance data, and learning data. I’m looking for correlations among the three and whether or not anything changed as a result of training. And truth be told, it’s not always easy.

The reality is there are times when I scratch my head, unsure where the truth lies, overwhelmed by the amount of data I have to work with, and it all just seems too difficult. But let’s be clear: difficult does not mean impossible.

I am driven by the idea that you can indeed use measurement to get facts, evidence, and data for the impact of training and learning. There’s always a story, be it good or bad. Sometimes the story is that training made zero impact on performance and business goals. Sometimes the story is that goal could not have been reached without training.

The ability to combine learning data with business data to tell L&D’s story can be tricky. I am constantly working to find the right mix of data sources to tell our story.

Why don’t more L&D teams measure impact?

There are many reasons, but I believe it boils down to three, and I actually made a video about this. I believe it’s because of accountability, capability, and deniability. And I’ll add mindset to that mix as well.

I guarantee you that if the CEO says to the L&D team, “I want to see data that shows how training and learning are changing people’s performance and helping us achieve business goals,” the L&D team will measure impact and produce that data. I’m not seeing L&D held to that level of accountability.

I see other parts of the organization held to that level of accountability, but not L&D. My theory is that many CEOs and senior leaders have not seen L&D’s story told with facts, evidence, and data that show impact and don’t know it’s possible. Add to that the fact that training has traditionally been viewed more as a service and less as a driver of results. I predict that when expectations for accountability change, so will L&D’s priority for measuring impact on performance and goals.

There’s also a capability gap on L&D teams for measuring impact and producing evidence for impact. We do a great job at measuring how many people we’ve trained, how many courses we’ve offered, and the different types of modalities through which we offer training. That’s pretty easy to do. Measuring impact is something totally different.

Measuring impact is an art, a science, and a skill—it is not a traditional capability for L&D. I’m excited to see more and more L&D teams either building the capability from within or hiring the right talent to focus on measurement, data, and analytics for training, learning, and talent development.

The last reason more L&D teams don’t measure impact, and probably the scariest, is plausible deniability. As long as there’s no evidence that shows training is not making an impact, L&D can hide behind the tangible evidence of training classes, web-based courses, the LMS, etc., as the results of our work. We can deny the truth when training did not work.

I have a guiding principle that comes from Linda Hainlen, a measurement and evaluation expert I met a few years ago. She says don’t be afraid when measurement tells you training didn’t work. Be afraid of not doing anything with the results. I say, let’s stop hiding behind plausible deniability and start using data and measurement to inform decisions that help us deliver high-impact training and learning solutions.

What are you most excited about when it comes to measurement, data, and analytics for L&D?

When I started my career in training nearly 20 years ago, we weren’t talking about impact. We weren’t focused on performance. And we certainly were not using measurement, data, and analytics to inform decisions about learning strategy.

I’m excited because the game is changing. Not only are we talking about measurement, data, and analytics for L&D, we’re doing it! It’s a slow shift, and there aren’t many L&D teams doing it, but it’s happening nevertheless.

I see more thought leadership and solutions for measuring impact. I see tracks specifically for measurement, data, and analytics at workshops and conferences. And I see the emergence of roles on L&D teams focused on measurement, data, and analytics.

If you were to ask me to make a prediction, I’d say that 10 years from now, using measurement, data, and analytics to measure impact and inform decisions will be deeply embedded in L&D teams. I’m excited to see what the future holds. And I’m excited to be one of the voices crying in the wilderness, proclaiming the value of facts, evidence, and data for showing the impact of training, learning, and talent development on people’s performance and organization goals.

Connect with Kevin: LinkedIn · Twitter · Instagram · Facebook

Interested in measurement, reporting, or management of L&D or HR? Don't miss CTR Week 2019! 


Topics: Metrics & Measurement, Interviews with Learning Leaders