Justify the Learning Ritual

Be ready to answer questions about your L&D practices.

By Elliott Masie

Elliott Masie is CEO of The Masie Center, an international think tank focused on learning and workplace productivity, and chairman and CLO of The Masie Center’s Learning Consortium.


CLOs — be prepared! You and your team may be asked to justify some of your most familiar rituals with evidence and business data. Business leaders and even boards of directors are looking for radical shifts in approaches and processes — and learning and training are ripe targets to be examined.

Here are a few learning ritual challenges from my conversations with senior executives in the past year, as well as considerations for responding should you be presented with similar challenges in your organization.

Does leadership training actually create and keep better leaders — with better business results?

If you are presented with this question, be prepared to examine the concrete skills, competencies and readiness levels that your leadership programs yield. Then look at the six-month, one-year and three-year performance patterns of graduates of your leadership academies.

Imagine running a three-arm experiment with the next set of candidates: a third go through your current program, a third are given a grant to buy their own leadership programs externally and a third receive no program at all. What are the differences in their performance?

Consider the timing of when a leader is trained (upon promotion, early in their career as a high potential or perhaps one year into a leadership role). Additionally, ask whether you should separate the “induction” dimensions of welcoming people into the leadership ranks from more focused skill development aimed at observable shifts in competencies and readiness.

Does tracking learning help learner engagement, and do we use the data to improve business results?

At the Masie Center, our LMSs collect a massive amount of data about what every learner selects from our formal learning offerings. But we are not tracking most of the content, context, collaboration and resources that workers access from other sources. And most organizations are not using the data from the LMS to radically improve learning options, personalize learning for a specific employee or compare the impact of one program versus another. We track consumption but rarely use learning systems to monitor impact.

It’s worth considering whether tracking the microlearning choices of an employee helps or hinders their natural curiosity. What if employees were aware that their bosses were looking at the web searches they did throughout each day? I would imagine more searches would be made from personal smartphones.

Be prepared to defend or reframe the role that your LMSs have in driving business results.

Do live webinars achieve higher engagement and better business results?

Most organizations have a default duration for live webinars, regardless of content or complexity: typically one hour, with only a few activities that take advantage of employees' actual live presence. What if we substituted asynchronous segments for live webinars? Durations could be stacked for overviews, basics or deeper content, allowing each learner to select the optimal timing and depth of material.


Also consider: If learners could answer a few quick predictive questions to demonstrate understanding, how many hundreds of thousands of wage hours might a large enterprise save?

Once again, imagine a split test project with three different versions of content: live webinar, asynchronous only and a blended model. Compare the participation, retention and actual business applications/results that each version yields.
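The tallying side of such a split test can be surprisingly lightweight. As a minimal sketch, the comparison might look like the following — the cohort names and retention scores are entirely hypothetical, stand-ins for whatever participation, retention or business metrics your organization actually collects:

```python
from statistics import mean

# Hypothetical post-program retention scores for three cohorts.
# Names and numbers are illustrative only, not real data.
cohorts = {
    "live_webinar": [0.72, 0.65, 0.70, 0.68],
    "async_only":   [0.61, 0.66, 0.58, 0.63],
    "blended":      [0.75, 0.78, 0.71, 0.74],
}

def compare(cohorts):
    """Return (cohort, mean score) pairs, best-performing cohort first."""
    means = {name: round(mean(scores), 3) for name, scores in cohorts.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

for name, score in compare(cohorts):
    print(f"{name}: {score}")
```

A real analysis would of course add sample sizes, significance testing and longer-term business outcomes, but even a simple ranking like this gives the conversation with executives a data point rather than an opinion.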

CLOs will likely be asked to respond to additional questions in the near future, as well: To what extent are our learning programs used by workers who are not meeting work expectations? Are many of our programs attended by motivated and already engaged workers? What are the demographics of those who participate versus those who don’t?

How do we test for potential hires’ willingness to learn? What are our metrics for tracking the success (or failure) of line managers in supporting transfer of new skills to business practice? Who in the learning organization has the analytical data skills to drive shifts in assessment and follow-up strategy? How do we leverage the knowledge of retiring employees to impact business results?

These questions are coming. Let’s be ready and open to answer them.