Chasing the Return on Learning: Shifting from Program-Based to L&D as a Service Delivery Function

Apr 19, 2023

From the 1800s, when training consisted mostly of apprenticeships and on-the-job learning (think smiths and artisans), to the 2000s and the use of social media and networking as learning delivery tools, workplace learning has come a long way indeed. Technology continues to disrupt workplace training and development – from e-learning to gamification and augmented reality – pushing us to stay relevant to a new generation of learners and to align with ever-changing business needs.

Training experienced a ‘boom’ period in the twentieth century. It was during this period that Donald Kirkpatrick developed the now-ubiquitous ‘Four Levels of Learning Evaluation.’ The concept grew out of his research in the 1950s but gained much wider popularity with the publication of his book ‘Evaluating Training Programs’ in 1994.

In the book, Kirkpatrick outlines four progressive levels for measuring training program effectiveness: Level 1 (Reaction) measures overall learner satisfaction, Level 2 (Learning) measures the increase in knowledge and skills, Level 3 (Behavior) measures behavior change (true learning), and Level 4 (Results) measures the impact of training on the bottom line.

The Four Levels of Evaluation remain a widely embraced framework, used by many companies and learning professionals to this day.

In 1977, Jack Phillips proposed a complementary add-on to the framework – a fifth level of evaluation for measuring Return on Investment (ROI) in training.

Phillips asserts that when we implement a training program, a chain of impact unfolds as newly learned skills and knowledge are applied on the job, which translates into business impact and ultimately ends in ROI.

Though many find value in measuring Levels 3–5, very few organizations actually do so. Measuring training program effectiveness can get cumbersome, especially at the higher levels. With training budgets continually slashed and delivery demands on the rise, learning teams find it ever harder to carve out time for processes that capture every level of evaluation for all their programs.

Some of it has to do with the way we view training in the workplace. Most of the time we operate under the premise that we are only as good as the last (successful) program we have run. We fail to acknowledge that Training is a Service Delivery Function. Our existence is dependent on the value we bring to Operations and other business units – as a service and not as a set of programs.

Not to diminish the value of the Levels of Evaluation – only that the obsession with calculating ROI has Training departments in a spin, justifying their very existence. Some programs do require an ROI analysis (especially when a significant cost is attached), but that analysis should be baked in at the beginning of the project, checked at mid-point, and checked again after implementation. To do otherwise is not an ROI practice but cost justification. Think of the term itself: Return on Investment – ROI analysis happens before you invest. It is an aid in decision-making: should I invest or not?
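The arithmetic behind Phillips’ fifth level is simple; the hard part is isolating the benefits genuinely attributable to training. A minimal sketch of the calculation, with hypothetical figures:

```python
def training_roi(total_benefits: float, program_costs: float) -> float:
    """Phillips' Level 5: ROI (%) = net program benefits / program costs x 100."""
    net_benefits = total_benefits - program_costs
    return net_benefits / program_costs * 100

# A program costing 100,000 expected to yield 150,000 in attributable benefits:
roi = training_roi(total_benefits=150_000, program_costs=100_000)
print(f"Projected ROI: {roi:.0f}%")  # Projected ROI: 50%
```

Run against projected figures before you invest, then recheck the same calculation at mid-point and after implementation – which is exactly the sequencing the paragraph above argues for.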

In his book ‘The Training Measurement Book: Best Practices, Proven Methodologies, and Practical Approaches,’ Josh Bersin recommends measuring training as a support function, alongside your current standard evaluation processes.

Bersin uses the IT department as an example – would we ask it to measure the ROI of the company’s e-mail system? No – we assume its value. Instead, we measure functional value through the level of support provided, alignment to business goals, high levels of customer service, contribution to overall business strategy, and so on.

An overall measurement process can run alongside your current evaluation practices and capture a service delivery viewpoint. Some basic principles to help you get started:

1. If you must compute ROI for your training programs, do so before program development. Start by ensuring alignment: use a business sign-off form indicating:

a. A description of the business problem stated in business terms (for example, we are 15% below target sales).
b. Quantified loss if the problem is left unresolved (or potential gain if solved). For example, this is costing the business 5M in quarterly sales.
c. Timelines for solving the problem (e.g., this needs to be resolved by the end of the year).
d. If applicable, the budget allocated by the business unit to solve the business problem.
e. Identified training audience.
f. Sign-off by the manager, director, or VP of the business unit requesting the training.

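If your team tracks these requests in a shared system, the sign-off form above maps naturally to a structured record. A sketch, with hypothetical field names mirroring items a–f:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingSignOff:
    business_problem: str    # a. stated in business terms
    quantified_impact: str   # b. loss if unresolved / gain if solved
    timeline: str            # c. deadline for solving the problem
    budget: Optional[float]  # d. budget allocated, if applicable
    audience: str            # e. identified training audience
    approved_by: str         # f. manager/director/VP of the requesting unit

# Example drawn from the figures above:
request = TrainingSignOff(
    business_problem="We are 15% below target sales",
    quantified_impact="5M in quarterly sales at risk",
    timeline="Resolved by end of year",
    budget=None,
    audience="Regional sales teams",
    approved_by="VP, Sales",
)
```

Treating the form as data also makes it easy to audit later which programs actually had business sign-off before development began.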
2. Move beyond the existing program evaluation models. Kirkpatrick developed his model at a time when training organizations consisted mostly of trainers. Nowadays, delivery may be only half of the services our function offers, as we move into performance consulting, technology management, and content development.

3. Measuring Training as a Service Delivery Function is a Process, not a Project. Think of it as a continuous journey. Start with one step today, and build as you go. In our organization, we started with Quarterly Stakeholder Surveys focusing on alignment, collaboration, and satisfaction. Some sample statements include:

a. The L&D Team provides a valuable service to our team.
b. The L&D Team consults and gathers input from the appropriate subjects before making recommendations.
c. The L&D Team shows the ability to solve problems economically, with optimal use of resources and time.
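If statements like these are rated on a 1–5 Likert scale, the quarterly responses roll up into a score per dimension that can be trended over time. A sketch with made-up data:

```python
from statistics import mean

# Hypothetical quarterly responses per survey dimension (1-5 Likert scale)
responses = {
    "alignment":     [4, 5, 4, 3, 5],
    "collaboration": [5, 4, 4, 4, 3],
    "satisfaction":  [4, 4, 5, 5, 4],
}

# Average each dimension to get the quarter's score
scores = {dimension: round(mean(ratings), 2)
          for dimension, ratings in responses.items()}

for dimension, score in scores.items():
    print(f"{dimension}: {score}")
```

Comparing these averages quarter over quarter – rather than program by program – is what shifts the measurement from individual courses to the function as a service.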

We aim to show the value of Learning and all the services we offer as a whole. Think of your stakeholders first and foremost, and you can never go wrong. At the heart of all that we offer, the Return on Learning will always be positive when we put our customers front and center.

Want to learn more about evaluating the effectiveness of training? Sign up for our program Measuring the Value of Learning, offered through the Philippine Society for Talent Development in May and July 2023.

 

References:

Ferriman, J. (2016, April 6). History of Training and Development. Retrieved from https://www.learndash.com/history-of-training-and-development/
Camm, B. (2011, August 19). Training Evaluation: Jack Phillips and ROI. Retrieved from https://www.dashe.com/blog/evaluation-2/more-on-re-evaluating-evaluation-jack-phillips-and-roi/
Bersin, J. (2008). The Training Measurement Book: Best Practices, Proven Methodologies and Practical Approaches. San Francisco, CA: Pfeiffer.

About the Writer

 

ANNA MAY VILLALUZ

Learning & Development Manager,
Vista MNL
