In response to my blog post titled “Kirkpatrick’s Four Levels of Training Evaluation: A Critique,” Wendy Kirkpatrick wrote a comment directing me to a white paper that she co-authored with Jim Kirkpatrick, “The Kirkpatrick Four Levels: A Fresh Look After 50 Years 1959 - 2009.” That document describes updated thinking that the Kirkpatricks call “The Kirkpatrick Model.” They’ve added some critical elements to the original “Four Levels,” including: starting evaluation planning by clarifying the intended results of training; taking into consideration what happens before and after a training event that facilitates learning; the supervisor/mentor/coach partnership in learning; and the role of “organizational drivers” in achieving results. I applaud these additions to their model; they are all factors that Rob Brinkerhoff and I described in our 1994 book, The Learning Alliance: Systems Thinking in Human Resource Development.

Then, as now, Rob and I have great respect for what the Kirkpatricks have done to heighten awareness of the importance of evaluation to the field of workplace learning. However, a problem persists in organizations in the U.S. and around the globe: because people have a label for something, they believe it has value. So, by labeling a measure “Level One,” “Level Two,” “Level Three,” or “Level Four,” training managers believe they are collecting useful information when, in fact, they most likely are not. I wish I had a dollar for every time I’ve asked training managers about evaluation and they have said to me, “We do Level Ones and, when we can, Level Twos.”

I concede that any measurement of training has some reinforcing effects on learning and, by bringing organizational attention to the training, may have internal political benefits as well. But until training and learning specialists measure the impact that a learning intervention has on the business and evaluate the organizational factors that either drive those results or prevent them from occurring, organizations will continue to learn little from evaluating training.