Most of us use some form of testing at the end of each training initiative we offer. That testing is often done on paper at the end of a traditional lecture/reading course, and electronically at the end of most media courses.

We tend to look at the testing results as “proof” of a particular training initiative’s effectiveness. “Leslie” scored a 43 on the pre-test and an 81 on the final exam; ergo, “Leslie” has learned a lot!

Likewise, vendors often trumpet the pre- vs. post-test scores associated with their media courseware as “proof” of the superiority of their offerings.


Short-term retention means close to nothing — if you are interested in skills acquisition that actually results in markedly increased on-the-job performance!

Without question, all of us need to look much more carefully at the efficacy of testing in the training environment.

Initially, you should recognize that courseware incorporating truly meaningful exams is built around a bank of questions that are randomized but that still cover every learning objective with at least one question.
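To make that idea concrete, here is a minimal sketch of how such an exam might be assembled. The question bank, objective names, and questions below are all hypothetical placeholders, not from any real courseware; the only point illustrated is the two requirements above: one question per learning objective, presented in randomized order.

```python
import random

# Hypothetical question bank keyed by learning objective; each value is
# the pool of candidate questions written for that objective.
QUESTION_BANK = {
    "laser-alignment-setup": [
        "Which fixture holds the laser emitter during setup?",
        "What must be cleaned before mounting the brackets?",
    ],
    "soft-foot-check": [
        "How is soft foot detected with a dial indicator?",
        "Which machine feet are checked first?",
    ],
    "tolerance-reading": [
        "How do you read the final offset value from the display?",
    ],
}

def build_exam(bank, rng=random):
    """Draw one question at random from each objective's pool,
    then shuffle the overall order of the exam."""
    exam = [rng.choice(questions) for questions in bank.values()]
    rng.shuffle(exam)
    return exam

exam = build_exam(QUESTION_BANK)
```

Because one question is drawn from every pool before shuffling, no objective can be skipped no matter how the randomization falls out.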

Usually, it would be wise to follow up that exam with a hands-on test that gives you a clearer picture of how well learned knowledge transfers into correct work practice. The following are some suggestions you might use to get a more accurate picture of the longer-term effects of your training initiatives:

On-the-Job Follow-Up:

The trainee and the appropriate supervisor are given a copy of the performance checklist for each completed lesson. The supervisor assigns the trainee to perform the tasks on the checklist and evaluates that performance. Through discussion, the supervisor can then augment the training activities with site- or equipment-specific identification and information.

Shop/Laboratory Activity Integration:

Shortly after the completion of a lesson, a shop/lab activity is conducted allowing trainees to practice the activities covered in the lesson. For example, your trainees finish a lesson on laser alignment and then go into a shop to practice laser alignment. That shop/lab activity can then be followed up with a hands-on performance test administered by the instructor.

Follow-Up Testing:

If you are truly interested in measuring longer-term retention, you might well want to repeat the initial testing 6-9 months after the completion of your training initiative. If nothing else, it will open your eyes to how misleading the results you garnered from that first post-test were. (If “Leslie” scores well this second time, you can be fairly confident that she actually did learn something!)

Just a few ideas to help you confirm that the learning initiatives you offer actually translate successfully to the plant floor. And when you can defend your offerings with proof of longer retention and better on-the-job performance, you can take genuine pride in the training regimen you offer!

More on Tuesday – – – – –

— Bill Walton, Founder, ITC Learning
www.itclearning.com/blog/ (Tuesdays & Thursdays)
e-Mail: bwalton@itclearning.com