We all know that an evaluation survey won’t do much for you if you don’t do something with the data. We also know that simply asking some questions when an employee completes a training class isn’t enough to tell us whether that employee’s performance will improve, or if the business has been impacted.
But those poor little smile sheets…Level One reaction surveys…whatever you want to call them, they seem to get a lot of negative press, don’t they? Even the way we say it, in our condescending voices…those SMILE sheets…as if they are scribbled in crayon and covered in stickers. But they, too, are an important part of the evaluation process. We NEED to know what our attendees thought of the course. We NEED to capture what was on their minds while they were participating. We NEED to review that data and compare it to the evaluation data we collect further along in the process. Sweet, little, unassuming smile sheets are the gateway to evaluation success. So, we’d better be making the most of this learner touchpoint, don’tcha think?
1. Ask good questions to get good answers.
Don’t worry about the temperature of the room, especially when it’s beyond your control. If the room is too hot or too cold, your learners will let you know. The same goes for superficial items like food preferences. Honestly, do you need to know that one of your learners would have really preferred turkey instead of the ham sandwich he received? Didn’t think so.
Ask GOOD questions. Meaningful questions. Relevant questions. Here are some examples (and yes, I know these are really “statements”, not “questions”):
- I found the training course to be relevant for my role.
- I enjoyed the training.
- The instructor was knowledgeable about the subject matter.
- The content was delivered using an appropriate method.
- The timing was (too long, too short, just right).
- I will be able to apply what I have learned.
- The materials, handouts, etc. were helpful and easy to understand.
- The course was time well spent.
- I would recommend this course to others.
These questions are nothing new – I’m certainly not trying to reinvent the wheel. If you’re even remotely familiar with the tried-and-true Kirkpatrick Model, you know this well. It’s just a reminder for all of us (myself included) to take stock of the questions we’re asking in our Level One surveys. You’ve gotta ask good questions to get the information you’re seeking.
2. Make it easy.
A few years ago, I was sent to a training certification course with a couple of co-workers. I knew going into this brutal, poorly-facilitated, 5-day, intensive, homework-laden, lecture-heavy, 4-hour-essay-question-exam-at-the-end-that-you-had-to-get-a-90%-to-pass class (tell us how you really felt, Michelle) that it would be completely irrelevant for my role. “I know training,” I argued (diplomatically, of course) to my boss, “and I know that sending me to this course will not be a good use of money. Send so-and-so instead…he’ll get soooooo much more out of it than I will.”
Well, my plea didn’t work.
I had to go to the course, as the boss-lady wanted us all to have this certification (don’t get me started on what I think of the “one-size-fits-all” approach). On the upside, the course was held in a swank resort on the Atlantic in sunny Florida during the fall. And I had an oceanfront suite, where the sparkling sunrise woke me up each morning. A girl could do worse, I suppose.
The **one** thing I took away from this course was about gathering data in surveys. As long as I live, I will remember this statement:
Use a five-point scale, with a neutral midpoint.
There you go. The course, travel, hotel, and meals for that week cost more than a semester at a public university, and THAT is what I remember. But whatever. I have used that little nugget several times over the past 3 years, and it has significantly improved and simplified the surveys/evaluations I have created.
Don’t confuse your participants with long, arduous survey processes. Ask pointed, meaningful questions that can be answered quickly and easily. When those questions are to be rated, use a simple, five-point scale; something like:
Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree
Very Good | Good | Neutral (Average) | Poor | Very Poor
Give the respondent an opportunity to elaborate, if you like, but the standard answer should be something s/he can give without taking much time. Even while keeping it simple, you can certainly include a couple of open-ended questions to capture specific feedback from your participants. But take heed – the more thought and work that is required, the fewer responses you will get. The less data you will gather. The less you will discover.
3. Keep it consistent with your overall evaluation process.
This post is focused on Level One. The Smile Sheet. Immediate Reaction. Instant Gratification. If you want to be seen as a true Performance Consultant (and really, that’s what we should all be…or striving to be, eh?), then you need to make sure your evaluation process doesn’t stop at Level One. What are you doing for Levels 2 and beyond? The questions you ask at the different post-training touchpoints should blend seamlessly with one another. Your data capture methods should make sense (no redundancy) and remain simple for participants, so they give you the unbiased data you need to effectively evaluate and prove value to the organization. Don’t fret, dear ones, we’ll dig into the other levels on another day. For today, it’s about Level One. Le Gateway.
Since I know you’re wondering…that awful training course? By some miracle, I rocked a 94%. And I was kind, but honest, on my Smile Sheet about what I thought of Five. Solid. Days. of monotone lecture and the jargon-rich content. Bam!
Your turn: Tell me about your Level One evaluation process. What information do you gather from your training participants?