When is “really good” really “good enough” for training?


Unless you’ve spent the past two decades hibernating in a cave, you’ve probably seen the movie Forrest Gump. Come on, even if you’re not a big fan of movies, you’ve still probably seen that movie. It has gone down in movie history as a classic: Forrest’s extraordinary life story, told by Forrest himself, in one of Tom Hanks’ Oscar-winning performances.

This movie brought in hundreds of millions of dollars at the box office and garnered numerous awards, including six Academy Awards. Not too shabby.

If you go to the IMDb page for this movie, you will see that there are literally hundreds of names listed for cast and crew…hundreds. So many people poured their eyes, ears, hands, hearts, and souls into the creation of this film, and guess what?

It’s not perfect.

The other day, Forrest Gump was on TV. Right in the middle of the scene where Forrest visits Jenny’s apartment (after he finishes telling his story to the people at the bus stop), a little goof caught my eye:

In one shot, the iron is up – in the next shot, the iron is down. Hmmm. So I was curious – was this the only mistake in the movie? It turns out there are entire websites dedicated to pointing out movie flaws and bloopers (these folks must have a lot of time on their hands). And guess what? There were quite a few factual errors and continuity issues like the iron. Again, it’s not perfect. But we still love that movie. No one took away the Oscars because of these flaws.

So, if a film that had a team of hundreds, one that ultimately went down as one of the greatest films of all time, has a few errors…why are we so hard on ourselves?

We live in a world of flaws. We work in organizations full of flaws. Yes, it’s our job to disseminate workflows, processes and procedures to enable employees to learn, develop and succeed. But it will never be perfect. Never. Furthermore, it’s likely that you don’t have hundreds of people on your team to scrutinize every detail. Many of us are part of a small team, or possibly even a “team of one.” We do the best we can with the resources we are provided.

Keep on keepin’ on, friends. 

Forrest Gump is complete. A done deal. There’s no assembling the production crew 20+ years later to “fix” that pesky iron scene. But our training-leadership development-onboarding-eLearning (etc.) projects? The good news is, so much of what we do allows for continuous quality improvement. As processes update, employee job requirements change, or we find a more effective way to facilitate learning, we can revise our work accordingly.

A few tips:

  1. Audit your courses regularly (a minimum of once per year) for accuracy and relevance. Do they still address the learning need? If not, determine what updates are necessary, or consider eliminating the program/course altogether (one way to keep tabs on this is sketched just after this list).
  2. Monitor your metrics – what data are you getting from participants and stakeholders that validates the content or approach?
  3. Don’t make changes to your program just for the sake of change – ensure that the change addresses learning needs, business drivers or other organizational goals.
  4. Keep your eye on the content – efficiency, relevance and accuracy should trump “pretty.” Sure, a beautifully designed course is ideal, but don’t lose sight of your higher-priority tasks and responsibilities in pursuit of perfection.
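
Tracking those review dates doesn’t need anything fancy. Here is a minimal sketch in Python, with a made-up course list and review dates (all names and dates are invented for illustration), that flags anything not audited in the past year:

    from datetime import date, timedelta

    # Hypothetical course catalog with the date each course was last reviewed.
    last_reviewed = {
        "New Hire Orientation": date(2016, 5, 1),
        "Leadership Essentials": date(2015, 2, 14),
        "eLearning: Safety Basics": date(2016, 1, 20),
    }

    # Flag anything that has not been audited in the past year.
    cutoff = date.today() - timedelta(days=365)
    for course, reviewed in sorted(last_reviewed.items()):
        if reviewed < cutoff:
            print(f"AUDIT DUE: {course} (last reviewed {reviewed})")

A spreadsheet works just as well – the point is to have a system that nags you, not your memory.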

Now, to quote Forrest himself, “That’s all I have to say about that.”

 

Your turn: How do you audit and review your programs to ensure they are accurate and relevant? Leave a comment below to share your own tips!

Attending the ATD International Conference & Expo (ICE) in Denver next month? I’d love to see you there!


Improving Training Programs with Feedback


As learning professionals (or whatever hat we might be wearing at any given moment), it is our responsibility to assess a learning need and provide a solution. And, tipping my cap to my passionate learning cohorts around the world, I’d say we do a fine job.

But, you know what? We don’t always have the answers. Or the perspective. Or even the right questions to ask. So we need to engage others.

This might be a pow-wow with an SME or project manager to learn more about a task, process, or system. It might be a meeting with a supervisor to better understand a team’s skill or knowledge gaps.

But what about the employees themselves? How often are we asking them what they want out of training? What they need? How we can help them become stronger employees today…and maybe-just-maybe, help prepare them for future opportunities?

The same goes for orientation and onboarding programs…consider doing a brief survey to poll your workforce, and see what you can learn about your new employee experience. A few questions might include:

  • When you started with (company name), what was the most helpful part of your onboarding experience?
  • What was your biggest challenge when you started in your role?
  • What advice would you give a new employee starting with (company name)?
  • What tools and resources are the biggest help to you?
  • Who was your go-to person when you were getting started in your role?
  • How can we improve the new employee experience at (company name)?

These simple questions can give you perspective to help you strengthen your process and program. You can use these questions as a foundation, and tweak or expand them based on the program – these examples focus on the new employee experience, but just imagine how a few strategic questions can help you evolve your other training initiatives, leadership development programs, employee transitions, and more.
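
If you collect those responses electronically, even a tiny script can help you spot patterns. Here is a minimal sketch in Python, assuming a hypothetical onboarding_survey.csv export with one column per question (the file name and column headers are placeholders for illustration, not any real system’s format), that tallies the most common answers to two of the questions above:

    import csv
    from collections import Counter

    # Hypothetical survey export: one row per respondent, one column per
    # question (the column names below are assumptions for illustration).
    with open("onboarding_survey.csv", newline="", encoding="utf-8") as f:
        responses = list(csv.DictReader(f))

    # Tally the most common answers to the "go-to person" and
    # "tools and resources" questions.
    for column in ("go_to_person", "helpful_tools"):
        counts = Counter(
            row[column].strip().lower()
            for row in responses
            if row.get(column, "").strip()  # skip blank answers
        )
        print(f"\nTop answers for '{column}':")
        for answer, count in counts.most_common(5):
            print(f"  {count:3d}  {answer}")

If the same go-to person or tool shows up again and again, that is a clue worth building into your formal onboarding program.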

The important thing is to stay curious, friends. We should continuously seek out feedback and suggestions from our various stakeholders, from the executives to the end users, and from all cubicles in between.

Your turn: How do you engage your organization beyond the standard needs analysis or evaluation process? What information have you gained from employees that has impacted your learning programs?

Like it?  Share it!

Are Smile Sheets Misunderstood?


We all know that an evaluation survey won’t do much for you if you don’t do something with the data.  We also know that simply asking some questions when an employee completes a training class isn’t enough to tell us whether that employee’s performance will improve, or if the business has been impacted.

But those poor little smile sheets…Level One reaction surveys…whatever you want to call them, they seem to get a lot of negative press, don’t they? Even the way we say it, in our condescending voices…those SMILE sheets…as if they are scribbled in crayon and covered in stickers. They, too, are an important part of the evaluation process. We NEED to know what participants thought of the course. We NEED to capture what was on our attendees’ minds while they were participating. We NEED to review that data and compare it to the evaluation data we collect further along in the process. Sweet, little, unassuming smile sheets are the gateway to evaluation success. So, we’d better be making the most of this learner touchpoint, don’tcha think?

So, how?

1.  Ask good questions to get good answers.

Don’t bother asking about the temperature of the room, especially when it’s beyond your control. If the room is too hot or too cold, they will let you know. The same goes for superficial items like food preferences. Honestly, do you need to know that one of your learners would have really preferred turkey instead of the ham sandwich he received? Didn’t think so.

Ask GOOD questions.  Meaningful questions.  Relevant questions.  Here are some examples (and yes, I know these are really “statements”, not “questions”):

  • I found the training course to be relevant for my role.
  • I enjoyed the training.
  • The instructor was knowledgeable about the subject matter.
  • The content was delivered in an appropriate method.
  • The timing was (too long, too short, just right).
  • I will be able to apply what I have learned.
  • The materials, handouts, etc. were helpful and easy to understand.
  • The course was time well spent.
  • I would recommend this course to others.

These questions are nothing new – I’m certainly not trying to reinvent the wheel.  If you’re even remotely familiar with the tried-and-true Kirkpatrick Model, you know this well.  It’s just a reminder for all of us (myself included) to take stock of the questions we’re asking in our Level One surveys.  You’ve gotta ask good questions to get the information you’re seeking.

2.  Make it easy. 

A few years ago, I was sent to a training certification course with a couple of co-workers.  I knew going into this brutal, poorly-facilitated, 5-day, intensive, homework-laden, lecture-heavy, 4-hour-essay-question-exam-at-the-end-that-you-had-to-get-a-90%-to-pass class (tell us how you really felt, Michelle) that it would be completely irrelevant for my role.  “I know training,” I argued (diplomatically, of course) to my boss, “and I know that sending me to this course will not be a good use of money. Send so-and-so instead…he’ll get soooooo much more out of it than I will.”

Well, my plea didn’t work.

I had to go to the course, as the boss-lady wanted us all to have this certification (don’t get me started on what I think of the “one-size-fits-all” approach).  On the upside, the course was held in a swank resort on the Atlantic in sunny Florida during the fall.  And I had an oceanfront suite, where the sparkling sunrise woke me up each morning. A girl could do worse, I suppose.

The ONE thing I took away from this course was about gathering data in surveys. As long as I live, I will remember this statement:

Use a five-point scale, with a neutral midpoint.

There you go.  The course, travel, hotel, and meals for that week cost more than a semester at a public university, and THAT is what I remember.  But whatever.  I have used that little nugget several times over the past 3 years, and it has significantly improved and simplified the surveys/evaluations I have created.

Don’t confuse your participants with long, arduous survey processes.  Ask pointed, meaningful questions that can be answered quickly and easily.  When those questions are to be rated, use a simple, five-point scale; something like:

Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree

or

Very Good | Good | Neutral (Average) | Poor | Very Poor

Give the respondent an opportunity to elaborate if you like, but the standard answer should be something he or she can give without taking much time. Even while keeping it simple, you can certainly include a couple of open-ended questions to capture specific feedback from your participants. But take heed – the more thought and work that is required, the fewer responses you will get. The less data you will gather. The less you will discover.
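
And that numeric scale pays off when it is time to crunch the results. Here is a minimal sketch in Python (the responses are made up purely for illustration) that maps each rating to a number, with Neutral as the midpoint, and averages the results for two of the statements from earlier:

    from statistics import mean

    # Map the five-point scale to numbers, with Neutral as the midpoint.
    SCALE = {
        "Strongly Agree": 5,
        "Agree": 4,
        "Neutral": 3,
        "Disagree": 2,
        "Strongly Disagree": 1,
    }

    # Made-up responses for two of the Level One statements, for illustration.
    responses = {
        "I found the training course to be relevant for my role.":
            ["Strongly Agree", "Agree", "Agree", "Neutral"],
        "I will be able to apply what I have learned.":
            ["Agree", "Agree", "Disagree", "Strongly Agree"],
    }

    for statement, ratings in responses.items():
        average = mean(SCALE[rating] for rating in ratings)
        print(f"{average:.1f}  {statement}")

Anything drifting toward (or below) that neutral midpoint of 3 is a flag worth a closer look.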

3. Keep it consistent with your overall evaluation process.

This post is focused on Level One. The Smile Sheet. Immediate Reaction. Instant Gratification. If you want to be seen as a true Performance Consultant (and really, that’s what we should all be…or striving to be, eh?), then you need to make sure your evaluation process doesn’t stop at Level One. What are you doing for Level Two and beyond? The questions you ask at the different post-training touchpoints should blend seamlessly with one another, and your data-capture methods should make sense (no redundancy) and stay simple for participants, so you get the unbiased data you need to effectively evaluate and prove value to the organization. Don’t fret, dear ones, we’ll dig into the other levels another day. For today, it’s about Level One. Le Gateway.
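
As a small preview of what that blending can look like, here is a hedged sketch in Python comparing each course’s average Level One (reaction) score with a hypothetical Level Three (on-the-job application) rating gathered weeks later – every course name and number below is invented for illustration:

    # Hypothetical per-course averages on the same 1-5 scale:
    # Level One collected at class end, Level Three collected weeks later.
    level_one = {
        "Onboarding 101": 4.6,
        "Systems Training": 4.4,
        "Safety Refresher": 3.1,
    }
    level_three = {
        "Onboarding 101": 4.2,
        "Systems Training": 2.8,
        "Safety Refresher": 3.0,
    }

    for course in level_one:
        gap = level_one[course] - level_three[course]
        note = "  <-- loved in class, not sticking on the job" if gap >= 1.0 else ""
        print(f"{course}: L1={level_one[course]:.1f}, L3={level_three[course]:.1f}{note}")

A big gap between the two is exactly the kind of insight a smile sheet alone can never give you.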

Since I know you’re wondering…that awful training course?  By some miracle, I rocked a 94%.  And I was kind, but honest, on my Smile Sheet about what I thought of Five. Solid. Days. of monotone lecture and the jargon-rich content.  Bam!

Your turn: Tell me about your Level One evaluation process.  What information do you gather from your training participants?

Like it? Share it!