Patty Conneen

Trainer

Evaluating Evaluations

The evaluation currently used in my company needs to be revamped. It has 12 questions covering areas such as understanding of the course objectives, whether the information was clearly presented, whether there was enough time for the training, whether the trainer was helpful, whether the information can be applied to the job, whether the training was beneficial, and how much your skills, knowledge and attitude have changed as a result of the training. The answers are given on rating scales, but the scale differs from question to question (strongly agree / agree / disagree / strongly disagree; exceeded expectations / met expectations / below expectations; very significant / significant / insignificant / very insignificant).

We feel that it is too long, that the questions repeat themselves (only using different words), and that we are not getting enough value from the range of answers. What do you feel are the most important questions to ask on an evaluation so that L&D can measure the significance of the training and training material, and the performance of the trainer? Are there other questions that need to be asked and evaluated right after the training is completed?

6 Responses

  1. Evaluate Learning

    I am assuming that your evaluation is some form of 'happy sheet', i.e. an evaluation completed following a learning event of some kind. If that's so, I would recommend that you concentrate on evaluating the learning, not the training. So, how much learning has occurred at the event? You should measure how much the learners know about the subject before and after the event to see if things have moved forward (a sketch of this pre/post comparison follows this reply).

    In terms of things like the trainer's performance, the questions should be set around how the trainer's style has enabled learning to happen. Questions about the learning material should be about how the material has supported learning – that's what it is there for.

    I'm not sure you should be measuring how much the training has helped people in their jobs at this stage, as I'm not sure they can truly say until they have applied the new knowledge – this should come later. Similarly, a change in attitude can only be evaluated once new knowledge has been applied – you should do this as a further stage in your evaluation.

    Hope that helps.
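
    A minimal sketch of the pre/post comparison described above, assuming a short quiz is scored before and after the event (the scores, the maximum mark and the normalised-gain formula are illustrative, not from this thread):

    ```python
    # Hypothetical pre/post learning check: compare quiz scores taken before
    # and after a learning event to estimate how much learning occurred.

    def learning_gain(pre: float, post: float, max_score: float) -> float:
        """Normalised gain: the fraction of available headroom actually gained."""
        headroom = max_score - pre
        if headroom <= 0:
            return 0.0  # learner was already at ceiling before the event
        return (post - pre) / headroom

    # Example: three learners, each quizzed out of 20 before and after.
    scores = [(8, 15), (12, 18), (17, 17)]
    for pre, post in scores:
        print(f"pre={pre:2d} post={post:2d} gain={learning_gain(pre, post, 20):.0%}")
    ```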

  2. Evaluation form

    You address some common problems with 'course evaluation forms' –

    1. Questions ask for information that seems to have limited usefulness

    2. Rating scales varied/inconsistent

    3. Survey very long & repetitive: respondents lose the will to finish

    Result = low value survey.

    The only advice is really the tough advice: decide what is important for you, your boss and your managers to know about the training. In other words, who needs to know what?

    If no one cares about training materials, don't ask about training materials. If line managers want to know whether the training provided any action plan or strategies for improving job performance, then ask those questions. If you want to know about trainer competence, then ask about the quality of the training, and so on.

    Regarding survey development, I have a downloadable job aid with some practical tips and advice at the link below.

    best wishes

    John

    http://bit.ly/1bnsXMP (click Survey Tips)

  3. I think that Clive raises a

    I think that Clive raises a valid point – the effectiveness of the training can only be gauged when it is put into action.

    If you are looking to evaluate trainer effectiveness, this cannot be done in the room with a happy sheet and an enthusiastic trainer standing over everyone. These forms are often completed as the last task of the session, when people are eager to leave, and they do not provide valid feedback.

    Personally I would send out surveys 14-28 days after the course and get them completed anonymously. Anonymity gives the person leaving feedback reassurance that they can be truly honest, and the delay gives them some time to evaluate their learning in an applied context.

  4. I would go further than Clive

    I would go further than Clive Boorman and A P Chester, and suggest you consider abandoning your "evaluation sheets" altogether.

    I don't think there are any questions that "have to be asked … right after the training has been completed".

    Clarify your aims: what do you want to know about, and decide from that what you will measure and how – not least who you are going to ask. If you want to know what learners think about training, interviews or focus groups with a small sample will probably give you the answers you need. You shouldn't go about pissing off large populations of learners by carpet-bombing them with reactionnaires or 'happy sheets' either immediately after training or even after a respectful interval. How much do you ever use this data anyway?

    The focus of your evaluation should be shifted to what performance improvements you gain, and what business results training/learning contributes to. Evaluation should be primarily about measuring impact of learning, not learner satisfaction – far too much of our effort goes into the latter and far too little into the former.

    Have a look at the free papers at http://www.airthrey.com/papers.htm – all previously published in TrainingZone.

  5. Evaluation

    The evaluations I collect and collate tell the stakeholders whether the outcomes that were set for the training are being met. If not, they also indicate what needs to happen to get the change we are seeking.

    I use Kirkpatrick (all five levels), and we spend as much time on the outcome, evaluation and follow-up design as we do designing the materials/content and methodology.

    The additional level of triangulation we add is to involve line managers and mentors, where used, to agree or disagree on whether the outcomes are being achieved.

    This approach has produced the best feedback I have ever received and helped to secure significant funding for major programmes on an ongoing basis by showing the outcomes in action.

  6. Training evaluation

    I think evaluation needs to include the knowledge/skill improvement resulting from the course. For whatever course I am running, I design a list of the competencies being developed. The delegates self-assess against the competencies at the start and end of the course. The initial value of the course is measured as the percentage increase in self-assessed knowledge/skill (I have never experienced a percentage decrease!) – a worked example follows this reply.

    See some samples of the knowledge/skill checklists at our free trainer resources pages: http://www.abctrainingsolutions.biz/evaluation_skills_free_download.html

    Hope that helps.

    Bryan
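
    A short sketch of the percentage-increase calculation described above, assuming delegates rate each competency on a 1–5 scale at the start and end of the course (the competency names, scale and ratings are illustrative, not from this thread):

    ```python
    # Hypothetical competency self-assessment: a delegate rates each competency
    # (1-5) at the start and end of the course; the course's initial value is
    # reported as the percentage increase in the total self-rating.

    start = {"questioning": 2, "listening": 3, "giving feedback": 2}
    end = {"questioning": 4, "listening": 4, "giving feedback": 3}

    before = sum(start.values())
    after = sum(end.values())
    increase = (after - before) / before * 100

    print(f"Self-assessed knowledge/skill increase: {increase:.0f}%")  # -> 57%
    ```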
