Training Evaluation – a waste of time, or is it?
It seems appropriate to celebrate and reflect on the value of training this year, the 50th anniversary of the introduction of Donald Kirkpatrick’s model of training evaluation.
This article explores what evaluation is, why we should evaluate training, the roles different stakeholders play and the challenges faced. It concludes with a checklist of practical ideas that can be put into place within your organisation.
When you Google "training evaluation", you will receive over 67 million results, which shows the vast amount of knowledge available to us.
This article doesn't profess to have all the answers or any secret recipe for success in training evaluation, but it aims to reinforce and summarise some of the key themes of debate and thinking, and to share some practical tips.
What is Evaluation?
The evaluation of training is essentially finding out whether the training delivered, be it a formal workshop or an informal approach such as coaching, was worthwhile, and whether it was of value to the individuals involved or to the organisation as a whole.
A broader perspective can incorporate learners' colleagues, managers and senior managers, and the resulting direct or indirect impact on their customers. Those stakeholders need to be clear about the purpose of the training, and here trainers are reminded of the importance of their role in setting robust, SMART learning objectives for their learners, so that there is a benchmark for measures of success and broader metrics in place to evaluate against.
Evaluation has no specific timeframe and will have a different relevance for different stakeholders: the learner may evaluate during the sessions; the trainer may consider the happy sheets and their own performance; the manager may look at changes in sales; the HR team may later consider attrition rates and the training's impact on motivation.
The question of why it is so important to evaluate training isn't difficult to answer in the midst of a financial crisis: with demands on people's time, fewer people doing more work and budgets cut, the ability to justify costs and benefits is even more crucial.
In a business climate which has undergone such a fundamental shift in the past 18 months, with resources (people, time and budgets) at a premium, this question almost answers itself. Ultimately, has the organisation had a return on its investment in the training intervention, or could the investment have been better spent on something more worthwhile? As trainers, however, we have to consider at what levels to evaluate which interventions, the depth of our analysis for specific training, and who in the organisation should be involved.
For example, do we really want to spend valuable time and resources evaluating an English or IT programme when time could be better spent evaluating the worth of a new Management Development or Leadership programme, or delivering another workshop, which is what our internal customers perceive as our role as trainers?
As trainers with integrity, justifying our role and presence is something we should at least try to do in some shape or form. Working with clients at the PeopleFirst consultancy, where we want to nurture relationships and also provide best-practice consultancy, training evaluation is a 'must', but that brings us to some of the challenges.
Clients and non-HR professionals don't always realise the importance of post-training activities such as evaluation, practising and applying the new skills and knowledge, further development coaching, conversations with managers, performance management, embedding of learning and changing behaviours.
What methods can we use?
The Mind Gym produced a useful summary table in 2004 outlining the different models of the past 45 years (View PDF), which is a good starting point for investigating the history and the different models.
A personal favourite, and probably one of the most popular models referenced by trainers, is Kirkpatrick's four levels. It gives a clear structure and approach which can be adapted to the specific requirements of the intervention and the data available. The Kirkpatrick model has had a fresh look this year, and you can find out more from the whitepaper written by Jim Kirkpatrick, PhD and Wendy Kayser Kirkpatrick.
As trainers we must think systematically about learning and training, following a cycle of identifying learning needs based upon the business needs; designing and developing an appropriate solution; delivering it to meet the skills, knowledge or attitude gap; and finally evaluating whether the training has been worthwhile from the different perspectives of the organisation, and whether a change in behaviour has occurred which impacts on the individual's and the organisation's performance.
In my experience, the most difficult of all these elements is evaluation, and although it appears to come at the end of the cycle, it is something we have to consider first if we are to have any useful data at the end of the process.
We are also reliant upon working relationships with our colleagues and internal customers in other areas of the business to provide data, time and effort. This responsibility falls particularly on supervisors and managers, who are best placed to observe whether changes in behaviour have occurred as a result of the training provided.
Our challenges lie in access to, and the amount of, relevant data in the organisation before and after the training event, and in driving learners and managers to collect data when, in their view, the learning experience may have finished. It is time consuming to decide what information is going to be useful and may possibly be available, to find it, and then to put mechanisms in place to collect it if they are not already there. We have to decide where our time is best spent. The struggle comes when our stakeholders and clients aren't themselves interested or don't see the value: do we continue?
I believe we have to lead these organisations and start to help our clients take their learning seriously, whilst remembering that what's valuable to one organisation, client or individual isn't necessarily of the same value to another.
We know our happy sheets are not enough. It's admirable that we want to focus on data collection and demonstrating worth, but a reality check is needed about what is feasible, balancing the needs of the business and our own time and energy against truly demonstrating how valuable the training has been to the organisation and the individuals involved, and, in times of continuous change, how that change has come about.
In these times I take a pragmatic line: pushing for data, encouraging clients and individuals, and taking a collaborative approach.
Therefore we need to educate managers and individuals to better understand the learning process and the complexities of learning, so that it isn't solely taking place and being measured in the classroom, and to encourage them to take responsibility for their learning and development with a long-term, on-the-job approach, keeping ongoing records and sharing stories so that data is captured throughout the learning process.
In conclusion, evaluation isn't a waste of time. However, aside from having clarity about the objective of any training event from the outset, there also needs to be clarity about if and why the training is being evaluated, the depth of the evaluation and the data required, with a view to what data is already in place and what needs to be created, and who should be involved in the evaluation process.
Our role and accountability is to drive the evaluation process across the organisation, working with and communicating to all involved at each stage, and creating and using tools that are simple and need the minimum of resources in time, people and money, but have maximum effect in demonstrating the enormous value that we as trainers know we bring in driving individuals' capability, overall business performance, and the organisation's capacity to adapt and change. It's definitely not an easy role, but one in which no doubt we will make more progress over the next 50 years.
Here are a few practical tips using Kirkpatrick's four levels to help you get started on training evaluation:
Level 1 Reaction
- Don’t assume that the completed happy sheets will have all the answers. Talk to learners during and after the workshop to find out their reaction to the training.
- Alternate between anonymous and named happy sheets depending on the training intervention and what you are trying to achieve and find out from the evaluation process.
- When piloting training, allow at least 30 minutes at the end of the workshop to discuss the delegates' reaction to the training, so amendments can be made before going live.
Level 2 Learning
- Use formal assessments and fun quizzes to measure whether learners have achieved the necessary knowledge, skills and attitudes from the workshop.
- Ask participants to rate themselves from 1 to 10 overall or on different areas of knowledge, skills and attitudes at the beginning and then at the end of the learning event.
- Recognise achievements throughout the learning event by congratulating and encouraging learners and reinforcing learning.
Level 3 Behaviour
- Encourage managers to discuss any learning activity before and after the event, to ensure everyone's time away from their job is spent valuably and as much learning as possible takes place.
- Collect quantitative or anecdotal feedback from managers and colleagues about how a participant's performance has improved and been impacted by the training intervention.
- Educate managers to understand that they have a key role in developing their people, and check it's on their job descriptions.
Level 4 Results
- Use metrics that are already in place in the organisation, or try to influence senior managers to create them, rather than working in isolation as an HR or training function.
- It sounds simple, but the organisation and the stakeholders represented in the learning event need to be clear about the intended objectives in order to measure success.
- Ask for help from the finance team, who are the experts in the organisation.
Please feel free to email firstname.lastname@example.org if you wish to share any thoughts or practical experiences about evaluating training.