I feel as if I’ve been a bit late with this week’s assignment, but part of me saw that it was about assessment, and I’d be lying if I said I was happy about it.
I know that assessment and feedback are critical in our field, but in my experience in the corporate world, assessment, analysis and evaluation are poor, distant cousins to the development and implementation of learning programs.
Reflecting on my past work with clients, I recall the conversations about whether content would need to be assessed to ensure the learner could do the role. In the majority of cases, despite good intentions of including an assessment, or an on-the-job observation and practical assessment to verify the learner can perform the skill or task, assessment is the first thing that gets stripped out of the program, whether to reduce its duration or because of uncertainty over whether team leaders will actually carry out the assessment on the job.
There was always discussion around terminology too. For some reason, the words “assessment”, “test” and “learning check” are considered taboo. I’ve lost count of the number of times clients have told me their team members have an aversion to being ‘tested’ or ‘assessed’ because it carries a negative connotation. What happens if they ‘fail’? Will they need to be counselled? Will they require extra observation?
So it gets left out.
I recall working on a <Role> Accreditation Program some years ago. It was a 12-week, modularised, self-paced program with a Learner Guide and a Team Leader Guide. Each week, the learner in the new role would undertake a blend of formal education (10%), coaching (20%) and experiential learning activities on the job (70%).
I enjoyed developing this program, but using the word “Accreditation” in its title didn’t sit well with me because the learner wasn’t receiving any formal recognition for the training. I argued against the word, but the client didn’t want the other titles I offered because they reminded her of programs her predecessors had implemented in the department. She wanted her program to stand out as her initiative, and it had to be different from the others. That difference was the inclusion of weekly assessments in which the learner demonstrated to a team leader what they had learned, under the conditions and standards of the work environment, before moving on to the next module.
With the support of subject matter experts, I created real-life scenario assessments and built a 12-week program that ensured anyone new to the role would be proficient to a basic standard in the knowledge and application of the processes, systems and policies relevant to their role – all on the job, all with the support of the team leader.
She stuck to her guns on the assessments despite protestations from other members of her team, and I finished the program and was just about ready to hand it over to the pilot group, who were eagerly awaiting it. Unfortunately, it wasn’t to be. A restructure resulted in her team being broken up, with people moving on or out. The project was shelved – but in the meantime, the pilot group was still interested in what I had developed because the training need was still there. Business must go on.
So I handed over the program, and they took it with glee and rolled it out to people new to their roles and to their team leaders. I’m not sure whether they do the assessments, but if it gave them a level of comfort because it met their training needs, then I’m happy for them.
Assessment is a completely different matter in my experience of online compliance courseware. When I worked with clients to develop their online courseware, as soon as the conversation turned to ‘pass marks’ they immediately jumped to ‘100%’, because the organisation must show compliance with the topic. It’s not about understanding whether the person has learned the content and how to apply it; it’s really an audit showing that everyone is aware of their roles and responsibilities for compliance with policy.
It goes against all that is good in learning – and I don’t agree with it.
If you saw me two weeks ago trying to complete my annual mandatory online compliance training on our LMS, you would have understood.
Picture the scene. This is how I complete online assessments…
- Curse when I get the email stating, “You are now required to complete your Mandatory Online Compliance Course XYZ by COB Friday 30 May”
- Register for the XYZ course
- View how it is structured – Glance through the module titles – Yawn.
- Debate whether I should read the modules first and then do the assessment. Split-second decision: no – go straight to the assessment
- Enter assessment
- Curse again because it’s a 100% pass mark (which brings up memories of the discussions above) – meaning the test is DEAD EASY because they want everyone to pass (and if you’re really lucky, the course designers haven’t randomised the assessment)
- Do the assessment, taking written notes of my answers
- Check the final assessment score. If I didn’t get 100%, repeat the process, ad nauseam.
- As soon as I get 100%, exclaim with joy that I don’t have to do this again for another year!
What did I learn?
I don’t know – but I do know that if I have any doubts, I can find the information I need in the policy section of the intranet, or speak to my manager when I need it.
Reading through this week’s required readings, in particular Effective Assessment in a Digital Age (JISC, 2010), much of the content didn’t seem to apply to my past experience developing training programs in the corporate context. However, it did give me some ideas for enhancing the feedback experience, and for peer and self-assessment, which will undoubtedly play a greater role in organisations that value the 20% and 70%.