Monday, March 26, 2012

Post Session Reflection Part 3 - Evaluation

It's all about feedback. I created my evaluation form using the free version of SurveyMonkey and shared it with the group after the session. There have been a number of responses to date, and I will share some of the feedback later in this post.

A considerable amount of time was spent deciding on the question types to be included in my evaluation form. If you need literature on this, there is plenty out there; here is a link to a number of useful resources - The Evaluation and Transfer of Learning. As you peruse the evaluation - Guest Speaker session 14 March 2012 - most of you will notice that the questions posed aimed to evaluate the following:
  • Reaction 
  • Learning 
  • Behaviour 
  • Results 
These four elements should not be new to you, as they are the four levels of Kirkpatrick's Four-Level Evaluation Model (1959). Kirkpatrick has a god-like status in this field, but I was interested to see whether there were other evaluation models that could be used. I typed "Alternatives to Kirkpatrick's evaluation model" into Google and was presented with a host of alternatives, but I found the following quite interesting.



Stufflebeam's (1993) CIPP model seems to provide a systematic way of looking at the elements of the curriculum development process. How do you use the model? It requires that you ask questions in your evaluation about each of the elements that make up the model:
  • Context 
  • Input 
  • Process 
  • Product 
I took a look at some of the questions that were suggested, and these were nothing new:
  • Is the time adequate? 
  • Do the objectives derive from aims? 
  • How well/actively do students participate? 
This model could easily be adapted to a blended or online course. If you are interested in understanding how to use this model, take a look at the following link:


What else did I consider whilst creating the evaluation form? The questions needed to be easy to complete, and I kept my audience in mind whilst writing them. The overall layout was also important, so I decided to cluster similar questions, which would save space and time for my audience.


My experience to date has told me that end-of-session questionnaires provide you with valuable information on the event itself. They record the participants' perceptions at one moment in time. It is all about outcomes, which I did not share with the group during this session but have noted as an element to be shared in the final session this week. It is also important not to consider outcomes only in terms of knowledge gains; there are several other outcomes that could be fostered, and here is a list of them:


  • Expanded Understanding 
  • Increased insights into what was already known 
  • Clarified things that had been learned 
  • Refocused attention to a topic
  • Challenged thinking 
  • Stimulated interest to learn more
  • Provided ammunition to use in an argument 
  • Stimulated new thinking 
When I glance over these suggested outcomes, I can certainly see where my previous session fulfilled many of them. The final post will look at the topic of engagement again, and I will consider how other frameworks could be used to support engagement in blended and online education, in my previous guest speaker sessions, and in my final guest speaker session.

Resources: 
  • Kirkpatrick, D. L. (1959). Techniques for evaluating training programs. pp. 21–26.
  • Stufflebeam, D. (1983/1993). The CIPP model for program evaluation. In G. Madaus, M. Scriven, & D. Stufflebeam, (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (pp. 117-141). Boston: Kluwer-Nijhoff Publishing.
