
Evaluating learning events and experiences

In higher education, you may be involved in developing a variety of learning events, including courses, subjects, lectures and tutorials. But how do you know that what you are doing is effective and achieving the desired results? We may evaluate learning events for a variety of purposes, such as course review, research or preparing evidence for continuing professional development.

The Kirkpatrick Model of Evaluation provides a framework to think about the evaluation of learning events and the types of evidence you might need to collect to measure that you are meeting your desired goals and outcomes. Ensure that the types of evidence produced by your chosen evaluation strategies allow you to shape future learning and teaching; evaluation is only as useful as the change it directs.


Why?

Evaluating the learning events and experiences that you design is a valuable way to collect evidence that you are meeting your aims and goals. The ways in which you evaluate help determine the level of evidence, and the assurance, you have that you are meeting your learning and teaching targets. For example, a survey of learners can provide some insight into how learners perceive an event or experience. However, learner surveys are limited in what they can tell you about what learners have actually learnt. Additional data, such as learner performance on an assessment task or learner application of the skills 6-12 months following a learning event, will therefore provide stronger evidence regarding its efficacy and impact.

 

How?

Typically, we may think of learning evaluation as gathering feedback from learners by survey at the end of a session, or examining how learners perform on assessment tasks. However, these are not the only ways to evaluate a learning activity or to obtain data about its success.

Kirkpatrick Evaluation Framework

The Kirkpatrick Model of Evaluation provides a framework through which you can begin to consider a variety of evidence to evaluate learning and teaching experiences. The model consists of four levels of evidence to evaluate:

 

Level 1: Reaction

This is the fundamental level and where most learning evaluation stops. This assesses learner perceptions of and feelings about a learning event.

Possible Strategies: Learner polls (e.g., Slido poll in Webex, or Zoom poll) or surveys, informal feedback (e.g., thumbs up/down, or confidence scales), and teacher, subject or course evaluations.

 

Level 2: Learning

The degree to which learners achieve the desired learning outcomes and demonstrate the required skills, knowledge and attitudes. Evaluation at this level may be achieved through formative and summative assessments conducted as part of a learning event.

Possible Strategies: Assessment tasks, informal content quizzes, and additional brain teasers, scenarios or case studies.

 

Level 3: Behaviour

The degree to which the learners demonstrate a change in behaviour following a learning experience. The purpose at this stage is to measure transfer of knowledge and application outside of the learning context.

Possible Strategies: On-the-job observations, follow-up interviews, participant focus groups, portfolios and reflective journals.

 

Level 4: Results

This looks at the overall success of a learning event. This level can include measures, such as return on expectations, return on investment and organisational or cultural change.

Possible Strategies: Alumni surveys, industry feedback and awards and recognition.

 

Using the Kirkpatrick Framework to Create an Evaluation Strategy

The Kirkpatrick Framework is a useful starting point when you are planning the design of learning events/experiences and an evaluation strategy. Considering the goals and aims of your learning event, try to plot, at each level of the Kirkpatrick Framework, the evidence that you might collect for evaluation purposes (see Praslova, 2010, pp. 222-223 for an example of this mapping).

 

Related information

The following resources link to specific UOW initiatives and processes in place to evaluate learning and teaching and to facilitate quality assurance in UOW courses and subjects:

 

References

Cahapay, M.B. (2021). Kirkpatrick model: Its limitations as used in higher education. International Journal of Assessment Tools in Education, 8(1), 135-144. https://doi.org/10.21449/ijate.856143

Mind Tools. (n.d.). Kirkpatrick's four-level training evaluation model: Analyzing learning effectiveness. Retrieved February 17, 2022, from https://www.mindtools.com/pages/article/kirkpatrick.htm

Praslova, L. (2010). Adaptation of Kirkpatrick's four level model of training criteria to assessment of learning outcomes and program evaluation in higher education. Educational Assessment, Evaluation and Accountability, 22, 215-225. https://doi.org/10.1007/s11092-010-9098-7

