Fostering reflective practice via an authentic multimodal assessment

Mrs. Brooke Russell | Science, Medicine and Health
Brooke designed an authentic, multifaceted assessment for EXSC200. It comprises an initial multimedia submission of a telehealth consultation, a self review, a peer review, and a final reflection. The self review and the feedback received from multiple peers formed distinct, valuable data points that allowed for a richer final reflection. This combination of elements in the assessment design allowed Brooke to assess whether students were prepared for professional placement, which in turn strengthened adherence to the requirements of Exercise and Sports Science Australia (ESSA).
Brooke Russell: My name is Brooke Russell. I currently teach clinical exercise practicum and musculoskeletal clinical exercise physiology, and I previously taught Exercise 200, which is professional practice.
In Exercise 200, we need to show that we are appropriately preparing students for professional practice, so we needed an authentic assessment design that met multiple subject learning outcomes, but was also able to meet accreditation requirements.
This multifaceted assessment task provided students with an opportunity to participate in work-like practices (Embedded WIL) and demonstrate essential professional skills, such as effective delivery of telehealth, professional communication, a clear understanding of the scope of practice, and the ability to provide and reflect on constructive feedback.
Through multiple iterations, Brooke has been able to streamline the assessment and find greater efficiencies in marking and the provision of feedback to students. In the most recent iteration, Brooke used the Peer Review tool in FeedbackFruits to implement this assessment task.
Designing the assessment
Brooke took a holistic approach to the design of this assessment, not only ensuring constructive alignment to the learning outcomes of the recently created subject and to ESSA standards, but also assessing student readiness to attend their first placement.
To achieve this, Brooke came up with a multifaceted assessment consisting of four parts:
1. Video recording of a client interview
As exercise scientists, students need to be proficient in effectively using telehealth, demonstrating not only appropriate use of technology and professional communication skills, but also obtaining client consent and maintaining confidentiality. These skills have become even more important with the increase in telepractice after COVID-19.
In the first part of this assessment, students are asked to role-play an initial telehealth consultation and are assessed on key professional skills, such as effective communication and scope of practice.
2. Self review
Another critical aspect of professional practice is the ability to reflect on the quality of one’s own work. Rubrics can be a powerful tool, allowing students not only to check their own work, but also to consider how to improve its quality (Andrade & Du, 2005).
In this step of the assessment, students are asked to assess their interviews using the rubrics they will be marked against.
3. Peer review
Being able to provide specific and actionable feedback to peers is also a skill required of graduates in the Bachelor of Exercise Science. Research on student feedback literacy also suggests that students consider peer feedback a means of personal development, and that providing and receiving feedback is key in helping them learn (Karal & Sarıalioğlu, 2022). As per the UDL Guidelines 3.0, through collaboration, interdependence, and collective learning, students “create partnerships that can push and extend each other’s thinking and practice care for one another” (CAST, 2024).
In the third part of the assessment, students use the same rubrics they have used in the self review to review their peers, providing specific feedback for each criterion. This allows students to practise providing appropriate and constructive feedback to peers, which is both a key learning outcome and an important ESSA standard.
4. Reflection
Being able to constantly reflect on their practice and thoughtfully consider the feedback received is essential for students to make the most of their professional placement. Reflective practice also allows the practitioner to produce an actionable outcome, leading to personal and professional development, as well as better quality service and improved client outcomes (Galutira, 2018).
In the final part of the assessment, students write a reflection considering their initial self review and the feedback provided by their peers. They use Borton’s model of reflection, which proposes three questions: “What?”, “So what?”, and “Now what?”, encouraging them to evaluate what has happened and to consider a way forward should the event happen again (Sharp et al., 2024).
Implementing the assessment
Brooke carefully designed the four stages of this assessment, creating a staggered approach to feedback and thoughtfully scaffolding each step with a learner-centred approach. In this part of the showcase, Brooke discusses the rationale behind her assessment design decisions and the learning activities her students engaged with in preparation for this assessment.
A. Video recording of a client interview
Brooke used a blended learning approach to scaffold the knowledge and skills students needed for the client interview. This approach combined asynchronous theoretical resources with synchronous practical application to support student learning.
A1. Asynchronous theoretical resources
Firstly, students asynchronously engaged with online resources to cover key communication skills, including:
- the differences between open-ended and closed questions
- the role of paraphrasing and summarising in ensuring person-centred care
- clarification techniques and their importance for accuracy and completeness of information
A2. Synchronous practical application
Students participated in workshops and practical sessions where they applied these communication skills and practised using the necessary technology for the client interview. They also engaged with a pre-exercise screening tool to support risk stratification and ensure they worked within their scope of practice.
B. Peer Review
Understanding that providing feedback to peers is a skilled process and that attention is required “to what students should do in the role of a feedback provider” (Little et al., 2025, p.1), Brooke created time and space in her workshops to help students use their evaluative judgement to provide feedback to their peers. She designed a series of learning activities in alignment with Little and colleagues’ classification table on what is required for students to provide effective feedback. Using their evaluative judgement and the reflective skills required for better feedback provision also empowered students to craft and deliver a suitable feedback message (Little et al., 2025).
According to Tai et al. (2016), the ability to critically evaluate the work of others in appropriate ways is a key aspect of lifelong learning, and students need clear guidance to understand the notions of quality and standards of practice to be able to provide effective peer feedback. In the video below, Brooke discusses the activities she used to develop students’ feedback literacy and shares the positive student feedback she has received.
Brooke Russell: In order to prepare the students for the peer review portion of the assessment, we developed a workshop where the students learned how to use the marking rubric. They were provided with information on each of the criteria so that they understood what those criteria were assessing. That was really effective in setting the expectations for what they actually needed to submit themselves, as well as what they needed to peer review.
And then we did a workshop where they were required to assess two different videos that we had put up on Moodle. One was an example of an initial consultation that was not so great, and the other met or exceeded expectations, but they weren't told which one was which; and they had to use the marking rubric to actually assess those, as they would be doing in the peer review in the assessment task.
So once they had learned how to use the marking rubric, we ran a separate workshop about providing effective feedback, because as part of the FeedbackFruits Peer Review tool, they had to provide a comment on each of the criteria to justify why they were giving the marks that they were giving.
But in order to make sure that that feedback was able to be used by the student who received it, and was relevant to the assessment task, we had to show them how to do that. So we taught them how to provide effective feedback, and that was really important, because when they go out on placement, they will need to be able to reflect on feedback that's given to them by supervisors.
The feedback that we received from the students was that they felt more involved in the process of receiving their marks, so they were more invested in the process, because they knew exactly what to expect from being marked by their peers, given they were also marking their peers. So I feel like the process of teaching them how to use that rubric definitely strengthened the assessment task, because the students were more invested and they understood exactly what they were supposed to be doing.
Brooke used the Peer Review tool in FeedbackFruits to implement this authentic assessment, and she designed a series of learning activities to prepare her students to use the tool effectively. These activities were delivered by the demonstrators in practical classes, and involved:
- A PowerPoint presentation – with step-by-step guidance on how to complete the task.
- A practice submission – an opportunity for students to practise downloading appropriate YouTube videos and completing the entire submission process.
- A practice peer review – allowing students to practise providing feedback on videos with demonstrators’ support, ensuring their feedback was appropriate and constructive.
Impact and reflections
In the video below, Brooke reflects on her experience designing this assessment and implementing the Peer Review tool in FeedbackFruits:
Brooke Russell: It was quite easy to go back after they had submitted to look at the analytics, so that as a marker, if there were any anomalies in the peer reviews, say a student who had one really low mark and two really high marks, you could go in there as the instructor and moderate that really easily. So I felt like that was a real positive of using FeedbackFruits.
The other thing that was very beneficial, when you compare it to, say, the Workshop tool, was that there wasn't that need to wait for the whole cohort to submit before the peer reviews could happen.
With the Workshop tool, we needed to wait till everyone had submitted before we could allocate those peer reviews to all of the different students. Whereas this was an automated system that just did it as the students submitted, so there was no waiting for latecomers for the rest of the cohort to progress.
So I think that was one of the major benefits of the FeedbackFruits tool, and also just the ease with which students were able to use it. Only limited instructions needed to be given; the steps were quite clear and simple to follow.
And then there was making sure that the people marking the reflection portion of the assessment task, where the students had to go and reflect on their feedback, were consistent; it was really important to get consistency within the markers. So what we did as a team is, we sat down and marked a few of the submissions and then workshopped our answers to see whether there were any common themes coming out of the reflections; and then we came up with a few quick comments that could be stored in FeedbackFruits, so that you could access those and use them over again to provide a consistent message with feedback. So that was really important, and it was really effective to be able to do that within the tool, to get that consistency across.
In the first iteration, they had to reflect on their own submission before they received feedback. Whereas this time, because the tool allowed them to do a self assessment, they didn't need to do that reflective piece, and so that reduced the marking burden, which was great. But it also helped them see, in the same rubric, how they were rating themselves against their peers. So if that was vastly different, they were able to reflect on their performance and that feedback in their final reflection. So they only did the final reflection this time because of that self assessment.
In terms of looking at the students' submissions and the quality of their work, in this second iteration, because there weren't those challenges with uploading and submitting, the students were able to spend time submitting quality work. They knew how to do it, so that process wasn't daunting for them, and the expectations were set because of the rubric and the instructions that we gave around that. So I feel like the submissions were broadly of a higher quality, and I think that's because they actually really knew what they were supposed to be doing, because they had been stepped through it. I guess, as a teacher, what I realised was really important was that preparation.
Not only are you developing skills in providing and receiving feedback and being able to review somebody else's work, but it's actually following those processes and making sure that what you're providing in your feedback and your peer review is appropriate and going to be able to be used when the other student reflects on it. And then how to receive feedback and reflect on that. So they're developing tools that they're going to be using in their professional practice. Whereas before, when we used other tools or when we first ran the assessment task, I'm not sure that that link was really there, or that students knew about that link.
The only other thing that I would change moving forward relates to the submission format. The tool allows the students to submit an actual video file or a shared link; I think I would set the tool to only allow the video submission, because the only issue that we had was with students, and there was only a handful of them, who shared a link but didn't change the access settings, so then we had problems with students being able to access videos. So I would not allow students to provide a link.
But, in saying that, the chat function in FeedbackFruits, which is right there in the tool, was really effective; I would just send a message off to FeedbackFruits and within minutes you'd have a reply and they would help you sort things out. That was the case across the board: if there were any files missing or anything happened, the FeedbackFruits team was really fast and responsive to anything that I had a problem with.
Students’ perspectives
"Some of the things I liked about Feedback Fruits is that it was much more straight forward to use compared to submitting a video through Moodle or YouTube, for example. Also having the option to link a comment to a specific time stamp on the video was very helpful, both in giving and receiving feedback, it made it much easier to understand the comments given/received."
- Student feedback
Advice for colleagues
Brooke shares her advice for colleagues considering a similar approach to assessment redesign:
Brooke Russell: Get someone external to the immediate course or subject to read through your instruction documentation, because I found sometimes that what I thought was really clear in an instruction sheet perhaps didn't make sense to others, because I knew the tool and I knew how to use it. We overcame that in those pracs by having a really structured "step one, step two, and this is how you do it", but if you weren't able to do that, then it would be really helpful to get an external person to read it so that you knew it was easy to follow.
And then the second thing that I would suggest is investing time in showing the students how to use the marking rubric, so that their peer reviews are valid and also useful for the person who's receiving the feedback. It's really important for that feedback to be time-stamped in the video so that it has relevance to that person at that time and what they're actually doing; and you can see, when students really provide quality feedback, that they've actually thought about it, and that's because they've been trained how to do it.
So I think the two pieces of advice would be: get your instructions checked, and then make sure you invest time in teaching the students how to use that rubric, which is a skill that can be transferred to professional practice, so it's certainly not something that's just wasted on an assessment task.
References
Andrade, H., & Du, Y. (2005). Student perspectives on rubric-referenced assessment. Practical Assessment, Research & Evaluation, 10(3), 1–11.
CAST. (2024). Universal Design for Learning Guidelines version 3.0. Retrieved from https://udlguidelines.cast.org
Galutira, G. D. (2018). Theory of reflective practice in nursing. International Journal of Nursing Science, 8(3), 51–56.
Karal, Y., & Sarıalioğlu, R. Ö. (2022). The development of student feedback literacy through peer feedback in the online learning environment. European Journal of Open, Distance and E-Learning, 24(1), 36–52. https://doi.org/10.2478/eurodl-2022-0004
Little, T., Dawson, P., Boud, D., & Tai, J. (2025). What does it take to provide effective peer feedback? Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2025.2475059
Sharp, S., Snowden, A., Stables, I., & Paterson, R. (2024). Ensuring robust OSCE assessments: A reflective account from a Scottish school of nursing. Nurse Education in Practice, 78, 104021. https://doi.org/10.1016/j.nepr.2024.104021
Tai, J. H.-M., Canny, B. J., Haines, T. P., & Molloy, E. K. (2016). The role of peer-assisted learning in building evaluative judgement: Opportunities in clinical medical education. Advances in Health Sciences Education: Theory and Practice, 21(3), 659–676. https://doi.org/10.1007/s10459-015-9659-0