Examining student learning and the utility of 'open-book' authentic case-based online assessments
Overview
This Learning Enhancement project has been funded through SATLE (Strategic Alignment of Teaching and Learning Enhancement) with the support of the National Forum / HEA.
| PROJECT TITLE: | Examining student learning and the utility of 'open-book' authentic case-based online assessments |
|---|---|
| PROJECT COORDINATOR: | Associate Professor Sue Rackard, School of Veterinary Medicine |
| COLLABORATORS: | Associate Professor Joseph Cassidy, Professor Bryan Markey, Assistant Professor Diane Cashman, Ms Holly Drain, Julianna Gallo, Dr Pratiksha Nagar, Dr Sean Lacey, MTU |
| TARGET AUDIENCE: | Veterinary Medicine Students |
Background
Assessing veterinary students' developing clinical competency is widely conducted through the conceptual model of Miller's Pyramid [1]. Many studies have explored the validity, reliability and defensibility [2] of assessment methods that align to levels of Miller's Pyramid, e.g. Single Best Answers and Objective Structured Clinical Examinations, which have become prominent in veterinary education assessment strategies [3,4]. Researchers have argued that these and other assessment types (script concordance testing, think-aloud protocols) have limitations in assessing the varying cognitive processes of clinical reasoning [5,6,7] that are required to manage clinical problems in real-world settings [8,9,10,11]. Furthermore, Scott [12] argues that we cannot assume that all assessments drive learning.
Few studies have explored alternative assessment methods designed to assess students' clinical reasoning in a summative, 'open-book', large-class invigilated environment. It is therefore important to examine how assessments can support student learning while ensuring their validity in determining individual student performance.
The School of Veterinary Medicine aimed to explore this under-examined area through this project.
Goals
The overall aim of this project was to evaluate a novel assessment format that was developed to evaluate student clinical reasoning through an online authentic, case-based, and ‘open-book’ approach.
The key objectives were to:
- Explore the utility of an online, authentic, case-based, ‘open-book’ assessment format designed to assess clinical reasoning.
- Examine the learning strategies students adopted to prepare for this assessment.
The project aimed to support the development of students' clinical reasoning skills and approaches to studying.
Approach
A new assessment strategy was developed for two modules in the School of Veterinary Medicine in the middle phase of the veterinary medicine programme. Assessments were designed to be in-person, invigilated, time-limited, and open-book. The assessments were delivered through BrightSpace and were taken on a student's own device. Students were allowed to bring paper copies of their notes into the examination to help discourage rote learning as their prime study strategy. Students were not allowed to search the internet or use digital notes during the examination.
Students were provided several formative and low-stakes opportunities to familiarise themselves with the examination format in advance of the end-of-trimester assessment. In addition, group case-based learning was embedded into the module to support students' development of clinical reasoning skills. Assessment questions were designed in the format of cases. Students had to make a decision or evaluate a problem following the analytic decision-making process required of a veterinarian to complete the case.
A key focus of this project was engaging students as partners. One work package was designed specifically for two student partners to investigate, supporting the development of students' research competencies, which is a key programme outcome of the veterinary medicine curriculum. The student partners were supported in developing their research skills in the social sciences, and they presented their findings at the International Symposium of the Veterinary Schools Council - VetEd Symposium in July 2024. They also provided feedback on the overall project.
Results
This project delivered a new teaching and assessment strategy to promote students' clinical reasoning and decision-making skills. A research project was conducted to evaluate the impact of this initiative.
The project identified a number of considerations for educators when choosing to adopt similar initiatives in their own modules and provided evidence of impact on student learning.
Firstly, our project provided evidence that our approach to teaching and assessing clinical reasoning skills impacted the student learning experience. Results of our evaluation highlight that students perceived the assessment strategy as more authentic in preparing them for veterinary professional life. Students also highlighted that the low-stakes assessments were important for developing new study approaches for open-book examinations while building their confidence to successfully complete this new assessment format.
Secondly, our project identified a range of local considerations that faculty and staff need to address when planning and delivering open-book assessments. Specifically, we established a list of design principles for other educators to consider when embedding this approach into their own modules: how assessment questions should be designed to support the development of clinical reasoning (for students in the early stages of their programme); module design considerations; and considerations for the local administration of open-book examinations, e.g. setting up in BrightSpace, room bookings, student instructions, and arrangements for students with special accommodations.
Resources
An educator's guide was developed from the outputs of this project. It provides key design principles that educators should consider when implementing similar initiatives in their own modules or programmes. This resource is available in the National Resource Hub.
Contacts:
Associate Professor Joseph Cassidy (joseph.cassidy@ucd.ie)
Associate Professor Sue Rackard (sue.rackard@ucd.ie)
Assistant Professor Diane Cashman (diane.cashman@ucd.ie)
References for this project include:
- Miller GE. (1990) The assessment of clinical skills/competence/performance. Acad Med. 65(9):S63-7.
- Hecker K, Violato C (2009) Validity, Reliability, and Defensibility of Assessments in Veterinary Education. J Vet Med Educ. 36(3), 271-275.
- Baillie S, Warman S, Rhind S (2014) A guide to assessment in veterinary education. 3rd edition. Available online: https://edarxiv.org/kectf/. Accessed: 21 Feb 2023.
- Rhind SM, Baillie S, Brown F, Hammick M, Dozier M (2008) Assessing competence in veterinary medical education: where's the evidence? J Vet Med Educ. 35, 407-411.
- Daniel M, Rencic J, Durning S, Holmboe E, Santen S, Lang V, Ratcliffe T, Gordon D, Heist B, Lubarsky S, Estrada C, Ballard T, Artino A, Sergio Da Silva A, Cleary T, Stojan J, Gruppen L (2019) Clinical Reasoning Assessment Methods: A Scoping Review and Practical Guidance. Acad Med. 94(6), 902-912.
- Witheridge A, Ferns G, Scott-Smith W (2019) Revisiting Miller's pyramid in medical education: the gap between traditional assessment and diagnostic reasoning. Int J Med Educ. 10, 191-192.
- Sam AH, Hameed S, Harris J, Meeran K (2016) Validity of very short answer versus single best answer questions for undergraduate assessment. BMC Med Educ. 16(1), 266.
- Connor DM, Durning SJ, Rencic JJ (2020) Clinical Reasoning as a Core Competency. Acad Med. 95(8), 1166-1171.
- Neill, Vinten C, Maddison J (2020) Use of Inductive, Problem-Based Clinical Reasoning Enhances Diagnostic Accuracy in Final-Year Veterinary Students. J Vet Med Educ. 47(4), 506-515.
- Eva KW (2005) What every teacher needs to know about clinical reasoning. Med Educ. 39, 98-106.
- Kassirer JP (2010) Teaching Clinical Reasoning: Case-Based and Coached. Acad Med. 85(7), 1118-1124.
- Scott IM (2020) Beyond 'driving': The relationship between assessment, performance and learning. Med Educ. 54(1), 54-59.
- Van der Vleuten CPM (1996) The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education. 1(1), 41-67.
- McKenney S, Reeves TC (2012) Conducting Educational Design Research. Routledge, London.