Video Recording in Veterinary Medicine OSCEs: Feasibility and Inter-rater Agreement between Live Performance Examiners and Video Recording Reviewing Examiners

KEYWORDS: Education, Veterinary; Educational Measurement; Clinical Competence; Feasibility Studies; Video Recording; Reproducibility of Results; Animals; Agricultural and Veterinary Sciences
DOI: 10.3138/jvme-2019-0142
Publication Date: 2020-08-06
ABSTRACT
The Objective Structured Clinical Examination (OSCE) is a valid, reliable assessment of veterinary students’ clinical skills that requires significant examiner training and scoring time. This article investigates the utility of video recording by assessing OSCEs in real time with live examiners and afterwards with video-reviewing examiners from within and outside the learners’ home institution. Using checklists, learners (n = 33) were assessed by one of five raters on three OSCE stations: suturing, arthrocentesis, and thoracocentesis. When the stations were considered collectively, there was no difference between rater types in pass/fail outcome (χ² = 0.37, p = .55). However, when the stations were considered individually, rater type (χ² = 16.64, p < .001) and the interaction of rater and station type (χ² = 7.13, p = .03) demonstrated an effect on outcome. Specifically, learners had increased odds of passing the suturing station as compared with their arthrocentesis or thoracocentesis stations. Internal consistency was fair to moderate (0.34–0.45). Inter-rater reliability measures varied but were mostly strong (0.56–0.82). Video raters spent longer assessing learners than live raters (mean 21 min/learner vs. 13 min/learner). Station-specific differences among raters may be due to intermittent visibility issues during video capture. Overall, video recording of learner performances appears feasible, although time, cost, and technical issues may limit its routine use.
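The abstract reports a chi-square comparison of pass/fail outcomes between rater types and inter-rater reliability coefficients. The paper's own analysis code is not provided; the sketch below is only an illustration, using invented pass/fail counts and standard SciPy and scikit-learn routines (chi2_contingency, cohen_kappa_score), of how statistics of this kind can be computed. It should not be read as the authors' actual analysis pipeline.

```python
# Illustrative sketch only: all counts and scores below are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.metrics import cohen_kappa_score

# Hypothetical 2x2 contingency table:
#   rows    = rater type (live, video)
#   columns = outcome (pass, fail)
contingency = np.array([[28, 5],
                        [26, 7]])
chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

# Hypothetical paired pass/fail decisions (1 = pass, 0 = fail) for the same
# 33 learners scored by a live examiner and a video-reviewing examiner;
# Cohen's kappa is one common inter-rater agreement measure.
rng = np.random.default_rng(0)
live_decisions = rng.integers(0, 2, size=33)
video_decisions = rng.integers(0, 2, size=33)
kappa = cohen_kappa_score(live_decisions, video_decisions)
print(f"Cohen's kappa (inter-rater agreement) = {kappa:.2f}")
```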