When I was managing evaluations for a large residency training program, there was an event I looked forward to and dreaded in equal measure: Objective Structured Clinical Examinations.
OSCEs are designed to assess clinical performance and competence across a range of skills in a practical, real-world approach to learning and assessment. Our OSCEs included a large cast of Standardized Patients, faculty examiners, and 120 nervous residents rotating through four stations (on four concurrent tracks) over two nights.
It was a huge event, and while I enjoyed coordinating it, the aftermath left something to be desired.
Preparation included creating binders for each station on each track (16 in total), each with enough printed copies of the assessment tool for every resident – plus extras just in case – since our exam was wholly paper-based. More than once we had last-minute changes or corrections to the tools and had to print new evaluation tools at the eleventh hour.
How we collated and released the results
After the exam, there was inevitably missing data that had to be collected from examiners. It’s not ideal to ask examiners after the fact to try to recall a resident’s performance on a specific task. However, since it was intended as a formative exam to help residents prepare for higher-stakes exams, this was deemed acceptable.
For the stations with complete answers, I began the task of entering the data into Excel spreadsheets, each set up with formulas to calculate the average for each question at each station. Transcribing the feedback was another monumental task, though between our small team we were able to decipher 99% of the handwriting.
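Purely as an illustration, this is the kind of calculation those spreadsheet formulas were doing, sketched in Python with pandas rather than Excel. The file name and column names ("station", "question", "score") are hypothetical placeholders, not our actual spreadsheet layout.

```python
import pandas as pd

# Hypothetical layout: one row per examiner rating, with columns for the
# station, the question, and the numeric score transcribed from the paper form.
scores = pd.read_excel("osce_scores.xlsx")  # assumed file name

# Average score for each question within each station -- the same result
# the per-station spreadsheet formulas produced by hand.
station_averages = (
    scores.groupby(["station", "question"])["score"]
    .mean()
    .round(2)
)
print(station_averages)
```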
Using a mail merge, I created an output file for each resident and saved it as a PDF to be emailed to them individually – a step that carried its own risk, since it was easy to attach the wrong results to an email.
This process, end-to-end, often took 2-3 weeks to complete. The potential for data-entry and Excel calculation errors was high, and resident feedback was clear: waiting that long between the examination and the results was not helpful.
How we overcame our biggest barriers
We wanted to move to collecting data electronically, but we did not have access to software that was purpose-built to support OSCEs. That said, as a One45 user, I knew I could easily collect the data I needed in One45.
We also knew that asking 16 examiners to remember to bring a (working) laptop or tablet was risky, and we did not have enough devices in our department to lend them. Additionally, since we ran the exam on campus, all 16 had to know their university credentials to log into the wifi, and with an evening exam, there was no IT support to help them if there was a problem.
In a stroke of luck, one of our colleagues happened upon a service that rented tablets for events.
We rented 18 tablets for our next exam – and we were glad we ordered two more since we couldn’t connect one to wifi the night of the first exam.
Examiners were provided with a notepad for in-the-moment notes and questions, and this was collected and shredded at the end of the exam. They also wrote out their feedback on a separate form that was then distributed to residents the next day, with a copy retained for the program.
What we did differently
Our faculty OSCE coordinator had access to view forms in One45, so once each station was finalized, they were able to check the assessment tool in the system and verify it was correct.
We elected to do single sends rather than create an evaluation workflow, and sent each examiner an evaluation form for each resident an hour before the exam. Because we sent learners individual results that included the average score per question by PGY level and overall, we did not release results to learners the moment they were available.
If an examiner found a discrepancy or error on the assessment tool the night of the exam, it was easy to fix on the spot. The assessment tools had mandatory questions, so examiners couldn’t miss a question, resulting in no data gaps.
Data reporting
Working with support, I realized that, given the way we needed to structure our results, the best approach was to export my data into Excel and work with the raw data, rather than releasing the forms themselves, leveraging the auto-calculation in Reports by Form, or using the Grades/Marks module.
So after the exam, I exported the results in Excel format using the Form Data Report and added my necessary calculations. I still needed to use a mail merge to send the residents their results, but the process from exporting to emailing took me less than a day, rather than the 2-3 weeks it previously took.
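For readers curious what that post-export step can look like, here is a minimal sketch in Python with pandas of computing each question's average by PGY level and overall, and building a one-row-per-resident-per-question file that a mail merge could draw from. The file names and column names ("resident", "pgy_level", "question", "score") are assumptions for illustration, not One45's actual export schema.

```python
import pandas as pd

# Load the exported results (file name and columns are assumed, not One45's schema).
results = pd.read_excel("form_data_report.xlsx")

# Average per question across all residents.
overall_avg = (
    results.groupby("question")["score"].mean()
    .rename("overall_avg").reset_index()
)

# Average per question by PGY level, one column per level (e.g. avg_pgy_1, avg_pgy_2).
pgy_avg = (
    results.pivot_table(index="question", columns="pgy_level",
                        values="score", aggfunc="mean")
    .add_prefix("avg_pgy_")
    .reset_index()
)

# Attach the comparison columns to each resident's own scores so one row
# holds everything the mail merge needs for that resident and question.
merged = (
    results.merge(overall_avg, on="question")
           .merge(pgy_avg, on="question")
)
merged.to_excel("merge_source.xlsx", index=False)
```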
We were also lucky that our internal IT team had created a script to email the PDF’d results to residents, reducing both potential errors and time spent.
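Our IT team’s script isn’t mine to share, but a minimal sketch of the same idea – assuming one PDF per resident, named with the resident’s email address, and an internal SMTP relay – might look something like this. The host, sender address, and folder name are all hypothetical.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

SMTP_HOST = "smtp.example.edu"      # assumed internal mail relay
SENDER = "osce-admin@example.edu"   # assumed program mailbox


def send_results(pdf_dir: str) -> None:
    """Email each PDF in pdf_dir to the resident it is named after."""
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for pdf in sorted(Path(pdf_dir).glob("*.pdf")):
            # Assumed naming convention: "jane.doe@example.edu.pdf" -> recipient address.
            recipient = pdf.stem
            msg = EmailMessage()
            msg["From"] = SENDER
            msg["To"] = recipient
            msg["Subject"] = "Your OSCE results"
            msg.set_content("Please find your OSCE results attached.")
            msg.add_attachment(pdf.read_bytes(), maintype="application",
                               subtype="pdf", filename=pdf.name)
            smtp.send_message(msg)


if __name__ == "__main__":
    send_results("resident_results")
```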
Final thoughts
After three years of running paper-based OSCEs and one year using One45 for collection and reporting, I can confidently say there’s no going back to paper! The shift meant substantially less work for me as an administrator, but it benefited residents equally, if not more so. They had their feedback within days of the exam instead of weeks, and because the feedback was timely, they could easily connect it to their performance and interactions as they remembered them. The reduction in errors was also substantial, and could be reduced further if we leveraged One45 for reporting instead of Excel.
This year, with the pandemic still changing the way programs assess their learners, my old program is taking the spring OSCE from in-person to a virtual exam, and I’m so glad we moved to electronic data collection so there is one less process that needs to be adapted. I can’t wait to hear how it all plays out for them!