Discriminating Features of Narrative Evaluations of Communication Skills During an OSCE
Abstract
Construct: The authors examined the use of narrative comments for the evaluation of student communication skills in a standardized, summative assessment (Objective Structured Clinical Examination [OSCE]).

Background: The use of narrative evaluations in workplace settings is gaining credibility as an assessment tool, but it is unknown how assessors convey judgments using narratives in high-stakes standardized assessments. The aim of this study was to explore the constructs (i.e., performance dimensions) and linguistic strategies that assessors use to distinguish between poor and good students when writing narrative assessment comments on communication skills during an OSCE.

Approach: Eighteen assessors from Qatar University were recruited to write narrative assessment comments on communication skills for 14 students completing a summative OSCE. Assessors scored overall communication performance on a 5-point scale. Narrative evaluations for the top and bottom 2 performing students at each station (based on communication scores) were analyzed for the linguistic strategies and constructs that informed assessment decisions.

Results: Seventy-two narrative evaluations comprising 662 comments were analyzed. Most comments (77%) were written without the use of politeness strategies; a further 22% were hedged. Hedging was used more commonly for poor performers than for good performers (30% vs. 15%, respectively). The overarching constructs of confidence, adaptability, patient safety, and professionalism were the key dimensions characterizing the narrative evaluations of students' performance.

Conclusions: These results contribute to our understanding of the utility of narrative comments for the summative assessment of communication skills. Assessors' comments could be characterized by the constructs of confidence, adaptability, patient safety, and professionalism when distinguishing between levels of student performance. The findings support the notion that judgments are arrived at by clustering sets of behaviors into overarching, meaningful constructs rather than by focusing solely on discrete behaviors. These results call for the development of better-anchored evaluation tools for communication assessment during OSCEs, constructively aligned with assessors' map of the reality of professional practice.