Abstract

OBJECTIVES: This study describes a fast and efficient method that uses a prevalidated videotape of an objective structured clinical examination (OSCE) in a fracture scenario to evaluate raters and to measure the consistency of raters from different subspecialties and with varying levels of seniority.
STUDY DESIGN: We videotaped clinical scenarios to evaluate residents' communication and clinical assessment skills. All orthopedic staff used prevalidated checklists to assess the residents' performance in the videotape at 3 different time points. Cronbach's alpha was calculated to evaluate the internal consistency of the OSCE checklist construct. Kendall's W and KR-20 were used to investigate rater agreement. Expert validity was calculated to compare OSCE experts with the present raters.
RESULTS: A high Cronbach's alpha for the 23-item global assessment scale in all 3 tests confirmed construct validity. Kendall's W showed only moderate inter-rater reliability. KR-20 was 0.96 for the pretest, 0.968 for the posttest, and 0.892 for the long-term test, indicating high internal consistency. The p-value for expert validity was 0.626 (independent t-test, not significant).
CONCLUSIONS: This efficient and fast video-based assessment of raters was reliable and yielded satisfactory rater consistency and some evidence for validity. (J Surg 70:189-192.)
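
For readers who want to reproduce the reliability measures named above, the following is a minimal illustrative sketch in Python (not the authors' analysis code). It computes Cronbach's alpha, KR-20, and Kendall's W from a raters-by-items score matrix; the example matrix is hypothetical placeholder data, not the study's ratings, and no tie correction is applied to Kendall's W for brevity.

# Illustrative sketch: reliability statistics from a raters-by-items matrix.
# The data below are hypothetical, for demonstration only.
import numpy as np
from scipy.stats import rankdata

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: respondents (rows) x items (columns), interval-scaled."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def kr20(binary_scores: np.ndarray) -> float:
    """KR-20: Cronbach's alpha specialised to dichotomous (0/1) items."""
    k = binary_scores.shape[1]
    p = binary_scores.mean(axis=0)                 # proportion scoring 1 per item
    total_var = binary_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - (p * (1 - p)).sum() / total_var)

def kendalls_w(ratings: np.ndarray) -> float:
    """ratings: raters (rows) x subjects/items (columns); W in [0, 1]."""
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)   # rank within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical example: 5 raters scoring a 23-item checklist as pass/fail.
rng = np.random.default_rng(0)
checklist = rng.integers(0, 2, size=(5, 23))
print("KR-20:", round(kr20(checklist), 3))
print("Kendall's W:", round(kendalls_w(checklist.astype(float)), 3))
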

  • Publication date: 2013-4
  • Affiliation: Changchun University