A Method and Resources for Assessing the Reliability of Simulation Evaluation Instruments


Aim. This article describes a successfully piloted method for facilitating rapid psychometric assessments of three simulation evaluation instruments: the Lasater Clinical Judgment Rubric, the Seattle University Evaluation Tool, and the Creighton-Simulation Evaluation Instrument™.

Background. To provide valid and reliable evaluations of student performance in simulation activities, it is important to assess the psychometric properties of evaluation instruments.

Method. This novel method incorporates a database of validated, video-archived simulations depicting nursing students performing at varying levels of proficiency. A widely dispersed sample of 29 raters viewed and scored multiple scenarios over a six-week period. The analyses are described, including assessments of inter- and intrarater reliability, internal consistency, and validity.

Results and Conclusion. Descriptive and inferential statistics supported the validity of the leveled scenarios. The inter- and intrarater reliability and internal consistency of data from the three tools are reported. The article provides information and resources that readers can use to assess their own simulation evaluation instruments with the described methods.
