Simulation and education
Written evaluation is not a predictor for skills performance in an Advanced Cardiovascular Life Support course☆
Introduction
The American Heart Association (AHA) Advanced Cardiovascular Life Support (ACLS) course was created in 1973 and introduced as an AHA sponsored program in 1974.1, 2 The ACLS course has served as the model for other short-course format resuscitation courses.3 Since its inception, successful completion of the ACLS course has required passing both a written evaluation and a practical skills assessment. The validity of both the written and practical testing mechanisms currently in use has been demonstrated in previous studies.4, 5 There is, however, no published research that directly addresses whether student performance on written tests in advanced life support courses correlates with objectively measured performance of resuscitation as a whole or of its individual required subcomponents.
The current study proposed the following research question: is a written evaluation in an Advanced Cardiovascular Life Support course a predictor of students’ skills performance in a simulated cardiac arrest?
A single study of paediatric residents suggests that the correlation between written tests and objective performance measures may be poor: the mean score on the standardized Pediatric Advanced Life Support (PALS) written examination was 93.2% (±5.5%), yet none of the 28 third-year paediatric residents were able to successfully perform both basic and advanced airway maneuvers.6 Only 11% of these same trainees were able to adequately perform both vascular access procedures (intraosseous and femoral central line placement using the Seldinger technique). Unfortunately, correlation data between the written and performance-based tests were not published, as this was not the focus of the study.
An earlier study involving 45 paediatric residents described the residents as performing well on a cognitive written test, with 80% also able to achieve the end-point of four predefined resuscitation skills (bag-valve-mask ventilation, intubation, defibrillation, and intraosseous line placement).7 Again, this was not designed as a correlational study, and although learners achieved the end-point of each skill, they performed quite poorly at many of the essential subtasks. For instance, only 27% of residents verified that suction was functioning, and only 34% verified that the bag-valve-mask was available and functioning prior to intubation. The equivocal findings and lack of true correlations make it difficult to draw generalizations from this paper relevant to the research question we have undertaken.
A systematic review of the literature addressing the correlation between written tests and performance-based measures in basic life support (BLS) courses concluded that although the majority of papers (14 out of 17) did suggest a correlation, much of the data for healthcare professionals was inconsistent and contradictory, requiring further research.8 The question of whether a written BLS test reflects BLS skills competence was felt to be “indeterminate.” Extrapolating the systematic review to advanced life support courses is not appropriate, as the knowledge and skills to be performed in advanced life support differ from those in BLS, and the relative performances of students on the written and performance measures are likely to differ as well.
The literature from the broader medical education community, although not conclusive, does suggest a positive correlation between written tests and more objective measures of performance such as Objective Structured Clinical Examinations (OSCEs).9, 10, 11, 12, 13, 14 The correlations are, however, quite variable, and there is occasional published literature that contradicts the positive correlations above.15, 16 Of particular note is the study by Ram et al.,10 which suggested that a written test could correlate as well with performance in the clinical environment as an OSCE.
The lack of evidence directly related to the research question, together with inconclusive data in closely related fields, emphasizes the need for well conducted trials to help guide the development of learner assessments within advanced resuscitation courses. Miller's classic pyramid outlines four successive levels of clinical assessment—Knows, Knows How, Shows, and Does.17 ‘Ideal’ evaluation of learning should theoretically focus on learners’ actions in the clinical environment (i.e. what one ‘does’) but is generally not feasible for most short courses. Performance-based tests, such as the resuscitation case scenario, assess the learner's ability to ‘show’ what they are capable of, while written tests may be limited to the lower levels, assessing what the learner ‘knows’ (knowledge) or what they ‘know how’ to do (comprehension). Knowledge of a correlation between written and performance-based tests (or lack thereof) is valuable, if not imperative, to the development of a well designed and practical assessment strategy for advanced resuscitation courses.
Materials and methods
A correlational study was conducted to compare ACLS written evaluation scores with practical skills performance in a simulated cardiac arrest scenario. Appropriate Institutional Review Board (IRB) approvals were obtained and subjects provided written informed consent for participation. Due to the need to match subject scores on the written evaluation with skills performance scores, subject anonymity was not possible. However, subject confidentiality was maintained.
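The core of the analysis described above is a correlation between each subject's written-evaluation score and their matched skills-performance score. As an illustrative sketch only (not the authors' analysis code), the Pearson product-moment correlation on such paired scores can be computed as follows; the score values below are hypothetical:

```python
# Sketch of a correlational analysis on matched per-subject scores.
# All numbers here are invented for illustration, not study data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient for paired scores."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical matched scores for a handful of course participants
written_scores = [84, 90, 78, 92, 88, 75]
skills_scores = [70, 65, 80, 72, 68, 77]

r = pearson_r(written_scores, skills_scores)
print(f"r = {r:.2f}")
```

A value of r near zero on the full matched data set would correspond to the "no correlation" finding the study reports; in practice the coefficient would also be tested for statistical significance.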
Demographics of study subjects
Thirty-seven subjects initially enrolled in the study courses. The mean age of the subjects was 32.5 years. There were five males and 32 females in the study. Fifteen subjects were from bachelor degree nursing programs and 22 subjects were from associate degree nursing programs. 51.4% of the participants had prior health care experience, with a mean of 2.5 years' experience (range 6 months to 30 years). Only one subject had prior ACLS experience. Three subjects withdrew from the program prior to
Discussion
At both the combined class level and the individual class level for these two ACLS courses, there was no correlation between the written evaluation and the practical skills evaluation. Therefore, the ACLS written evaluation cannot be used as a predictor of performance in cardiac arrest practical skills. The results from this study are consistent with previously reported findings in resuscitation education examining written testing instruments and practical skills assessments.6, 7
Conclusions
In the two ACLS classes conducted for this study, the ACLS written evaluation did not correlate with participant skills in managing a simulated cardiac arrest event immediately following an ACLS course. The ACLS skills evaluation tests a narrow portion of ACLS content while the written evaluation tests a much broader spectrum of content. Both work in concert to define participant knowledge and neither should be used exclusively to determine participant competence.
Conflict of interest statement
The authors have no relevant conflict of interest on this topic.
Acknowledgements
The authors wish to thank David Matics, Louis Robinson, and Katrina Craddock; the instructors of the Charleston Area Medical Center Health Education and Research Institute ACLS program; and the course participants for their assistance with this study. Partial funding for this study was supplied by the Charleston Area Medical Center Health Education and Research Institute Research Appropriate Committee.
References (22)
- et al. The development and evaluation of new versions of the written examination for the American Heart Association Advanced Cardiac Life Support provider course. Ann Emerg Med (1994)
- et al. Feasibility and reliability of remote assessment of PALS psychomotor skills via interactive videoconferencing. Resuscitation (2009)
- et al. Validation for a scoring system of the ALS cardiac arrest simulation test (CASTest). Resuscitation (2009)
- et al. Assessment of Advanced Life Support competence when combining different test methods—reliability and validity. Resuscitation (2007)
- Standards for cardiopulmonary resuscitation and emergency cardiac care. JAMA (1974)
- et al. Training in advanced cardiac life support. JAMA (1976)
- et al. Training in advanced trauma life support. JAMA (1980)
- et al. Advanced cardiac life support refresher course using standardized objective-based Mega Code testing. Crit Care Med (1987)
- et al. Assessing pediatric senior residents’ training in resuscitation: fund of knowledge, technical skills, and perception of confidence. Pediatr Emerg Care (2000)
- et al. Performance of advanced resuscitation skills by pediatric housestaff. Arch Pediatr Adolesc Med (1998)
- 2005 international consensus on cardiopulmonary resuscitation and emergency cardiovascular care science with treatment recommendations. Part 8: Interdisciplinary topics. Resuscitation
☆ A Spanish translated version of the abstract of this article appears as an Appendix in the final online version at doi:10.1016/j.resuscitation.2009.12.018.