Academic Radiology

Volume 15, Issue 7, July 2008, Pages 934-941

Radiologic education
2008 Joseph E. and Nancy Whitley Award recipient
Factors Affecting Attending Agreement with Resident Early Readings of Computed Tomography and Magnetic Resonance Imaging of the Head, Neck, and Spine

https://doi.org/10.1016/j.acra.2008.02.013

Rationale and Objectives

This study examines the joint effect of several factors on radiology resident performance in the task of interpreting after-hours neuroradiology examinations.

Materials and Methods

As part of a quality assessment process, we conducted a prospective evaluation of all (N = 21,796) after-hours preliminary readings of neuroradiology examinations performed by radiology residents over a 62-month period at our academic medical center. Each reading was scored by the interpreting neuroradiologist as “agree,” “disagree with minimal clinical impact,” or “disagree with significant clinical impact.” Coded resident and attending identities were also recorded for each case, along with modality, body area studied, and the date of examination. These raw data were used to create an analytic data set with the level of resident/attending agreement as the outcome and six predictors, including two date-derived variables: months 1–62, representing when the case occurred during the study, and quartiles 1–4, representing the timing of the case within each resident's own on-call experience. Cross-tabulations, plots, bivariate statistics, and logistic regression were used to examine the relationships between the study variables and the outcome (level of agreement).
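This design, in effect, yields one row per case with a three-level agreement outcome and six predictors. As a rough sketch only (the article does not publish code; the file name, column names, and the collapsing of the outcome to a binary "any disagreement" indicator are all assumptions for illustration), the two date-derived predictors and a logistic model could be constructed along these lines in Python:

    # Illustrative sketch: input file and column names are hypothetical,
    # not taken from the article.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("readings.csv", parse_dates=["exam_date"])  # hypothetical input

    # Study month 1-62: calendar months elapsed since the first case in the series.
    start = df["exam_date"].min()
    df["month"] = ((df["exam_date"].dt.year - start.year) * 12
                   + (df["exam_date"].dt.month - start.month) + 1)

    # Quartile 1-4 of each resident's own on-call experience, by case order.
    df = df.sort_values("exam_date")
    df["case_order"] = df.groupby("resident").cumcount() + 1
    df["quartile"] = df.groupby("resident")["case_order"].transform(
        lambda s: pd.qcut(s, 4, labels=False) + 1)

    # Binary outcome: minimal or significant disagreement vs. agreement.
    df["disagree"] = (df["score"] != "agree").astype(int)

    model = smf.logit(
        "disagree ~ C(attending) + C(resident) + month"
        " + C(modality) + C(area) + C(quartile)",
        data=df).fit()
    print(model.summary())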

Results

Over the roughly 5 years of the study, the absolute number of significant disagreements remained stable at about three per month. The total caseload increased at a rate of 4.1 cases per month, with most of the increase falling into the agree category, whereas minimal disagreements actually decreased slightly (by 0.2 per month). In the logistic model for disagreement, three of the factors accounted for most of the variance: attending (61%), resident (15%), and month (15%). Study type (modality and area examined) accounted for another 10%. There was no significant contribution from the variable (quartile) constructed to test for individual resident learning over the course of the on-call experience.
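The abstract does not state how these percentage shares were computed. One conventional way to apportion explained variation among terms in a logistic model, shown below as a hedged sketch continuing the hypothetical model above (and not necessarily the authors' method), is to compare each term's likelihood-ratio chi-square contribution:

    # Hedged sketch: share of explained variation per term, measured by the
    # likelihood-ratio chi-square from dropping that term from the full model.
    # One plausible method, not necessarily the one used in the article.
    import statsmodels.formula.api as smf

    terms = ["C(attending)", "C(resident)", "month",
             "C(modality)", "C(area)", "C(quartile)"]
    full = smf.logit("disagree ~ " + " + ".join(terms), data=df).fit(disp=0)

    lr_chi2 = {}
    for term in terms:
        reduced = smf.logit(
            "disagree ~ " + " + ".join(t for t in terms if t != term),
            data=df).fit(disp=0)
        lr_chi2[term] = 2 * (full.llf - reduced.llf)  # LR statistic for this term

    total = sum(lr_chi2.values())
    for term, chi2 in sorted(lr_chi2.items(), key=lambda kv: -kv[1]):
        print(f"{term}: {100 * chi2 / total:.0f}% of explained variation")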

Conclusion

Although residents differ somewhat in the extent to which attendings agree with their on-call readings, evaluation or remediation based on simple comparison of these rates should be undertaken with caution. Improved agreement over time appears to be a collective experience shared by residents.

Section snippets

Materials and methods

In March 2002, we instituted a quality assessment program to address radiology resident after-hours and weekend preliminary interpretation of neuroradiology computed tomography (CT) and magnetic resonance imaging (MRI) studies. This entailed preprinted forms, attached to the study requisitions, that were completed by second- through fourth-year residents performing the initial reading (Fig. 1). These forms contained a text box for the resident to write their impression, as well as spaces to denote

Results

There were 22,264 observations in the initial database query; after eliminating those for four residents with fewer than 200 entries each, 21,796 remained and were subsequently analyzed. These cases were preliminarily interpreted by 1 of 46 diagnostic radiology residents and reviewed by 1 of 9 attending neuroradiologists. Overall, there were 187 (0.86%) significant disagreements, 665 (3.05%) minimal disagreements, and 20,944 (96.09%) agreements. Figure 2 depicts a time series of the absolute
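The reported rates follow directly from these counts; a quick arithmetic check (counts taken from the text above):

    # Quick check of the reported rates; counts are from the text above.
    counts = {"significant disagreement": 187,
              "minimal disagreement": 665,
              "agreement": 20944}
    total = sum(counts.values())  # 21,796 analyzed cases
    for label, n in counts.items():
        print(f"{label}: {n} ({100 * n / total:.2f}%)")
    # Prints 0.86%, 3.05%, and 96.09%, matching the reported figures.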

Discussion

Previous studies have addressed discrepancy rates between initial radiology resident and final attending interpretations of radiologic examinations. Most of the larger sample sizes have been achieved by characterizing and enumerating the disagreements (numerator) and then back-calculating percentages by counting the total number of cases (denominator) from other sources, such as the radiology information system, Picture Archiving and Communication System, or dictation records (4, 5, 8, 9, 10, 11


Cited by (22)

  • Full Resolution Simulation for Evaluation of Critical Care Imaging Interpretation; Part 1: Fixed Effects Identify Influences of Exam, Specialty, Fatigue, and Training on Resident Performance

    2020, Academic Radiology
    Citation Excerpt:

    Neuroradiology cases were the most difficult to interpret correctly (p < 0.0001), with an effect size of 2.3 (95% CI 1.9–3.2) points lower than body- and musculoskeletal-oriented cases, which were not different from each other (p = 0.36). Multiple analyses by other investigators of resident-attending discrepancy reflect this trend, with neuroradiology overnight readings having significantly higher rates compared to other subspecialties (16–21). Case acuity had a significant effect (p < 0.0001) on resident performance.

  • Resident Case Volume Correlates with Clinical Performance: Finding the Sweet Spot

    2019, Academic Radiology
    Citation Excerpt:

    It is likely that some studies with major discordances are not officially recorded by the attending radiologist and thus excluded from the calculation of the resident's discordance rate. Attending radiologist variability can therefore potentially affect resident discrepancy rates (11,12,14). We also acknowledge that our measures of resident performance are imperfect.

  • The Clinical Impact of Resident-attending Discrepancies in On-call Radiology Reporting: A Retrospective Assessment

    2018, Academic Radiology
    Citation Excerpt:

    The discrepant management decisions or procedural techniques employed by residents of other specialties may not receive the same high-profile persistence in the clinical record as those of radiology residents. Previous work has shown the attending to account for most of the variation in resident-attending discrepancies (23); thus, an attempt was made to evaluate attending accuracy in their discrepancies. The attendings were correct 80.5% of the time when confirmatory testing was available, showing their supervision to be effective in improving study interpretation accuracy.

  • Clinical Impact of Second-Opinion Musculoskeletal Subspecialty Interpretations During a Multidisciplinary Orthopedic Oncology Conference

    2017, Journal of the American College of Radiology
    Citation Excerpt:

    Given the radiology-wide pressures to improve efficiency, add clinical value, and curb imaging costs, the clinical impact of second-opinion consultation reads is an evolving question that warrants further investigation [6-10]. Previous investigators have published discrepancy rates between different radiologists, including musculoskeletal radiologists, ranging from 0.1% to 26% [6-22]. Only one study has addressed the clinical value of second-opinion orthopedic radiology consultations; however, the study was not specific to a medically complex orthopedic oncology patient population [12].
