Abstract
Purpose: In conjunction with curricular changes, a process to develop integrated examinations was implemented. Pre-established guidelines favored vignettes, clinically relevant material, and application of knowledge rather than simple recall. Questions were read aloud in a committee comprising all course directors and a reviewer with National Board of Medical Examiners (NBME) item-writing and review experience. This study examines the effectiveness of this process in improving the quality of in-house examinations.

Methods: Five hundred and twenty items were randomly selected from two academic years for initial comparison: 270 from 2000–2001 and 250 from 2001–2002. The first set represented the style, content, and format when courses and tests were departmentally/discipline based, assembled by course directors, and administered separately. The second set represented the same characteristics when courses and tests were organ-system based, committee reviewed, and administered as an integrated examination. Items were randomized, blinded for year of origin, and rated by three NBME staff members with extensive item-review experience. A five-point rating scale was used: one indicated a technically flawed item assessing recall of an isolated fact; five indicated a technically unflawed item assessing application of knowledge. To assess continued improvement, a follow-up set of 250 items from the 2002–2003 academic year was submitted to the same three reviewers, who were not informed of the purpose or origin of this set of test items.

Results: The mean rating for items from 2000–2001 was 2.51 ± 1.27; analogous values were 3.16 ± 1.33 for 2001–2002 (t = 5.83; p < 0.0001) and 3.59 ± 1.15 for 2002–2003 (t = 10.11; p < 0.0001).

Conclusion: Pre-established guidelines and an interdisciplinary review process improved item quality for in-house examinations.
Received 1 April 2004; accepted in December 2004
Wallach, P.M., Crespo, L.M., Holtzman, K.Z. et al. Use of a Committee Review Process to Improve the Quality of Course Examinations. Adv Health Sci Educ Theory Pract 11, 61–68 (2006). https://doi.org/10.1007/s10459-004-7515-8