
The Script Concordance Test: A New Tool Assessing Clinical Judgement in Neurology

Published online by Cambridge University Press:  02 December 2014

Stuart Lubarsky
Affiliation:
Center for Education, Applied Health Sciences, Office for Medical Education, Faculty of Medicine, University of Montréal, Montréal, Canada
Colin Chalk*
Affiliation:
Center for Education, Applied Health Sciences, Office for Medical Education, Faculty of Medicine, University of Montréal, Montréal, Canada
Driss Kazitani
Affiliation:
Center for Education, Applied Health Sciences, Office for Medical Education, Faculty of Medicine, University of Montréal, Montréal, Canada
Robert Gagnon
Affiliation:
Center for Education, Applied Health Sciences, Office for Medical Education, Faculty of Medicine, University of Montréal, Montréal, Canada
Bernard Charlin
Affiliation:
Center for Education, Applied Health Sciences, Office for Medical Education, Faculty of Medicine, University of Montréal, Montréal, Canada
*
Montréal General Hospital, Room L7-313, 1650 Cedar Avenue, Montréal, Québec, H3G 1A4, Canada

Abstract

Background:

Clinical judgment, the ability to make appropriate decisions in uncertain situations, is central to neurological practice, but objective measures of clinical judgment in neurology trainees are lacking. The Script Concordance Test (SCT), based on script theory from cognitive psychology, uses authentic clinical scenarios to compare a trainee’s judgment skills with those of experts. The SCT has been validated in several medical disciplines, but has not been investigated in neurology.

Methods:

We developed an Internet-based neurology SCT (NSCT) comprising 24 clinical scenarios with three to four questions each. The scenarios were designed to reflect the uncertainty of real-life clinical encounters in adult neurology. The questions explored aspects of the scenario in which several responses might be acceptable; trainees were asked to judge which response they considered to be best. Forty-one PGY1-PGY5 neurology residents and eight medical students from three North American neurology programs (McGill, Calgary, and Mayo Clinic) completed the NSCT. The responses of trainees to each question were compared with the aggregate responses of an expert panel of 16 attending neurologists.

Results:

The NSCT demonstrated good reliability (Cronbach alpha = 0.79). Neurology residents scored higher than medical students and lower than attending neurologists, supporting the test’s construct validity. Furthermore, NSCT scores discriminated between senior (PGY3-5) and junior (PGY1-2) residents.

Conclusions:

Our NSCT is a practical and reliable instrument, and our findings support its construct validity for assessing judgment in neurology trainees. The NSCT has potentially widespread applications as an evaluation tool, both in neurology training and for licensing examinations.

Type: Original Article
Copyright: © The Canadian Journal of Neurological Sciences 2009
