Introduction
“AI isn’t going to replace radiologists, but radiologists who use AI will replace radiologists who don’t.”¹
The introduction of artificial intelligence (AI) to radiology has created a sense of urgency among radiologists to learn about this new technology and its applications. Surveys show that young medical professionals have recognized the major transformation that radiology may undergo [1] and have been searching for training opportunities to prepare them for the future [2]. Recent surveys show that most radiologists think they should actively participate in training programs about AI as soon as possible [3] and that they want to learn how to integrate AI into their work [4]. A recent international survey among 1041 radiologists shows that the majority of radiologists consider a lack of knowledge a major hurdle to the effective use of AI at work, and thus demand that AI-related training be included in the residency curriculum.²
Scholars have stressed the need for training programs that introduce AI and its applications to radiologists in a professionally effective language [5], to help them understand how AI works with medical data and translates them into medical insights [3]. In addition, rapid technological change requires the development of new training programs and swift updates of their content. Next to the basics of medical data and informatics, advanced training in computer science and machine learning should be included in the radiology curriculum [5].
Although AI training is not yet systematically integrated into the radiology curriculum [4, 5], there have recently been various initiatives to train radiologists on AI-related subjects. The European Society of Radiology provides an online curriculum of radiology training (ESR-ETC),³ which includes learning “the functioning and application of Artificial Intelligence tools” as part of the broader “knowledge on different aspects of computer science and information technology in the field of medical imaging” in order to gain knowledge on “different technical options to implement AI and deep learning applications in the radiology workflow.” The European Society of Medical Imaging Informatics (EuSoMII) has supported the promotion and adoption of this curriculum so that both technical and ethical aspects of deep learning are integrated into the radiology curriculum. EuSoMII has suggested different levels of knowledge and skills, which are now part of the ESR-ETC, with official credits to be gained. However, the ESR-ETC is mainly a guideline, and the actual training programs needed to materialize it are not yet systematically incorporated into the radiology curricula of all member states.
Another notable initiative is the National Imaging Informatics Course (NIIC),⁴ co-organized by the Society for Imaging Informatics in Medicine (SIIM) and the Radiological Society of North America (RSNA), which covers using algorithms in medical imaging and the foundations of deep learning. This program is offered to radiology residents but is also open to PACS managers and other professionals in the medical imaging domain. In addition, the American College of Radiology has recently been active in offering training programs on the foundations of AI and its potential in radiology practice.
In addition, many training programs are offered by a wide range of institutions. These programs are often occasional, short, and not integrated into the learning trajectory of radiologists. Because AI training for radiologists is still emerging, there remains a major gap between what the training programs offer and what radiologists need to learn [4]. Radiologists and their training advisors face many scattered, occasional training materials, each with a different focus. They need to search for relevant training outside their official curriculum [3] and assemble their own learning elements, often unsystematically. This runs the risk of getting lost in heterogeneous, fragmented programs that lack a pedagogical framework and an effective learning trajectory offering the knowledge and skills that suit various groups of radiologists. So much so that some critics have questioned the common trend that urges all radiologists to learn data science without critically considering their professional needs. Comparing AI with MRI, they suggest that “the new generation of radiologists should not necessarily become computer experts, but they should have a basic knowledge of the underlying technique” [6].
In this situation, we need a critical, systematic review of the AI training programs offered to radiologists. We need to systematically analyze their content, map out the various topics that are covered, and identify the missing ones. We also need to examine how these programs target different groups of radiologists, to see how far they are specialized for those groups’ needs and specific contexts. In addition, it is important to have an overview of the agents and instructors active in offering the training programs, to examine how institutions from various domains (e.g., medical, technological, and social) are involved. Finally, we should examine the ways in which AI training is promoted and legitimized as an important part of professional development for radiologists, especially while AI is not yet an official part of their training.
We conducted a systematic review of 100 AI training programs offered to radiologists in 2019 and 2020. Through a content analysis of these programs, we offer a systematic overview by examining (1) the topics they cover and how actively they engage radiologists in learning about the application of AI in their work, (2) how specialized these programs are for various groups of radiologists, (3) which institutions are active in offering them, and (4) the ways in which the programs are legitimized as an important learning investment for radiologists. This analysis offers radiologists a systematic overview of the available training opportunities and enables them to critically select the ones that suit their own needs. In addition, it helps managers and educational authorities see which available training programs they can adopt and reflect on further programs whose development they can contribute to.
Methods
We conducted a qualitative content analysis of the AI training programs that are offered to radiologists [7]. The research followed a process of search and selection, coding, and analysis.
Search and selection
In this study, we use “training program” as an umbrella term for the multiple formats in which training is organized. It includes on-site and online courses, educational seminars and webinars, and workshops and lectures. To be included, a program had to (1) be about artificial intelligence or related topics such as deep learning, machine learning, and learning algorithms; (2) target radiologists as the audience; (3) have specific learning objectives; (4) specify the time and mode of delivery; and (5) have specific instructors. We therefore excluded presentations, talks, videos, and other materials that did not meet all of these criteria.
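The five inclusion criteria can be expressed as a simple screening filter. The sketch below is purely illustrative and was not part of the study's tooling; the record fields (topics, audience, learning_objectives, etc.) are hypothetical names chosen for this example.

```python
# Illustrative screening filter for the five inclusion criteria.
# All field names in the program record are hypothetical.

AI_TOPICS = {"artificial intelligence", "deep learning",
             "machine learning", "learning algorithms"}

def meets_inclusion_criteria(program: dict) -> bool:
    """Return True only if a candidate program satisfies all five criteria."""
    return (
        bool(AI_TOPICS & set(program.get("topics", [])))              # (1) AI-related topic
        and "radiologists" in program.get("audience", [])             # (2) targets radiologists
        and bool(program.get("learning_objectives"))                  # (3) learning objectives
        and bool(program.get("time")) and bool(program.get("delivery"))  # (4) time and delivery
        and bool(program.get("instructors"))                          # (5) named instructors
    )

# Example candidate record (made up for illustration)
candidate = {
    "topics": ["deep learning"],
    "audience": ["radiologists", "PACS managers"],
    "learning_objectives": ["explain the basics of CNNs"],
    "time": "2020-03-01",
    "delivery": "online",
    "instructors": ["a named instructor"],
}
print(meets_inclusion_criteria(candidate))  # True: all five criteria are met
```

A program missing any one field (e.g., no named instructors) would be excluded, mirroring the exclusion rule described above.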
We searched for online information about the training programs, primarily via the Google, LinkedIn, and PubMed platforms, using a series of keywords (see Table 1). We also searched the websites of various radiological associations and organizations (see Table A6 in Appendix 1). We checked our initial list of training programs with four experts who are active in offering AI training in radiology, which helped us enrich the list and find new programs. We initially found 130 potentially relevant programs and excluded those about which limited information was available. Eventually, we collected a sample of 100 training programs, offered in 2019 and 2020 (until the end of June), which covered a wide range of programs in terms of content and offering agents.
Table 1
Overview of the search platforms and keywords
Platform | Description | Number of programs found |
LinkedIn | This platform provided us with many experts in the field of both radiology and AI who regularly posted initiatives for training programs. | 52 |
Google | The first five result pages of Google were explored to look for training programs using the search terms described in the paragraph below this table. | 38 |
PubMed | PubMed is a publicly accessible archive covering biomedical and science-related topics. The first five result pages of PubMed were explored to look for training programs using the search terms described in the paragraph below this table. | 10 |
Data collection and coding
We collected qualitative data about each training program by examining its general aspects (e.g., date, duration, and price), content, covered topics, target audience, instructors, offering agents, and ways of legitimizing (see Table 2). The qualitative data on each of these dimensions were archived in an Excel file (see Appendix 2) for further analysis.
Table 2
The coding scheme
Dimension | Description | Purpose | Categories |
Date | The year in which the training program was held and the start date of the training program | To provide information about the date to include in the general overview of the training programs | Distinguishing between periods: - 2019—1st half - 2019—2nd half - 2020—1st half |
Delivery mode | How the training program is offered to the trainees | The availability of the training in different places and times | - Offline - Online - Hybrid |
Duration | The number of minutes, hours or days that a training program lasts | To provide information about the duration to include in the general overview of the training programs | - Shorter than 1 h - 1–3 h - 3–6 h - 1 day - 2 days - Longer than 2 days |
Price | The amount of money a training program costs for participation | To provide information about the price to include in the general overview of the training programs | - Free - 0–50 EUR - 51–100 EUR - 101–200 EUR - 201+ EUR - Not specified |
Content type | The type of content that is provided within the training program, more particularly: active or passive content | To distinguish between the different kinds of information provided in programs: theoretical, applied, or hands-on content | - Passive-theoretical content - Passive-applied content - Active-hands-on training |
Topic | The subjects on which the programs focus | To classify the content of the programs based on the topics they address and give indications about the popularity of each topic | Related to - Medical - Technical - Ethical/legal - Managerial |
Audience | The people for whom the training is intended | To identify the intended people for whom the program is designed and what topics were common for this audience | - Only radiologists - Radiologists and other medical professionals - Radiologists and non-medical professionals - Radiologists and medical professionals and non-medical professionals |
Instructors | The speakers that teach the content of the programs | To find out what instructors teach on programs, see the link with the topics discussed and see if the instructors can be linked to an indication of legitimacy | Different instructors’ backgrounds: - Medical - Technical - Other |
Offering Agent | The institution that offers the training program | To examine how different institutions such as professional, academic, and commercial are active in offering the training programs | - Professional institutions - Academic institutions - Commercial companies - Other |
Legitimization | How the offering agents legitimize their training programs | To distinguish between the different ways of achieving legitimacy for a training program | Legitimizing based on: - Content - Offering agent - Accreditation - Acknowledgement |
We used deductive thematic analysis [8] to analyze the data, based on the codebook (Table 2). One coder coded the entire data set, and a second coder cross-checked the results and resolved ambiguities. The results were organized in a database (Appendix 2), in which we explored patterns through cross-tabulation and comparisons between the training programs [9]. We consulted three experts in the domain of AI and radiology to validate the relevance and richness of the extracted patterns.
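As an illustration of the cross-tabulation step, the following sketch counts how often each topic category co-occurs with each type of offering agent. The rows are made-up examples, not the study's actual database; only the category labels follow the coding scheme in Table 2.

```python
from collections import Counter

# Hypothetical excerpt of the coded database (the real data are in Appendix 2);
# category values follow the coding scheme in Table 2.
coded = [
    {"topic": "Technical",     "offering_agent": "Commercial"},
    {"topic": "Medical",       "offering_agent": "Professional"},
    {"topic": "Technical",     "offering_agent": "Academic"},
    {"topic": "Ethical/legal", "offering_agent": "Professional"},
]

# Cross-tabulate topic against offering agent to explore patterns.
crosstab = Counter((row["topic"], row["offering_agent"]) for row in coded)
for (topic, agent), n in sorted(crosstab.items()):
    print(f"{topic} x {agent}: {n}")
```

In the study itself, such tabulations were inspected manually across all coded dimensions and then discussed with domain experts.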
Discussion
As AI becomes increasingly important, radiologists need to actively engage in learning about it and acquire the skills that enable them to effectively use AI.
From scattered, fragmented trainings to systematic curriculum
Our analysis shows that most AI training programs offered to radiologists are short, stand-alone, and fragmented.⁶ Although they are useful for creating awareness and familiarizing radiologists with basic concepts, they need to be complemented with programs that are coherent and pedagogically integrated into the general radiology curriculum. As we are in the stage of reframing the radiology profession, we need to develop training programs that are based on scientific research and offer an effective learning trajectory for radiologists who want to invest in the long-term development of their careers. Initiatives such as the ESR-ETC can serve as guiding frameworks for designing programs that complement each other. This requires more active involvement of academic institutes in designing and offering training programs that are grounded in scientific research and aligned with pedagogical principles.⁷ When these training opportunities are offered systematically as part of national radiology curricula, they could be incorporated into professional examinations such as the European Diploma in Radiology (EDiR). Here, regional and international institutions such as the ESR can play a strong role by offering the courses needed for the EDiR examination.
From awareness to learning how to work with AI applications in clinical settings
Our findings show that the majority of the programs focus on passive delivery of content about the basics of AI and its potential impacts on radiology work. As AI gradually finds its way into the daily practice of radiologists, we need to engage learners in practical exercises with real AI applications and in learning how to use them effectively and critically at work. Training programs need to focus more on real use-cases and on (close to) real work settings and applications. Learning the basics of machine learning, though it offers some foundational knowledge, is not enough for radiologists who will be “working with,” rather than “developing,” AI applications. Learning how to critically use and effectively integrate AI applications into working routines requires developing new mental and practical skills. Developing these skills requires the involvement not only of medical and technical instructors, but also of experts from organizational, legal, ethical, and psychological domains. Above all, younger radiologists need to be supported by programs that enable them to strategically design and develop their professional careers for the future.
From generic to specialized and customized training
Finally, our study shows that most current training programs are generic in their content and target audience. Although these generic programs are effective for creating basic familiarity, they are of limited use for developing knowledge and skills specific to the context of radiology work. Future programs can specialize in the specific use-cases that radiologists face in using AI and consider radiologists’ special needs (e.g., when residents need to check their examinations with a senior radiologist). Furthermore, we need to develop training programs that are customized and localized to specific clinical practices and organizational settings. Especially in Europe, working conditions, legal frameworks, and workflow configurations vary from one country to another. All these differences require customized training programs that go beyond generic ideas and consider the specific organizational settings (e.g., technological infrastructure), legal frameworks (e.g., privacy regulations), and cultural aspects (e.g., patients’ and referring physicians’ expectations of and trust in AI applications). Table 3 offers a list of suggestions for developing AI training.
Table 3
Suggestions for developing and selecting appropriate AI training programs for radiologists
• Integrating the basics of machine learning and its applications into the radiology curriculum
• Covering a pedagogical path through which learners accumulate their knowledge
• Combining passive (offering content) and active (hands-on) elements
• Paying balanced attention to the basics of machine learning and clinical application in radiology
• Paying attention to the practical implementation in clinical settings, including the integration of AI with existing systems and infrastructure
• Covering the managerial, contractual, and financial aspects (e.g., how to define feasible business cases, how to estimate ROI and decide on the reimbursement model, and how to define value propositions)
• Considering ethical, legal, and organizational aspects of working with AI (e.g., privacy and security considerations in handling medical data)
• Being supported by academic institutes and scientific communities
• Being customized (e.g., for sub-specializations) and localized to the specific working conditions and legal environments
• Integrating practical and supporting components to deepen the learning (e.g., career coaching, mentoring, …)
Limitations and future research
Although we examined a diverse sample of training programs, our focus on LinkedIn (instead of Twitter) may have biased our sample towards Europe [10]. Future studies can also examine other offerings, such as podcasts and knowledge clips, which, although not officially framed as training programs, are often used by radiologists to learn about AI [11]. In addition, since many training programs are still emerging, future studies need to continuously examine new trends and update our findings. Next to radiologists, other professional groups such as radiographers, clinical physicists, and technical physicians are learning how to use AI to support various tasks in image acquisition and pre-processing. Knowing how they learn to work with AI and what skills they can develop is important for understanding how AI can be effectively used in radiology.
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.