Appendix 1
Working group 1—scanners, quality assurance and validation of scans
Specific statements and agreement levels.
Participant demographics: These statements were voted on by 26 SDiPath members, the composition of which consisted of 73% (19) pathologists | 15% (4) researchers | 4% (1) industry business development | 8% (2) lab staff/procurement/IT.
These participants state their place of work to be 61% (16) university institute | 23% (6) public hospital institute | 8% (2) private institute | 8% (2) industry.
1. Scope of the diagnostic workflow of Digital Pathology
1.1 Workflow determination
1.1.1 Know your scope: The institute should clearly define the scope of the targeted DP-workflow (i.e., intended use). For example: diagnostic biopsy workflow, special stain workflow, image analysis workflow, fluorescence workflow, organ-specific workflow
- 77% (20) Strongly Agree | 23% (6) Agree.
1.1.2 Required workflow analysis: The entire workflow needs to be inclusive of all workflow components and should consider current and future requirements for long-term flexibility (e.g., paperless transitions)
- 73% (19) Strongly Agree | 27% (7) Agree.
1.1.3 The entire workflow needs to be described in a well-documented standard operating procedure (SOP), including its components, the operation and handling of the scanning step, the scanning mode (or modes) for different settings as evaluated in the validation, and the internal quality control (e.g., scan quality, others)
- 62% (16) Strongly Agree | 27% (7) Agree | 8% (2) Neutral | 4% (1) Disagree.
1.1.4 SOPs should be written in line with on-site quality management systems and/or accreditation requirements and targets
- 58% (15) Strongly Agree | 38% (10) Agree | 4% (1) Neutral.
1.2 Participants and access authorization
1.2.1 Security settings of authorized access need to be determined for each component of the workflow, including physical and digital modalities
- 46% (12) Agree | 42% (11) Strongly Agree | 12% (3) Neutral.
1.2.2 Stakeholders (staff, pathologists, technicians, IT) involved in the validation/workflow process need to be informed about their tasks and receive associated training
- 81% (21) Strongly Agree | 19% (5) Agree.
1.3 Workflow adjustments
1.3.1 Prioritize workflows based on scope using estimated case load and turnaround time (e.g., all biopsies should be processed first by 11 a.m. to ensure same-day sign-out)
- 58% (15) Strongly Agree | 27% (7) Agree | 8% (2) Neutral | 8% (2) Disagree.
1.3.2 Define the turnaround time by recording the time of the individual work steps, especially any additional work steps
- 42% (11) Agree | 38% (10) Strongly Agree | 19% (5) Neutral.
1.3.3 The Laboratory Information System (LIS) should be adapted to work with digital pathology workflows in terms of clinical information, case management, and worklists of attributed cases with the goal of eliminating mismatching of slides from different patients
- 65% (17) Strongly Agree | 35% (9) Agree.
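To illustrate the kind of slide/case cross-check meant in 1.3.3, a minimal Python sketch follows; the barcode layout, identifiers, and worklist structure are illustrative assumptions, not a specific LIS API.

```python
# Hypothetical illustration of the slide/case cross-check in statement 1.3.3.
# The barcode layout "<case_id>-<slide_id>" and the worklist shape are assumptions.

def attach_scan_to_case(barcode: str, worklist: dict[str, set[str]]) -> str:
    """Attach a scanned slide to its LIS case only if the barcode matches."""
    case_id, _, slide_id = barcode.rpartition("-")
    if case_id not in worklist:
        raise ValueError(f"Case {case_id} not found in LIS worklist")
    if slide_id not in worklist[case_id]:
        raise ValueError(f"Slide {slide_id} not registered for case {case_id}")
    return f"Slide {slide_id} attached to case {case_id}"

# Example: the LIS expects slides 001 and 002 for (hypothetical) case A23-1045.
worklist = {"A23-1045": {"001", "002"}}
print(attach_scan_to_case("A23-1045-001", worklist))
```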
2. Scanner requirements
2.1 Scanner selection criteria
2.1.1 Evaluate different scanning systems in a pre-selection process, for example using a scoring model. Besides technical requirements, integration into the LIS is an important item to be considered
- 54% (14) Strongly Agree | 42% (11) Agree | 4% (1) Neutral.
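The scoring model mentioned in 2.1.1 can be as simple as a weighted sum over the evaluation criteria. A minimal sketch; the criteria, weights, and scores are illustrative assumptions:

```python
# Illustrative weighted scoring model for scanner pre-selection (statement 2.1.1).

weights = {"throughput": 0.3, "image_quality": 0.3, "LIS_integration": 0.25, "service": 0.15}

# Scores per candidate on a 1-5 scale, assigned by the evaluation team (assumed values).
candidates = {
    "Scanner A": {"throughput": 4, "image_quality": 5, "LIS_integration": 3, "service": 4},
    "Scanner B": {"throughput": 5, "image_quality": 4, "LIS_integration": 4, "service": 3},
}

for name, scores in candidates.items():
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{name}: {total:.2f}")
```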
2.2 Technical requirements to be considered
2.2.1 The scanner must meet the intended purpose/must reflect the conditions that are suitable for the intended purpose (including capacity (slides per unit of time), slide type, scanner slide magazine compatibility, and others)
- 65% (17) Strongly Agree | 35% (9) Agree.
2.2.2 For diagnostic purposes in pathology, the scanner must be CE-IVD certified
- 37% (11) Strongly Agree | 27% (8) Agree | 23% (7) Neutral | 10% (3) Disagree | 3% (1) Strongly Disagree.
2.2.3 Scanner maintenance should be taken into account: costs, effort, effect on workflow (time, frequency)
- 50% (13) Strongly Agree | 42% (11) Agree | 8% (2) Neutral.
3. Output formats (refer additionally to WG2)
3.1 Identify the ideal scanning profile settings associated with the specific workflow, resulting in consistently high picture quality
- 73% (19) Strongly Agree | 23% (6) Agree | 4% (1) Disagree.
3.2 Define file formats for storage, sharing, etc.
- 62% (16) Strongly Agree | 35% (9) Agree | 4% (1) Neutral.
3.3 Define picture size, format and archiving period
- 54% (14) Strongly Agree | 42% (11) Agree | 4% (1) Neutral.
4. Scanner validation study
4.1 Validation aim
4.1.1 Define the scope of the validation study (see WG1–1.1.1)
- 65% (17) Strongly Agree | 23% (6) Agree | 12% (3) Neutral.
4.1.2 Select tissue source, stains and techniques (e.g., HE, fluorescence, frozen)
- 65% (17) Strongly Agree | 27% (7) Agree | 8% (2) Neutral.
4.1.3 Define the acceptance criteria for diagnostic purpose
- 62% (16) Strongly Agree | 38% (10) Agree.
4.1.4 Define needed concordance level (e.g., same observer > 95%)
- 58% (15) Strongly Agree | 27% (7) Agree | 15% (4) Neutral.
4.1.5 Define severity for non-concordance (minor–major)
- 50% (13) Strongly Agree | 27% (7) Agree | 23% (6) Neutral.
4.1.6 Define deviation (e.g., any finding with clinical significance identified by one modality but not by the other (DP vs. microscopy))
- 54% (14) Strongly Agree | 31% (8) Agree | 15% (4) Neutral.
4.2 Validation protocol and sample size
4.2.1 Establish a validation protocol based on the validation criteria and test a sufficiently representative sample for each application
- 46% (12) Strongly Agree | 42% (11) Agree | 12% (3) Neutral.
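As a worked sketch of 4.2.1 in combination with the concordance target of 4.1.4: with an illustrative sample of 60 cases (the CAP validation guideline, for instance, suggests at least 60 routine cases) and one discordance, a Wilson confidence bound makes the sampling uncertainty explicit. The counts and thresholds below are assumptions to be set by the local protocol.

```python
import math

# Minimal sketch of a concordance check against the >95% target in 4.1.4.
# Case counts are illustrative; thresholds come from the local protocol.

def wilson_lower_bound(k: int, n: int, z: float = 1.96) -> float:
    """Lower bound of the 95% Wilson interval for a proportion k/n."""
    p = k / n
    center = p + z**2 / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - margin) / (1 + z**2 / n)

n_cases, n_concordant = 60, 59   # e.g., 60 validation cases, 1 minor discordance
print(f"Observed concordance: {n_concordant / n_cases:.1%}")                 # 98.3%
print(f"Lower 95% bound: {wilson_lower_bound(n_concordant, n_cases):.1%}")   # ~91.1%
# A lower bound below the predefined threshold signals that more cases are
# needed, or that the workflow fails the acceptance criterion.
```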
4.3.1 A report summarizing the validation aim, the results, and the final conclusions should point out for which clinical use the DP workflow has been validated and approved
- 62% (16) Strongly Agree | 38% (10) Agree.
4.3.2 The report or its amendments shall also state the technical requirements, scanner settings, and training that were evaluated and determined
- 62% (16) Strongly Agree | 35% (9) Agree | 4% (1) Neutral.
Appendix 2
Working group 2—integration of WSI-scanners and DP systems into the Pathology Laboratory Information System.
Specific statements and agreement levels.
Participant demographics: These statements were voted on by 25 SDiPath members, the composition of which consisted of 72% (18) pathologists | 16% (4) researchers | 12% (3) lab staff/procurement/IT.
These participants state their place of work to be 60% (15) university institute | 32% (8) public hospital institute | 4% (1) private institute | 4% (1) industry.
5. Visualization (monitors)
5.1 General considerations
5.1.2 Larger, high-resolution displays show more of the slide at 1:1 magnification (1 screen pixel = 1 image pixel). Lower-resolution displays require more panning of the image in order to cover the same physical area. The monitor should be validated by an expert and selected by the pathologist
- 68% (17) Strongly Agree | 20% (5) Agree | 12% (3) Neutral.
5.1.3 When selecting the monitor size, the working distance between the monitor and the pathologist must also be taken into account so that ergonomic working (according to the guidelines of the Caisse nationale suisse d’assurance en cas d’accidents/Schweizerische Unfallversicherungsanstalt/Istituto nazionale svizzero di assicurazione contro gli infortuni¹) is possible
- 68% (17) Strongly Agree | 20% (5) Agree | 12% (3) Neutral.
5.2 Recommendations
5.2.1 The monitor should have a color calibration option with a low deviation. Automatic self-calibration and adaptation to ambient light are recommended. Manual calibration should be performed at time intervals according to the manufacturer’s recommendations. Calibration steps should be documented
- 44% (11) Strongly Agree | 44% (11) Agree | 8% (2) Neutral | 4% (1) Disagree.
5.3 Brightness and contrast
The screen should have a minimum contrast ratio of 1000:1 and a brightness of at least 260 cd/m² (candela per square meter) in order to maintain high readability in brighter ambient lighting situations. The minimum brightness should be displayable at 0.5 cd/m² or greater [27].
- 44% (11) Strongly Agree | 36% (9) Agree | 20% (5) Neutral.
5.4 Color depth
5.4.1 The displayable color space should support 24-bit color (8-bit RGB) and 8-bit grayscale. Color depth: coverage of at least 98% of the Adobe RGB color space is most likely beneficial to display WSI colors accurately [28].
- 48% (12) Strongly Agree | 40% (10) Agree | 12% (3) Neutral.
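The thresholds from 5.3 and 5.4.1 lend themselves to a simple automated check during monitor procurement. A minimal sketch; the candidate values are illustrative assumptions:

```python
# Check a candidate monitor against the minimums in statements 5.3 and 5.4.1.

requirements = {
    "contrast_ratio": 1000,    # at least 1000:1
    "brightness_cd_m2": 260,   # at least 260 cd/m^2
    "adobe_rgb_pct": 98,       # at least 98% Adobe RGB coverage
    "color_depth_bit": 24,     # at least 24-bit color (8-bit RGB)
}

candidate = {"contrast_ratio": 1350, "brightness_cd_m2": 300,
             "adobe_rgb_pct": 99, "color_depth_bit": 30}   # assumed spec sheet

for key, minimum in requirements.items():
    status = "OK" if candidate[key] >= minimum else "FAIL"
    print(f"{key}: {candidate[key]} (min {minimum}) -> {status}")
```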
5.4.2 Navigation devices should allow a smooth and ergonomic slide navigation within the viewer software
- 76% (19) Strongly Agree | 20% (5) Agree | 4% (1) Neutral.
6. Integration of the WSI scanner into the pathology information system (Patho-LIS) for routine diagnostics
6.1 The scan workflow should be integrated into a Pathology Laboratory Information System (Patho-LIS), an image management system (IMS), and an image archive. To this end, it should ideally support standard communication formats such as HL7 and open image format standards (e.g., DICOM)
- 80% (20) Strongly Agree | 16% (4) Agree | 4% (1) Neutral.
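As one way to verify the open-format requirement of 6.1 at intake, a minimal sketch assuming the pydicom library is available; the UID shown is the DICOM SOP class for VL Whole Slide Microscopy Image Storage, and the file name is hypothetical:

```python
import pydicom  # assumed available

# DICOM SOP class UID for "VL Whole Slide Microscopy Image Storage".
WSI_SOP_CLASS_UID = "1.2.840.10008.5.1.4.1.1.77.1.6"

def is_dicom_wsi(path: str) -> bool:
    """Check that a scanner export is a DICOM whole slide image (metadata only)."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return str(ds.SOPClassUID) == WSI_SOP_CLASS_UID

print(is_dicom_wsi("slide_0001.dcm"))  # hypothetical file name
```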
6.2 All image data in open formats should optionally be sent to a storage system (e.g., picture archiving and communication system (PACS), vendor neutral archive (VNA)) and retrieved from there using an appropriate streaming mechanism
- 68% (17) Strongly Agree | 24% (6) Agree | 8% (2) Neutral.
6.3 A secondary test environment is recommended to test the respective parameterization of the digital workflow. This system has to function independently from the production system (used for diagnostics) and allow for testing new functionalities, software updates, or functional integrations
- 48% (12) Strongly Agree | 32% (8) Agree | 20% (5) Neutral.
6.4 Interfaces between WSI scanner and Patho-LIS or between Patho-LIS and IMS must be established. The application of the interface must be tested by at least one pathologist as part of the required validation study
- 72% (18) Strongly Agree | 28% (7) Agree.
6.5 In addition to the barcode on the glass slides, the alphanumeric code (i.e., sample ID and staining) of the slide should also be printed. In this way, the slide can be identified by comparing the recognized barcode with the alphanumeric label, and mismatches can be corrected manually, for example by comparison with the original slides
- 72% (18) Strongly Agree | 24% (6) Agree | 4% (1) Neutral.
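A minimal sketch of the identifier cross-check in 6.5; the identifier values are illustrative, and how the printed label is read (OCR or manual entry) is left open:

```python
# Cross-check of the decoded barcode against the printed alphanumeric label (6.5).

def labels_match(barcode_value: str, printed_label: str) -> bool:
    """Normalize both identifiers before comparison to avoid trivial mismatches."""
    def normalize(s: str) -> str:
        return s.strip().upper().replace(" ", "")
    return normalize(barcode_value) == normalize(printed_label)

if not labels_match("H2023-00451 HE", "h2023-00451 HE"):   # hypothetical IDs
    print("Mismatch: route slide to manual identification against the glass slide")
else:
    print("Identifiers match")
```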
6.6 Any communication protocols should be documented in order to facilitate troubleshooting of integration issues and to support reproducibility and future updates
- 56% (14) Strongly Agree | 44% (11) Agree.
6.7 The scanning process should be verified after hardware or software modifications (e.g., updates, upgrades) in addition to the controls required by national guidelines
- 72% (18) Strongly Agree | 16% (4) Agree | 12% (3) Neutral.
6.8 A quality control step should be in place before the scans are handed over to a medical expert. This can be automated or performed by hand
- 64% (16) Strongly Agree | 28% (7) Agree | 8% (2) Neutral.
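One common building block for an automated quality control step as in 6.8 is a focus measure such as the variance of the Laplacian. A minimal sketch assuming OpenCV is available; the threshold is a placeholder that each laboratory would have to calibrate on validated in-focus and out-of-focus scans:

```python
import cv2  # OpenCV, assumed available

# Illustrative automated sharpness gate for scanned tiles (statement 6.8).
BLUR_THRESHOLD = 100.0  # hypothetical; calibrate on validated in/out-of-focus tiles

def is_sharp_enough(tile_path: str) -> bool:
    """Variance of the Laplacian as a simple focus measure."""
    gray = cv2.imread(tile_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(tile_path)
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= BLUR_THRESHOLD
```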
6.9 Slides used outside of clinical routine diagnostics (e.g., research projects) must comply with the Federal Human Research Act (HRA)/Humanforschungsgesetz (HFG)/Loi relative à la recherche sur l’être humain (LRH)/Legge sulla ricerca umana (LRU) [29].
- 64% (16) Strongly Agree | 20% (5) Agree | 16% (4) Neutral.
7. Recommendations for IT interfaces, standards and workflow
7.1 The image viewer, comprising (a) a virtual microscope that displays the whole slide images (WSI) of scanned histological sections and (b) a macroscopic specimen image viewer, should be integrated into an image management system (IMS) that allows (bi)directional communication with the Pathology Laboratory Information System (Patho-LIS) and the digital archive
- 72% (18) Strongly Agree | 24% (6) Agree | 4% (1) Neutral.
7.2 If an IMS is used, it must retrieve or pull all necessary information from the Patho-LIS to identify and link the scanned slides to the corresponding LIS entries
- 64% (16) Strongly Agree | 24% (6) Agree | 12% (3) Neutral.
7.3 Virtual microscopes should support a comparison view between photographed (overview images) and scanned images to check WSI for completeness
- 72% (18) Strongly Agree | 24% (6) Agree | 4% (1) Neutral.
7.4 The network speed required for a smooth workflow depends on various parameters (e.g., number of scanners used simultaneously, distance of the scanners to the server) and must be adjusted according to the specific condition. As a general recommendation, the minimum network speed should be 1 Gbps for each individual scanner
- 72% (18) Strongly Agree | 16% (4) Neutral | 12% (3) Agree.
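A back-of-the-envelope calculation behind the 1 Gbps recommendation in 7.4; the file size and efficiency factor are illustrative assumptions:

```python
# Transfer-time estimate for one WSI at the minimum recommended bandwidth (7.4).

wsi_size_gb = 2.0    # typical compressed WSI size, assumed
link_gbps = 1.0      # minimum recommended per-scanner bandwidth
efficiency = 0.7     # assumed protocol and contention overhead

transfer_s = wsi_size_gb * 8 / (link_gbps * efficiency)
print(f"~{transfer_s:.0f} s per slide")  # ~23 s per 2 GB slide
```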
7.5 The scanner(s) should be connected to a high-performance storage solution (low latency, scale-out architecture, fast transfer speed) that can be integrated into the existing system landscape
- 68% (17) Strongly Agree | 24% (6) Agree | 8% (2) Neutral.
7.7 For the implementation of digital pathology in routine diagnostics, it is recommended to configure the system in such a way that redundant installations (e.g., at least two scanners rather than one) and/or an alternative workflow (e.g., maintaining the possibility to dispatch the glass slides) are defined in case of a hardware/software malfunction
- 76% (19) Strongly Agree | 24% (6) Agree.
8. Recommendations for archiving
8.1 To ensure a smooth workflow in DP and daily routine practice, it is recommended to store the WSI for at least 3 months on a high-performance server architecture
- 72% (18) Strongly Agree | 20% (5) Agree | 8% (2) Neutral.
8.2 As a currently viable approach, archiving of WSI, e.g., in a picture archiving and communication system (PACS)/vendor neutral archive (VNA), for 3 years can be envisaged as a cost–benefit compromise. Such a duration of digital archiving would cover the majority of situations in the routine diagnostic workflow where cases need to be compared with previous biopsies. Long-term storage of glass slides should remain unchanged
- 57% (17) Agree | 23% (7) Strongly Agree | 10% (3) Neutral | 10% (3) Disagree.
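The storage horizons in 8.1 and 8.2 translate directly into capacity estimates. A minimal sketch; slide volume and average file size are illustrative assumptions to be replaced by local figures:

```python
# Capacity estimates for the fast tier (8.1) and the 3-year archive (8.2).

slides_per_day = 300                     # assumed production on working days
working_days = {"3 months": 63, "3 years": 750}
avg_wsi_gb = 1.5                         # assumed average compressed WSI size

for horizon, days in working_days.items():
    tb = slides_per_day * days * avg_wsi_gb / 1000
    print(f"{horizon}: ~{tb:.0f} TB")
# 3 months: ~28 TB on the fast tier; 3 years: ~338 TB in the PACS/VNA
```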
8.3 The storage concept should take into account compression methods up to the end of the visualization chain, extensive error redundancy during storage, automatic progressive arrangement of the compressed data streams and the patent-free nature of the storage format
- 40% (10) Strongly Agree | 40% (10) Agree | 20% (5) Neutral.
8.4 The acquisition of systems that use industry standards for communication (e.g., DICOM, HL7, CDA, FHIR²) with third-party systems (e.g., the hospital information system) or whose systems meet the Integrating the Healthcare Enterprise (IHE) conformance criteria should be preferentially considered for both interoperability and longer-term sustainability (i.e., less likely to become outdated), especially in the environment of larger institutions (e.g., universities/public hospitals)
- 48% (12) Strongly Agree | 32% (8) Agree | 20% (5) Neutral.
Appendix 3
Working group 3—digital workflow—compliance with general quality guidelines.
Specific statements and agreement levels.
Participant demographics: These statements were voted on by 22 SDiPath members, the composition of which consisted of 73% (16) pathologists | 18% (4) researchers | 9% (2) lab staff/procurement/IT.
These participants state their place of work to be 59% (13) university institute | 27% (6) public hospital institute | 9% (2) private institute | 5% (1) industry.
9. Quality requirements
9.1 For laboratory staff and technicians
9.1.1 All technicians should be specifically trained for digital pathology requirements, e.g., avoiding specimen placement at the edge of slides and developing sensitivity to artifacts like scratches and folds
- 77% (17) Strongly Agree | 14% (3) Neutral | 9% (2) Agree.
9.1.2 Training with scanners and other high-tech equipment is recommended for advanced users
- 68% (15) Strongly Agree | 18% (4) Agree | 9% (2) Disagree | 5% (1) Neutral.
9.1.3 All workflow steps should be documented in SOPs of the quality management system
- 82% (18) Strongly Agree | 18% (4) Agree.
9.1.4 Consistent barcoding and readable information should be placed on vials, FFPE blocks, slides and reports
- 82% (18) Strongly Agree | 18% (4) Agree.
9.1.5 Compatibility with additional barcoding solutions for, e.g., special stainers should be ensured
- 68% (15) Strongly Agree | 27% (6) Agree | 5% (1) Neutral.
9.1.6 Ordered stains should quickly appear as placeholders in the DP system. Additional tools to highlight completed cases are recommended
- 59% (13) Strongly Agree | 41% (9) Agree.
9.1.7 The preparation process should be defined taking into consideration existing regular workflows, encompassing triage of stains (hematoxylin & eosin vs. special stains), prolonged drying times, and prioritization of highly urgent cases
- 64% (14) Strongly Agree | 32% (7) Agree | 5% (1) Neutral.
9.1.8 Processes should be defined to switch to regular microscopy, e.g., for incompatible microscopy techniques (polarization) or selection purposes (molecular pathology)
- 73% (16) Strongly Agree | 27% (6) Agree.
9.1.9 A process should be in place to allow for immediate retrieval of glass slides for rescanning or non-virtual microscopy
- 82% (18) Strongly Agree | 18% (4) Agree.
9.1.10 Emergency plans for severe system errors should be in place
- 86% (19) Strongly Agree | 14% (3) Agree.
9.1.11 Back-up systems for individual components in case of services/maintenance are recommended
- 82% (18) Strongly Agree | 18% (4) Agree.
9.1.12 A process for re-scanning should be in place. Counts of re-scans may serve as a performance test of the scanning process
- 68% (15) Strongly Agree | 32% (7) Agree.
9.1.13 Deviations and problems should be reported within a quality management or critical incident reporting system
- 73% (16) Strongly Agree | 27% (6) Agree.
9.2 For digital workflow pathologists
9.2.1 All pathologists should be specifically trained for the specific DP system, e.g., case management, ordering of re-scans, and measurements
- 68% (15) Strongly Agree | 23% (5) Agree | 5% (1) Strongly Disagree | 5% (1) Neutral.
9.2.2 General knowledge in DP and its potential pitfalls/limitations should be incorporated into the validation process (e.g., some cytological features, recognition of microbiota, and those provided by SDiPath, ESP, USCAP, and ASCP) and should be part of the basic training for the Swiss federal title of pathology, represented by the Institut Suisse pour la Formation Médicale (ISFM/SIWF)
- 59% (13) Strongly Agree | 27% (6) Agree | 14% (3) Neutral.
9.2.3 All workflow steps should be documented in SOPs of the quality management system
- 82% (18) Strongly Agree | 9% (2) Agree | 9% (2) Neutral.
9.2.4 Thumbnail images must be compared to scanned images to ensure complete tissue recognition
- 77% (17) Strongly Agree | 18% (4) Agree | 5% (1) Neutral.
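An automatable variant of the completeness check in 9.2.4 (see also 7.3) can compare the tissue area detected on the macro/thumbnail image with that covered by the scan overview. A minimal sketch assuming OpenCV and NumPy; Otsu thresholding as a tissue detector, identical framing of both images, and the tolerance value are simplifying assumptions:

```python
import cv2
import numpy as np

def tissue_fraction(image_path: str) -> float:
    """Fraction of pixels classified as tissue via Otsu thresholding."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    # Tissue is darker than the bright glass background on brightfield slides.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return float(np.count_nonzero(mask)) / mask.size

def scan_covers_thumbnail(thumb_path: str, scan_overview_path: str,
                          tolerance: float = 0.05) -> bool:
    """Flag scans whose tissue coverage falls clearly below the thumbnail's."""
    return tissue_fraction(scan_overview_path) >= tissue_fraction(thumb_path) - tolerance
```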
9.2.5 Additional support systems may be included, e.g., tracking systems for hovered areas, annotations for teaching and discussion, and time spent on details. Permanent storage of these data is not mandatory. The use of these data should be institutionally regulated and consented to by the employed pathologists
- 50% (11) Strongly Agree | 45% (10) Agree | 5% (1) Disagree.
9.2.6 Ordered stains should appear as placeholders in the IMS to indicate pending stains
- 77% (17) Strongly Agree | 18% (4) Agree | 5% (1) Neutral.
9.2.7 Processes should be defined to switch to regular microscopy, e.g., for incompatible microscopy techniques (polarization) or selection purposes (molecular pathology)
- 68% (15) Strongly Agree | 32% (7) Agree.
9.2.8 Emergency plans for severe system errors should be in place
- 77% (17) Strongly Agree | 23% (5) Agree.
9.2.9 Digital process for requesting re-scanning should be in place
- 64% (14) Strongly Agree | 32% (7) Agree | 5% (1) Neutral.
9.2.10 Deviations and problems should be reported within a quality management or critical incident reporting system
- 68% (15) Strongly Agree | 32% (7) Agree.
9.2.11 DP itself has evolved into a microscope equivalent in terms of good clinical practice. However, automated tools like scripts or algorithms are subject to full consideration of indication, validation, plausibility checks, and quality control
- 55% (12) Strongly Agree | 36% (8) Agree | 9% (2) Neutral.
9.3 IT support
9.3.1 IT personnel familiar with the complete system must be in place (in house or as service)
- 86% (19) Strongly Agree | 14% (3) Agree.
9.4 Tumor boards
9.4.1 Case presentation at the tumor board can be performed at lower resolutions and in a representative way. However, amendments of diagnosis should take place in the diagnostic workstation setting
- 45% (10) Strongly Agree | 45% (10) Agree | 5% (1) Disagree | 5% (1) Neutral.
9.5 Inter-institute tele-consulting
9.5.1 The sending institute asking for digital tele-consulting is responsible for slide selection, scanning, resolution, and representativeness. Approval of these steps should be declared in the order for consultation
- 64% (14) Strongly Agree | 36% (8) Agree.
9.5.2 The receiving pathologist should be advised to ensure the diagnosis is made in an appropriate digital setup
- 63% (19) Strongly Agree | 30% (9) Agree | 7% (2) Neutral.
9.5.3 The institute performing the tele-consulting should document in its sign-out report at minimum the number of electronic slides evaluated, viewing platform, and date of access
- 64% (14) Strongly Agree | 32% (7) Agree | 5% (1) Neutral.
9.5.4 Pathologists may want to retain digital copies of the region of interest used for consultation
- 59% (13) Strongly Agree | 36% (8) Agree | 5% (1) Neutral.
9.5.5 Receiving tele-consulting institutes in Switzerland are recommended to validate their workflows within regular accreditation processes
- 53% (16) Agree | 43% (13) Strongly Agree | 3% (1) Neutral.
9.5.6 To transparently outline the obligations of the asking institute, the sentence “this diagnosis was established on digitized whole slide images kindly provided by the Institute XXX” may be included
- 50% (11) Strongly Agree | 41% (9) Agree | 5% (1) Disagree | 5% (1) Neutral.
9.5.7 Comparable to physical slide sharing, the final diagnosis and legal liabilities are determined according to the SSPATH guidelines for consultation cases (for a link to the guidelines, see below³)
- 73% (16) Strongly Agree | 27% (6) Agree.
9.5.8 Transmission and access to healthcare information must be ensured according to data protection laws, e.g., HIN secure emails and local servers with controlled access. The responsibility remains with the providing institute until a national platform is created
- 86% (19) Strongly Agree | 14% (3) Agree.
9.6 Compliance with quality management systems
9.6.1 The validation test for the established end-to-end workflow (derived from WG1) should be documented within the quality management system and repeated after major equipment changes
- 47% (14) Agree | 47% (14) Strongly Agree | 7% (2) Neutral.
9.6.2 SOPs should include all major components of the workflow: equipment, software versions, scanner models, etc.
- 73% (16) Strongly Agree | 23% (5) Agree | 5% (1) Neutral.
9.6.3 A re-validation of the complete digital workflow must be performed if major components are replaced, in particular after a change of scanners, PACS system, Pathology-LIMS/IMS, software interfaces, etc.
- 55% (12) Strongly Agree | 36% (8) Agree | 5% (1) Disagree | 5% (1) Neutral.
9.6.4 Other minor changes due to the modularization of the workflow are handled according to the institutional QM guidelines. This includes new versions of software and equipment without major changes, which are integrated after a non-inferiority test and simple validation: higher monitor resolution, better graphics power of the workstation, faster server conditions, an updated scanner of the same series, etc.
- 45% (10) Agree | 41% (9) Strongly Agree | 9% (2) Neutral | 5% (1) Disagree.
9.6.5 Separate validations must be performed for specific physical measurements, such as lengths or areas as a replacement for high-power fields
- 45% (10) Strongly Agree | 41% (9) Agree | 14% (3) Neutral.
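A worked example of why 9.6.5 requires separate validation: converting pixel measurements to physical lengths or areas depends entirely on the calibrated scan resolution (microns per pixel, mpp). The mpp value below is illustrative; the validation must use the instrument's real, calibrated value.

```python
# Pixel-to-physical conversion underlying length/area measurements (9.6.5).

mpp = 0.25                     # microns per pixel at "40x" scanning, assumed

length_px = 4000
length_mm = length_px * mpp / 1000
print(f"Length: {length_mm:.2f} mm")    # 1.00 mm

area_px = 2_000_000
area_mm2 = area_px * (mpp / 1000) ** 2
print(f"Area: {area_mm2:.3f} mm^2")     # 0.125 mm^2
# Any change of scanner or scan profile changes mpp, hence the separate validation.
```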
9.7.1 DP workflows will increase patient safety and quality measures. The additional technical processes could be covered by higher reimbursement rates as negotiated by SSPATH
- 50% (11) Strongly Agree | 36% (8) Agree | 14% (3) Neutral.
9.7.2 Financial sustainability negotiated with reimbursement agencies should include the needed personnel and equipment under the new DP conditions
- 63% (19) Strongly Agree | 30% (9) Agree | 3% (1) Disagree | 3% (1) Neutral.
Appendix 4
Working group 4—image analysis (IA)/artificial intelligence (AI).
Specific statements and agreement levels.
Participant demographics: These statements were voted on by 24 SDiPath members, the composition of which consisted of 75% (18) pathologists | 17% (4) researchers | 8% (2) lab staff/procurement/IT.
These participants state their place of work to be 58% (14) university institute | 29% (7) public hospital institute | 8% (2) private institute | 4% (1) industry.
10. General considerations
10.1 For bioimage analyses and AI-assisted solutions intended for diagnostic use, Institutes of Pathology should use officially certified (e.g., IVD-CE-certified, FDA-approved) systems in the intended manner. Laboratory-developed systems may be used as long as the proper validation, quality control, and quality assurance requirements are fulfilled to deliver accurate, precise, and reproducible results
- 47% (14) Agree | 40% (12) Strongly Agree | 13% (4) Neutral.
10.2 The final diagnosis is determined by the pathologist who is fully legally responsible for it
- 83% (20) Strongly Agree | 12% (3) Agree | 4% (1) Neutral.
10.3 As the level of autonomy rises, interpretability and understanding of how the system comes to a conclusion become more critical
- 58% (14) Strongly Agree | 33% (8) Agree | 8% (2) Neutral.
10.4 Algorithms which may indicate germline or somatic mutation status must follow existing laws for molecular testing
- 71% (17) Strongly Agree | 21% (5) Agree | 4% (1) Neutral | 4% (1) Disagree.
10.5 All these systems must fulfill the Swiss regulatory requirements (e.g., Heilmittelgesetz, HMG; Medizinprodukteverordnung, MepV; Verordnung über in-vitro-Diagnostika, IvDV)
- 53% (16) Strongly Agree | 43% (13) Agree | 3% (1) Neutral.
10.6 AI results must be reported to and reviewed by a board-certified pathologist to align with the “integrative diagnosis” paradigm. Unreviewed reporting to third parties should be regarded as off-label use
- 53% (16) Strongly Agree | 40% (12) Agree | 7% (2) Neutral.
11. Implementation and validation of IA/AI solutions in diagnostic routine
11.1 Each Institute of Pathology has to validate the IA/AI solutions internally, even if they are using officially certified systems (e.g., IVD-CE certified, FDA-approved). The scope of the validation should be clearly stated (input/output)
- 62% (15) Strongly Agree | 29% (7) Agree | 8% (2) Neutral.
11.1.1 Validation should be appropriate for, and applicable to, the intended clinical use and clinical setting of the application in which the IA/AI algorithms will be employed. The category of algorithm should be described in the context of the clinical use (e.g., diagnostic, prognostic, predictive), although it is acknowledged that this will not always be unequivocally possible
- 60% (18) Agree | 27% (8) Strongly Agree | 13% (4) Neutral.
11.1.2 Validation of IA/AI systems should involve specimen preparation types relevant to the intended use (e.g., formalin-fixed paraffin-embedded tissue, frozen tissue, immunohistochemical stains, fluorescence, cytology slides, hematology blood smears)
- 71% (17) Strongly Agree | 29% (7) Agree.
11.1.3 The validation study of the IA/AI algorithm system should closely emulate and encompass the real-world clinical environment in which the technology will be used
- 71% (17) Strongly Agree | 25% (6) Agree | 4% (1) Neutral.
11.1.4 Documentation of metadata associated with WSI creation must take place, including software and firmware versions, with the understanding that small changes may have large impacts on IA/AI performance
- 62% (15) Strongly Agree | 29% (7) Agree | 8% (2) Neutral.
11.1.5 Diagnoses made using IA/AI should include a version number associated with the validation protocol (which contains the version numbers of all software/firmware) to enable future contextualization and potential reproduction of results
- 62% (15) Strongly Agree | 33% (8) Agree | 4% (1) Neutral.
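A minimal sketch of the version metadata that statements 11.1.4 and 11.1.5 ask to attach to every IA/AI result; all field and tool names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AIResultRecord:
    """Version metadata stored alongside each IA/AI result (11.1.4-11.1.5)."""
    case_id: str
    algorithm: str
    algorithm_version: str
    scanner_firmware: str
    viewer_software: str
    validation_protocol: str   # links the result to the validated configuration
    created_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = AIResultRecord(
    case_id="H2023-00451",           # hypothetical case
    algorithm="ki67_counter",        # hypothetical tool name
    algorithm_version="2.1.0",
    scanner_firmware="fw-5.4.1",
    viewer_software="viewer-3.2",
    validation_protocol="VAL-2023-07",
)
print(record)
```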
11.1.6 Revalidation is required whenever a significant change (e.g., changing of software, firmware updates) is made to any component of the WSI workflow
- 71% (17) Strongly Agree | 29% (7) Agree.
11.1.7 If there are known edge cases where an IA/AI solution may not perform well, they should be documented
- 62% (15) Strongly Agree | 38% (9) Agree.
11.2 The pathology report should contain a comment that an IA/AI tool was used for diagnosis and a comment regarding the regulatory state of the IA/AI solution, e.g., “The tool XY aiding in this diagnosis received an IVD-CE mark in 2022” or “The tool XY aiding in this diagnosis is not currently CE-regulated.”
- 53% (16) Strongly Agree | 40% (12) Agree | 7% (2) Neutral.
11.3 Furthermore, the model performance from the on-site validation study may be included
- 46% (11) Agree | 38% (9) Strongly Agree | 17% (4) Neutral.
11.4 All tissue that is present on a glass slide should be available and subject to computational analysis, i.e., one needs a verification step to ensure that all relevant tissue areas have been analyzed (whole tissue or relevant hot-spot areas).
- 53% (16) Strongly Agree | 40% (12) Agree | 3% (1) Disagree | 3% (1) Neutral.
11.5 A quality control step will be necessary to ensure that the images being analyzed are of suitable quality; for example, regions of blurriness will impact algorithm performance, and the user should be alerted to them. This should include carefully examining whether faint stains, pen marks, foreign objects, air bubbles during sealing, or damage to the cover slide affected the quality of the scanned digital images, and whether errors such as misalignment of strips or tiles occurred during image stitching.
- 67% (16) Strongly Agree | 29% (7) Agree | 4% (1) Neutral.
11.6 User requirements should be clearly delineated: What level of expertise is required to operate the software? What extent/duration of staff training will be required to effectively utilize the software?
- 58% (14) Strongly Agree | 42% (10) Agree.
11.7 Personnel/IT requirements should be clearly delineated: What additional resources (e.g., technologists, hardware, laboratory footprint, IT infrastructure, etc.) are required to support and ensure optimal function?
- 50% (12) Strongly Agree | 50% (12) Agree.
11.8 Region-of-interest (ROI) selection methodology should be clearly stated and described (if applicable) in the diagnostic report, including whether the analysis was performed on ROIs, hot spots, the WSI, or a pre-selected sample (e.g., a tissue microarray (TMA) spot), in order for it to be reliable and reproducible. Selection of ROIs or hot spots can be completely automated, completely manual, or a combination of both. Each approach is subject to different potential errors, and these approaches are likely to be disease- and organ-specific
- 40% (12) Agree | 40% (12) Strongly Agree | 17% (5) Neutral | 3% (1) Disagree.
11.9 The validation process should include a large enough set of slides to be fully representative of the intended application (e.g., H&E-stained sections of fixed tissue, frozen sections, cytology, hematology) and to reflect the anticipated spectrum and complexity of specimen types, presentation artifacts, and variabilities, along with diagnoses likely to be encountered during routine practice. Be aware of “rare” diseases, tissue alterations, and aberrant tissue, and how the system handles them (have they been part of the training cohort?)
- 67% (16) Strongly Agree | 25% (6) Agree | 8% (2) Neutral.
11.10 Clear descriptions must be provided of quality control measures and validation steps for every clinical assay where image analysis is used. This should include a careful description of algorithm validation
- 67% (16) Strongly Agree | 29% (7) Agree | 4% (1) Neutral.
11.11 Measures of reproducibility should be documented, such as pathologist-algorithm correlation and measures of inter-pathologist variability
- 54% (13) Strongly Agree | 38% (9) Agree | 8% (2) Neutral.
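For categorical outputs, pathologist-algorithm agreement as in 11.11 is often summarized with Cohen's kappa. A minimal sketch assuming scikit-learn is available; the grading labels are illustrative:

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative readings for a 3-tier grading task (assumed data).
pathologist = ["G1", "G2", "G2", "G3", "G1", "G2", "G3", "G3", "G1", "G2"]
algorithm   = ["G1", "G2", "G3", "G3", "G1", "G2", "G3", "G2", "G1", "G2"]

kappa = cohen_kappa_score(pathologist, algorithm)
print(f"Pathologist-algorithm agreement (Cohen's kappa): {kappa:.2f}")  # 0.70
```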
11.12 A validation study should establish diagnostic concordance between digital and glass slides for a single observer (i.e., intra-observer variability).
- 58% (14) Strongly Agree | 25% (6) Agree | 17% (4) Neutral.
11.13 Non-inferiority testing should be carried out between algorithm and pathologists typically assigned to that particular workflow, and/or against published statistics
- 50% (12) Strongly Agree | 33% (8) Agree | 17% (4) Neutral.
12. Desirable technical properties
12.1 Integration into the existing digital pathology workstation environment is recommended to avoid task-switching burden
- 67% (16) Strongly Agree | 25% (6) Agree | 8% (2) Neutral.
12.2 The performance of an IA/AI solution must keep up with the increasing number of cases. Analysis of WSIs right after scanning should be supported for time-intensive analyses. Algorithm selection should be automated based on available LIS information (e.g., tissue type, staining)
- 50% (12) Strongly Agree | 42% (10) Agree | 8% (2) Neutral.
12.3 Algorithms should highlight the regions on the digitized slides that were used to determine their output, to enable visual control by the pathologist
- 75% (18) Strongly Agree | 25% (6) Agree.
12.4 The ability to provide feedback, i.e., to mark regions or cases as examples of great successes/failures, is suggested. This will allow for both (a) improvement of algorithms and (b) testing of subsequent versions on real-world difficult/interesting cases
- 58% (14) Strongly Agree | 38% (9) Agree | 4% (1) Neutral.
12.5 Algorithms can be employed to prioritize cases within work lists or slides within cases (i.e., move those cases or slides that the AI algorithms flagged with positive findings to the top of the worklist so that they are reviewed first)
- 57% (17) Agree | 23% (7) Strongly Agree | 17% (5) Neutral | 3% (1) Disagree.
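A minimal sketch of the worklist prioritization described in 12.5; the case records are illustrative, and flagged cases are moved to the top while preserving arrival order within each group:

```python
# AI-assisted worklist prioritization (statement 12.5); data is illustrative.

cases = [
    ("C-101", False, "08:02"),
    ("C-102", True,  "08:15"),
    ("C-103", False, "08:20"),
    ("C-104", True,  "08:31"),
]

# Flagged cases first; within each group, keep first-in-first-out order.
worklist = sorted(cases, key=lambda c: (not c[1], c[2]))
for case_id, flagged, received in worklist:
    print(case_id, "FLAGGED" if flagged else "routine", received)
```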
12.6 An indication should be provided to clearly advise when the algorithm has yet to be run, or is still running and may still return additional results, to prevent premature sign-out of cases
- 50% (12) Strongly Agree | 42% (10) Agree | 8% (2) Neutral.
12.7 Documentation of where and how the results are stored should be part of the architecture design. Are they in a secure, automatically backed-up location? Are the results associated with the image itself or with the patient file? The results of the algorithm should be stored in a way that allows diagnostic decisions to be retraced, in accordance with the legal requirements (e.g., screenshots, images of the critical regions)
- 54% (13) Strongly Agree | 46% (11) Agree.
12.8 Documenting expected input/output formats is important to ensure they are in “standard” formats (e.g., DICOM, CSV, XLS) that will be easy to share/re-use over the long term, avoiding the need to use vendor-specific software to access results.
- 54% (13) Strongly Agree | 42% (10) Agree | 4% (1) Neutral.
13. Maintenance
13.1 A clear SOP should be in place for the management of hardware and software malfunctions
- 67% (16) Strongly Agree | 29% (7) Agree | 4% (1) Neutral.
13.2 A clear SOP should be in place for the management of updates, including a documentation of what the updates consist of, changes to the algorithm and requirements for re-validation
- 62% (15) Strongly Agree | 33% (8) Agree | 4% (1) Neutral.
13.3 The burden of update frequency should be weighed against the potential benefits and the cost of re-validation of the system. Awareness of the expected algorithm update frequency is important
- 54% (13) Strongly Agree | 33% (8) Agree | 12% (3) Neutral.