Swiss digital pathology recommendations: results from a Delphi process conducted by the Swiss Digital Pathology Consortium of the Swiss Society of Pathology

Integration of digital pathology (DP) into clinical diagnostic workflows is receiving increasing attention as new hardware and software become available. To facilitate the adoption of DP, the Swiss Digital Pathology Consortium (SDiPath) organized a Delphi process to produce a series of recommendations for DP integration within Swiss clinical environments. This process saw the creation of 4 working groups, focusing on the various components of a DP system: (1) scanners, quality assurance and validation of scans, (2) integration of Whole Slide Image (WSI)-scanners and DP systems into the Pathology Laboratory Information System, (3) digital workflow - compliance with general quality guidelines, and (4) image analysis (IA)/artificial intelligence (AI), with topic experts recruited to each for discussion and statement generation. The work product of the Delphi process is the 83 consensus statements presented here, forming the basis for the “SDiPath Recommendations for Digital Pathology”. They represent an up-to-date resource for national and international hospitals, researchers, device manufacturers, algorithm developers, and all supporting fields, with the intent of providing expectations and best practices to help ensure safe and efficient DP usage.


Introduction
Clinical pathology is in the process of undergoing a digital transformation, wherein routinely produced glass slides are no longer read in an "analog" manner using a microscope but are instead viewed in a "digital" manner on computer screens after digitization. This digital pathology (DP) paradigm offers a number of important advantages, many of which are now being realized in clinical routines [1]. Improvements in the streamlining of pathology practices and workflows, as well as quality-of-life enhancements for pathologists, have already been seen. For example, DP streamlines pathology practices by letting pathologists access and analyze slides remotely, eliminating the need to organize and physically transport glass slides [2]. DP further facilitates collaboration among pathologists, enabling them to easily share and discuss cases, which can lead to more accurate and timely diagnoses [3]. Moreover, DP is staged to reduce costs associated with slide storage and management [4], as digital images can be stored electronically and accessed when needed, potentially eliminating the need for long-term physical archives. These same repositories provide rapid retrieval of previous cases that may be of comparative interest.
Andrew Janowczyk and Inti Zlobec share co-first authorship in this work. Extended author information is available on the last page of the article.

Beyond these clinical improvements, substantial work is demonstrating that once these images are digitized, they can be employed by computational approaches geared towards predicting the diagnosis, prognosis, and therapy response of patients [5]. These DP tools and image-based biomarkers leverage the present-day confluence of 4 factors for their success: (a) relatively inexpensive computational power in the form of graphics processing units (GPUs), (b) inexpensive storage of the large file sizes typically associated with DP, which can often reach more than 2 GB per slide, (c) increased generation of whole slide images (WSI) via adoption of digital slide scanners in both research and clinical use cases, with some institutions routinely producing more than 2000 whole slide images per day, and (d) new algorithms, such as deep learning [6], whose success continues to be built upon the availability of the other factors.
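To make factors (b) and (c) concrete, the figures cited above translate into non-trivial infrastructure demands. A minimal back-of-the-envelope sketch in Python (the function name and the 1 TB = 1000 GB convention are ours, for illustration only):

```python
def daily_storage_tb(slides_per_day: int, gb_per_slide: float) -> float:
    """Rough daily storage demand in terabytes (using 1 TB = 1000 GB)."""
    return slides_per_day * gb_per_slide / 1000

# Figures cited in the text: ~2 GB per slide, ~2000 slides per day.
print(daily_storage_tb(2000, 2.0))        # 4.0 TB of new image data per day
print(daily_storage_tb(2000, 2.0) * 365)  # 1460.0 TB in a year of routine scanning
```

Even before compression and redundancy are considered, a terabyte-scale daily ingest is what makes the inexpensive storage of factor (b) a prerequisite rather than a convenience.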
In contrast, disadvantages associated with DP appear to be connected with its setup and instantiation, as opposed to long-term sustainment and usage. Challenges associated with initial cost [7], software and hardware integrations, refinement of lab practices [8], and additional training requirements potentially disrupting workflow and productivity are not uncommon during early stages of DP deployment. Secondary issues, associated with, e.g., slide scanning time, standardization of work product, and compliance with regulatory and legal issues, are likely connected to lack of experience and detailed planning, thus benefiting from and motivating the need for sharing of points for consideration and best practices.
In spite of any limitations, the opportunities afforded by going digital appear to be driving substantial investments by academic researchers, hospitals, and industry to put in place validated DP workflows for clinical usage [9]. As a result, groups of motivated experts have been formed both nationally and internationally to engage in knowledge sharing and best practices. For example, our Swiss Digital Pathology Consortium (SDiPath) was founded as a working group of the Swiss Society of Pathology (SSPath) in 2018 and now enlists over 170 members, evenly split between pathologists (defined here in the broadest sense, including board-certified pathologists, neuropathologists, dermatopathologists, residents, and trainees), computational pathology researchers, and technical experts who enable DP activities (i.e., histology technicians and information technology specialists).
A common theme emerging from the development of our own vision for a national DP infrastructure [10], and from surveys regarding DP usage and adoption [11,12], is the apparent need for national recommendations for the deployment and validation of DP pipelines, workflows, and algorithms. This is in line with efforts in other countries and organizations that have produced similar recommendation documents geared towards their specific needs and regulatory environments (e.g., Germany [13], Australia [14], USA [15], UK [16]). These efforts express the importance of producing consistent work product, documenting workflows, and estimating both human and technological costs, together serving the tenet that patient safety is of paramount importance. Notably, it is a requirement that the digital transformation of DP should not yield inferior performance, safety, or quality assurances as compared to its microscope-based analog counterpart, as reflected by respective CE, FDA, or IVD certifications.
Given the nascent nature of clinical DP instantiation, and the associated cross-domain skillset needed, a concerted effort to agglomerate different stakeholders' experiences and opinions is warranted. This is especially the case as digital workflows are often non-trivial to materialize and may further be burdensome to upgrade or rectify if unexpected issues arise [17,18]. There are often unforeseen challenges, for example, those associated with incorrect scope definition as discussed in our previous work entitled "Going digital: more than just a scanner!" [17]. While claims that "Digital Pathology: The time has come" [18] are emerging, there appears to remain potential hesitancy to engage in a digital transformation without clear guidelines of expectations and deliverables [11,12]. The recommendations presented here, similar to those produced in other countries, employed surveys and discussions with experts in their respective fields to curate experiences and thoughts. The ultimate goal is not only to provide best practices and suggestions to those at different stages of their digital transformation but also to take current, potentially ad hoc, approaches and solidify them into common practices to the benefit of pathologists, regulators, device and algorithm manufacturers, researchers, and, above all, our patients.

Methods
To build consensus on a set of DP recommendations from SDiPath members, a Delphi process was used. Briefly, this process consists of rounds wherein (1) participants vote on their level of agreement with provided statements, (2) discordant statements are reviewed, discussed, and revised, and (3) new statements are submitted for voting again until a consensus is reached.
To facilitate this process, four working groups were formed around major pillars associated with DP: (1) scanners, quality assurance and validation of scans, (2) integration of WSI-scanners and DP systems into the Pathology Laboratory Information System, (3) digital workflow - compliance with general quality guidelines, and (4) image analysis (IA)/artificial intelligence (AI). These working groups were led by experts in their respective areas, who were tasked with recruiting members with relevant expertise to their WGs as needed to generate a series of statements. On average, each working group consisted of approximately 10 people. Working groups were encouraged to review existing guidelines from other organizations, such as the Digital Pathology Association [19], CAP [15], Canadian [20], UK [16], German [13], Korean [21], and Australian [14] guidelines [22], and use them to critically reflect on their own statements.
Nomenclature was suggested to indicate the level of severity of proposed statements, with (a) "must" indicating an imperative, (b) "should" indicating a suggestion, and (c) "could" indicating something preferable but not required.
After the individual working groups formulated their statements, they were unified into a single document, which all working group members reviewed and provided feedback on. In total, 83 statements were created and voted upon at a WG level via Google Forms, with 1 form per WG, to allow participants to selectively engage with the WGs matching their expertise. Participants were asked to select between (i) strongly agree, (ii) agree, (iii) neutral, (iv) disagree, and (v) strongly disagree for each statement. The demographics of the expertise and background of the participants were recorded and are provided below. The survey was announced via various venues, including the SDiPath mailing list, in-person meetings, and direct departmental-level recruitment.
After a 1-month waiting period for feedback, between May 2022 and June 2022, 14 statements were identified as needing discussion and clarifying language at the WG level. These statements were returned to the working groups, wherein they underwent a supervised revision with the experts to modify the statements based on comments provided by the voting members. These were then again reviewed by all working groups for approval before being submitted to the members for a second round of Delphi voting via a single unified Google Form. This round was made available in November 2022 for 2 weeks, after which a review of the participant votes and feedback indicated convergence. Importantly, the voting members represented a diverse set of Swiss pathology stakeholders, hailing from all over the country. The pathologists involved in the production of these guidelines are affiliated with all 5 university hospitals, as well as 4 cantonal hospitals and 2 private institutions.
Statement responses are reported in descending percentage order. Consensus was determined as being reached if > 66% of all voters "Agreed" or "Strongly Agreed". All of the statements presented here reached that level of agreement, indicating full consensus.
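The consensus rule above reduces to a simple tally check. A minimal sketch, using a hypothetical vote distribution for illustration:

```python
from collections import Counter

def reached_consensus(votes, threshold=2/3):
    """True if the share of 'agree' + 'strongly agree' votes exceeds the threshold."""
    tally = Counter(v.lower() for v in votes)
    share = (tally["agree"] + tally["strongly agree"]) / len(votes)
    return share > threshold

# Hypothetical tally: 17 strongly agree, 5 agree, 3 neutral (25 voters).
votes = ["Strongly Agree"] * 17 + ["Agree"] * 5 + ["Neutral"] * 3
print(reached_consensus(votes))  # True: 22/25 = 88% > 66%
```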
Participation was entirely voluntary, and there was no financial compensation for study participation and no disadvantage related to non-participation.

Working group 1-scanners, quality assurance and validation of scans
This working group focuses on scanners, quality assurance, and the validation of scans in digital pathology.
These statements focus on the first part of the digital pathology pipeline: the selection, installation, and validation of whole slide image scanners. They discuss workflow creation and adjustment, documentation requirements, ideal scanner properties, and approaches for scanner validation, emphasizing the importance of clear workflow definition, scanner evaluation, and thorough validation processes. The recommendations are summarized as follows, with individual statements and agreement levels provided in Appendix 1.

Scanner requirements
• Evaluate different scanning systems, considering technical requirements and integration into the LIS. See abridged example in Fig. 1.
• Ensure that scanners meet the intended purpose, including capacity, slide compatibility, and CE-IVD certification for diagnostic purposes.
• Consider scanner maintenance costs and their impact on workflow.

Output formats
• Identify ideal scanning profile settings for consistently high picture quality.
• Define file formats for storage and sharing, with a preference for open, non-proprietary formats.
• Specify image size, format, and archiving periods.

Scanner validation study
• Define the scope of the validation study, including tissue sources, stains, and acceptance criteria for diagnostic purposes.
• Establish concordance levels and define severity for non-concordance.
• Create a validation protocol and test a representative sample for each application (e.g., see Fig. 2).
• Generate a report summarizing the validation aim, results, conclusions, technical requirements, scanner settings, and training evaluations.

Working group 2-integration of WSI-scanners and DP systems into the Pathology Laboratory Information System
Assuming a validated scanner is in place, these recommendations provide the framework for effectively integrating WSI scanners and DP systems into a Pathology Laboratory Information System, ensuring optimal visualization, data management, and workflow efficiency. The recommendations are summarized as follows, with individual statements and agreement levels provided in Appendix 2.

Visualization (monitors)
• Larger, high-resolution displays are preferred for better image quality.

Integration of WSI scanner into Pathology Laboratory Information System (Patho-LIS) for routine diagnostics
• The scan workflow should be integrated into the Patho-LIS, image management system (IMS), and an image archive, supporting standard communication formats.
• Image data in open formats should be stored in a storage system for retrieval using appropriate streaming mechanisms.

Working group 3-digital workflow-compliance with general quality guidelines
These statements are geared towards achieving conformity with current accreditation norms and traceable quality parameters that can be documented within the quality management systems (QMS) of each institute. The legal framework in which DP enters the stage consists of many facets, from adopted European regulations like the in vitro diagnostic regulation (IVDR) and the general data protection regulation (GDPR), to national legislation like the human research act (HRA) and human research law (HRL), and the medical/pathological guidelines of the SSPath. Swiss laboratories regularly undergo accreditation [23] according to ISO 15189 [24] and ISO 17025 [25]. Relevant elements in terms of quality documentation comprise organizational, procedural, technical, and personnel aspects. For accreditation, DP is regarded as equivalent to conventional histology and thus tends to benefit from conventional quality control improvements (e.g., usage of barcoding).
In the future, even more improvements via additional quality measurements can be expected with the deployment of DP. For instance, histological sectioning for DP needs more attention by technical personnel, in terms of correct thickness and avoidance of folds, scratches, or peripheral placement of the tissue. Contribution to round-robin tests (Quality in Pathology (Germany), European Society of Pathology (ESP), NordiQC) can be fulfilled via slide upload and in-depth calibration measurements. In consequence, DP appears situated to facilitate the creation of new, more precise standards. The recommendations are summarized as follows, with the individual statements and agreement levels presented in Appendix 3.

Quality requirements for laboratory staff and technicians
• Technicians should receive specific training for digital pathology, including avoiding specimen placement at slide edges and recognizing artifacts.
• Training with scanners and high-tech equipment is recommended for advanced users.
• All workflow steps should be documented in the quality management system's SOPs.
• Consistent barcoding and readable information should be placed on vials, FFPE blocks, slides, and reports.
• Compatibility with additional barcoding solutions for special stainers should be ensured.
• Ordered stainings should quickly appear as placeholders in the digital pathology system.
• The preparation process should be defined to encompass triage of stainings and prioritize highly urgent cases.
• Processes should be defined to switch to regular microscopy for non-digitally compatible microscopy techniques or selection purposes.
• A process should be in place to allow for immediate retrieval of glass slides for rescanning or non-virtual microscopy.
• Emergency plans for severe system errors should be in place.
• Back-up systems for individual components in case of service or maintenance are recommended.
• A process for re-scanning should be in place, and counts of re-scans may serve as a performance test of the scanning process.
• Deviations and problems should be reported within a quality management, or critical incident reporting, system.

Additional quality requirements for digital workflow pathologists
• All pathologists should receive specific training for the digital pathology system, including case management, ordering re-scans, and measurements.
• General knowledge in digital pathology and its potential pitfalls/limitations should be incorporated into the validation process and basic training for the Swiss federal title of pathology.
• Thumbnail images must be compared to scanned images to ensure complete tissue recognition.
• Additional support systems may be included, e.g., tracking systems for hovered areas, annotations for teaching and discussion, and time spent on details. The use of these data should be institutionally regulated and consented to by the employed pathologists.
• A digital process for requesting re-scanning should be in place.
• Automated tools like scripts or algorithms should be used cautiously and follow indications, validation, plausibility checks, and quality control.

IT support
• IT personnel familiar with the complete system must be in place (in-house or as a service).

Tumor boards
• Case presentation at tumor boards can be performed at lower resolutions and in a representative way, but amendments to diagnosis should take place in the diagnostic workstation setting.

Inter-institute tele-consulting
• The sending institute requesting digital tele-consulting is responsible for slide selection, scanning, resolution, and representativeness, with approval declared in the consulting order.
• The receiving pathologist should ensure the diagnosis is made in an appropriate digital setup.
• The institute performing the tele-consulting should document in its sign-out report the number of electronic slides evaluated, the viewing platform, and the date of access.
• Pathologists may retain digital copies of regions of interest used for consultation.
• Receiving tele-consulting institutes in Switzerland are recommended to validate their workflows within regular accreditation processes.
• To outline the obligations of the asking institute, the sentence "this diagnosis was established on digitized whole slide images kindly provided by the Institute XXX" may be included.
• The final diagnosis and legal liabilities are determined according to the SSPath guidelines for consultation cases.

Compliance with quality management systems
• The validation test for the established end-to-end workflow should be documented within the quality management system and repeated after major equipment changes.
• SOPs should include all major components of the workflow.
• A re-validation of the complete digital workflow must be performed if major components are replaced.
• Other minor changes due to the modularization of the workflow are handled according to institutional QM guidelines.
• Separate validations must be performed for specific physical measurements.
• DP workflows are expected to increase patient safety and quality measurements, which could be covered with higher reimbursement rates.
Working group 4-image analysis (IA)/artificial intelligence (AI)
To scope the statements, it was discussed that the current state of technology was not sufficiently high to justify consideration of type 5 algorithms, with type 4 only being considered in niche roles. As such, in the statements below, AI solutions were thought to be aimed at automating repetitive and time-consuming tasks (e.g., mitosis counting, immunohistochemical (IHC) marker evaluation) and providing a decision support system to the pathologists (e.g., for ambiguous or rare cases). It was noted that the field is very rapidly evolving, and as such, attention should be paid to determine when revision of these statements in light of new inventions, experience, and wisdom is required. The recommendations are summarized as follows, with the individual statements and agreement levels presented in Appendix 4.

General considerations
• For bioimage analyses and AI-assisted solutions intended for diagnostic use, institutes of pathology should use officially certified systems (e.g., IVD-CE certified, FDA-approved) or laboratory-developed systems that meet validation and quality control requirements.
• The final diagnosis is the responsibility of the pathologist.
• As the level of autonomy in AI systems rises, the interpretability of results becomes more critical.
• Algorithms indicating germline or somatic mutation status must comply with existing laws for molecular testing.
• All systems must fulfill Swiss regulatory requirements.
• AI results must be reported to and reviewed by a board-certified pathologist, following the "integrative diagnosis" paradigm [26].

Desirable technical properties
• Integration into the existing digital pathology workstation environment is recommended.
• The IA/AI system's performance must scale with the increasing number of cases.
• Algorithms should highlight regions on digitized slides used to determine their output.
• The ability to provide feedback and prioritize cases or slides is suggested.
• IA/AI can be employed to prioritize cases within work lists or slides.
• Indications should be provided regarding the status of running algorithms.
• Results should be stored in a secure and retrievable manner, in accordance with legal requirements.
• Expected input/output formats should be documented to ensure long-term usability without vendor-specific software.

Maintenance
• A clear SOP should be in place for the management of hardware and software malfunctions.
• A clear SOP should be in place for the management of updates, including documentation and re-validation requirements.
• The burden of update frequency should be weighed against potential benefits and re-validation costs, with awareness of the expected algorithm update frequency.

Conclusion
Using a Delphi process, the members of the Swiss Digital Pathology Consortium reached consensus on practical recommendations for the implementation and validation of digital pathology in clinical workflows. These recommendations focused on its safe usage, with attempts at maximizing patient safety and benefit while minimizing overhead. As a result, we put forward these statements as best practices to be considered when adopting DP within Switzerland, while also providing another resource for our international colleagues. We are happy to report significant concordance between existing national recommendations and our own, likely due to the converging nature of what appear to be emerging best practices for DP. These recommendations integrate and update previous guidelines, providing a dedicated section on the implementation of AI and IA. This fills a niche absent from other recommendations, likely due to the nascent nature of the AI/IA field during their creation. Of particular note, the working groups appreciated how rapidly the field is maturing and realized that, unlike other more established technologies, these DP recommendations will likely need to undergo revisions as the technology and the associated implications of this paradigm shift become clearer.

Appendix 1
Working group 1-scanners, quality assurance and validation of scans. Specific statements and agreement levels.

Appendix 2
Working group 2-integration of WSI-scanners and DP systems into the Pathology Laboratory Information System. Specific statements and agreement levels.

Visualization (monitors)
5.1 General considerations

5.1.2 Larger, high-resolution displays show more of the slide at 1:1 magnification (1 screen pixel = 1 image pixel). Lower-resolution displays require more panning of the image in order to cover the same physical area. The monitor should be validated by an expert and selected by the pathologist - 68% (17) Strongly Agree | 20% (5) Agree | 12% (3) Neutral.
5.1.3 When selecting the monitor size, the working distance between the monitor and the pathologist must also be taken into account so that ergonomic working (according to the guidelines of the Caisse nationale suisse d'assurance en cas d'accidents / Schweizerische Unfallversicherungsanstalt / Istituto nazionale svizzero di assicurazione contro gli infortuni) is possible - 68% (17) Strongly Agree | 20% (5) Agree | 12% (3) Neutral.

5.2.1 The monitor should have a color calibration option with a low deviation. Automatic self-calibration and adaptation to ambient light are recommended. Manual calibration should be performed at time intervals according to the manufacturer's recommendations. Calibration steps should be documented - 44% (11) Strongly Agree | 44% (11) Agree | 8% (2) Neutral | 4% (1) Disagree.

Brightness and contrast
The screen should have a minimum contrast ratio of 1000:1 and a brightness of at least 260 cd/m² (candela per square meter) in order to maintain high readability in brighter ambient lighting situations. The minimum brightness should be displayable at 0.5 cd/m² or greater [27].
5.4 Color depth

5.4.1 The displayable color space should support 24-bit color (8-bit RGB) and 8-bit grayscale. Color depth: coverage of at least 98% of the Adobe RGB color space is most likely beneficial to display WSI colors accurately [28] - 48%

6.2 All image data in open formats should optionally be sent to a storage system (e.g., picture archiving and communication system (PACS), vendor neutral archive (VNA)) and retrieved from there using an appropriate streaming mechanism - 68% (17) Strongly Agree | 24% (6) Agree | 8% (2) Neutral.
6.3 A secondary test environment is recommended to test the respective parameterization of the digital workflow. This system has to function independently from the production system (used for diagnostics) and allows for testing new functionalities, software updates, or functional integrations - 48% (12) Strongly Agree | 32% (8) Agree | 20% (5) Neutral.
6.4 Interfaces between the WSI scanner and Patho-LIS, or between Patho-LIS and IMS, must be established. The application of the interface must be tested by at least one pathologist as part of the required validation study - 72% (18) Strongly Agree | 28% (7) Agree.

6.5 In addition to the barcode on the glass slides, the alphanumeric code (i.e., sample id and staining) of the slide should also be printed. In this way, the slide can be identified by comparing the recognized barcode with the alphanumeric label and manually corrected, for example, by comparing it with the original slides - 72% (18) Strongly Agree | 24% (6) Agree | 4% (1) Neutral.
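The cross-check described in 6.5 can be sketched as a small function. The label format and the normalisation rules (case, whitespace) below are hypothetical; real slide-label layouts vary by laboratory:

```python
def label_mismatch(barcode_value: str, printed_label: str) -> bool:
    """Flag slides whose decoded barcode disagrees with the printed
    alphanumeric label (sample id + staining), after normalising
    case and whitespace. Normalisation rules are illustrative only."""
    def _norm(s: str) -> str:
        return "".join(s.split()).upper()
    return _norm(barcode_value) != _norm(printed_label)

print(label_mismatch("H2023-1234 HE", "h2023-1234  HE"))  # False: same slide
print(label_mismatch("H2023-1234 HE", "H2023-1243 HE"))   # True: flag for manual review
```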

Recommendations for IT interfaces, standards and workflow
7.1 The image viewer, comprising (a) a virtual microscope which displays the whole slide images (WSI) of scanned histological sections and (b) a macroscopic specimen image viewer, should be integrated into an image management system (IMS) that allows (bi)directional communication with the Pathology Laboratory Information System (Patho-LIS) and the digital archive - 72% (18) Strongly Agree | 24% (6) Agree | 4% (1) Neutral.
7.2 If an IMS is used, it must retrieve or pull all necessary information from the Patho-LIS to identify and link the scanned slides to the corresponding LIS entries -64% (16) Strongly Agree | 24% (6) Agree | 12% (3) Neutral.
7.4 The network speed required for a smooth workflow depends on various parameters (e.g., number of scanners used simultaneously, distance of the scanners to the server) and must be adjusted according to the specific conditions. As a general recommendation, the minimum network speed should be 1 Gbps for each individual scanner - 72% (18) Strongly Agree | 16% (4) Neutral | 12% (3) Agree.
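The 1 Gbps figure in 7.4 can be sanity-checked with simple arithmetic. Assuming a roughly 2 GB WSI (as cited in the introduction), an idealised transfer takes about 16 s; the `efficiency` parameter below is our illustrative way to model protocol overhead:

```python
def transfer_seconds(file_gb: float, link_gbps: float, efficiency: float = 1.0) -> float:
    """Idealised transfer time for one WSI: gigabytes -> gigabits,
    divided by the usable link rate. `efficiency` crudely models
    protocol overhead (e.g., 0.7 for ~70% goodput)."""
    return file_gb * 8 / (link_gbps * efficiency)

print(transfer_seconds(2.0, 1.0))                  # 16.0 s per 2 GB slide at 1 Gbps
print(round(transfer_seconds(2.0, 1.0, 0.7), 1))   # 22.9 s with 30% overhead
```

With several scanners writing concurrently, these per-slide times add up, which is why the recommendation is per scanner rather than per site.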
7.5 The scanner(s) should be connected to a high-performance storage solution (low latency, scale-out architecture, fast transfer speed) that can be integrated into the existing system landscape -68% (17) Strongly Agree | 24% (6) Agree | 8% (2) Neutral.
7.7 For the implementation of digital pathology in routine diagnostics, it is recommended to configure the system in such a way that redundant installations (e.g., not only one but at least 2 scanners) and/or an alternative workflow are defined (e.g., maintaining the possibility to dispatch the slides) in case of a hardware/software malfunction - 76% (19) Strongly Agree | 24% (6) Agree.

Recommendations for archiving
8.1 To ensure a smooth workflow in DP and daily routine practice, it is recommended to store the WSI for at least 3 months on a high-performance server architecture -72% (18) Strongly Agree | 20% (5) Agree | 8% (2) Neutral.
8.2 As a currently viable approach, archiving of WSI, e.g., in a picture archiving and communication system (PACS)/vendor neutral archive (VNA), for 3 years can be envisaged as a cost-benefit compromise. Such a duration of digital archiving would cover the majority of situations in the routine diagnostic workflow where cases need to be compared with previous biopsies. Long-term storage of glass slides should remain unchanged - 57% (17) Agree | 23% (7) Strongly Agree | 10% (3) Neutral | 10% (3) Disagree.
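The tiering implied by 8.1 and 8.2 can be sketched as a small policy function. The exact thresholds used here (90 days and 3 × 365 days) are our illustrative simplifications of the recommended durations, not values fixed by the statements:

```python
from datetime import date

# Hypothetical thresholds mirroring 8.1 (~3 months on fast storage)
# and 8.2 (~3 years in a PACS/VNA archive).
FAST_TIER_DAYS = 90
ARCHIVE_DAYS = 3 * 365

def storage_tier(scan_date: date, today: date) -> str:
    """Where a WSI should currently live; glass slide retention is handled separately."""
    age_days = (today - scan_date).days
    if age_days <= FAST_TIER_DAYS:
        return "high-performance"
    if age_days <= ARCHIVE_DAYS:
        return "archive"
    return "expired"

print(storage_tier(date(2024, 1, 10), date(2024, 2, 1)))  # high-performance
print(storage_tier(date(2023, 1, 10), date(2024, 2, 1)))  # archive
```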
8.3 The storage concept should take into account compression methods up to the end of the visualization chain, extensive error redundancy during storage, automatic progressive arrangement of the compressed data streams and the patent-free nature of the storage format -40% (10) Strongly Agree | 40% (10) Agree | 20% (5) Neutral.
8.4 The acquisition of systems that use industry standards for communication (e.g., DICOM, HL7, CDA, FHIR) with third-party systems (e.g., the hospital information system), or whose systems meet the Integrating the Healthcare Enterprise (IHE) conformance criteria, should be preferentially considered for both interoperability and longer-term sustainability (i.e., less likely to become outdated), especially in the environment of larger institutions (e.g., universities/public hospitals) - 48% (12) Strongly Agree | 32% (8) Agree | 20% (5) Neutral.

Appendix 3
Working group 3-digital workflow-compliance with general quality guidelines.
Specific statements and agreement levels.
Participant demographics: these statements were voted on by 22 SDiPath members, the composition of which consisted of 73% (16) pathologists | 18% (4) researchers | 9% (2) lab staff/procurement/IT. These participants state their place of work to be 59% (13)

11.4 All tissue that is present on a glass slide should be available and subject to computational analysis, i.e., one needs a verification step to ensure that all relevant tissue areas have been analyzed (whole tissue or relevant hot-spot areas).
11.5 A quality control step will be necessary to ensure that the images being analyzed are of suitable quality, for example, regions of blurriness will impact algorithm performance and thus should be alerted to the user.This should include carefully examining whether faint stains, pen marks, foreign objects, air bubbles during sealing, or damage to the cover slide affected the quality of scanned digital images and whether errors such as misalignment of strips or tiles when image stitching has occurred.
[…] -67% […]. The ROI or hot-spot selection methodology should be stated in the diagnostic report, together with whether the analysis was performed on an ROI, on hot spots, on the WSI, or on a pre-selected sample (e.g., a tissue microarray (TMA) spot), in order for it to be reliable and reproducible. Selection of ROIs or hot spots can be completely automated, completely manual, or a combination of both; these approaches are subject to different potential errors and are likely to be disease- and organ-specific -40% (12) Agree | 40% (12) Strongly Agree | 17% (5) Neutral | 3% (1) Disagree.
11.9 The validation process should include a large enough set of slides to be fully representative of the intended application (e.g., H&E-stained sections of fixed tissue, frozen sections, cytology, hematology), reflecting the anticipated spectrum and complexity of specimen types, presentation artifacts, and variabilities, along with diagnoses likely to be encountered during routine practice. Be aware of "rare" diseases, tissue alterations, and aberrant tissue, and of how the system handles them (have they been part of the training cohort?) -67% (16) Strongly Agree | 25% (6) Agree | 8% (2) Neutral.
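As an illustrative sketch of the representativeness requirement in statement 11.9, a validation set can be drawn by stratified sampling over specimen types so that each anticipated category is present. The archive contents and per-type counts below are invented for illustration only:

```python
# Sketch of statement 11.9: assemble a validation set stratified over specimen
# types so the anticipated spectrum is represented. Counts are illustrative.
import random

random.seed(0)  # reproducible draw for this example
archive = [{"id": i, "type": t}
           for i, t in enumerate(["H&E"] * 80 + ["frozen"] * 15 + ["cytology"] * 5)]

def stratified_sample(cases, per_type):
    """Draw up to per_type[t] cases of each specimen type."""
    sample = []
    for t, n in per_type.items():
        pool = [c for c in cases if c["type"] == t]
        sample.extend(random.sample(pool, min(n, len(pool))))
    return sample

val_set = stratified_sample(archive, {"H&E": 10, "frozen": 5, "cytology": 5})
print(len(val_set))  # → 20
```

Rare entities that are absent from the archive cannot appear in such a sample, which is exactly why the statement asks whether they were part of the training cohort.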
11.10 Clear descriptions must be provided of the quality control measures and validation steps for every clinical assay where image analysis is used. This should include a careful description of algorithm validation -67% (16) Strongly Agree | 29% (7) Agree | 4% (1) Neutral.
11.12 A validation study should establish diagnostic concordance between digital and glass slides for a single observer (i.e., intra-observer variability).
12.2 The performance of an IA/AI solution must keep up with the increasing number of cases. Analysis of WSIs directly after scanning should be supported for time-intensive analyses. Algorithm selection should be automated based on available LIS information (e.g., tissue type, staining) -50% (12) Strongly Agree | 42% (10) Agree | 8% (2) Neutral.
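As an illustrative sketch of the automated algorithm selection in statement 12.2, the LIS metadata can drive a simple dispatch table keyed on tissue type and stain. The registry entries and algorithm names below are hypothetical:

```python
# Sketch of statement 12.2: dispatch an analysis algorithm based on
# (tissue type, stain) metadata from the LIS. Algorithm names are hypothetical.
ALGORITHM_REGISTRY = {
    ("breast", "HER2"): "her2_scoring_v2",
    ("breast", "Ki-67"): "ki67_counting_v1",
    ("prostate", "H&E"): "gleason_grading_v3",
}

def select_algorithm(lis_record):
    """Pick an analysis algorithm from LIS metadata, or None if no match
    (in which case the slide falls back to manual review)."""
    key = (lis_record.get("tissue"), lis_record.get("stain"))
    return ALGORITHM_REGISTRY.get(key)

print(select_algorithm({"tissue": "breast", "stain": "HER2"}))  # her2_scoring_v2
print(select_algorithm({"tissue": "lung", "stain": "H&E"}))     # None
```

Returning `None` rather than a default algorithm keeps unmatched slides out of automated pipelines, consistent with the validation requirements elsewhere in these statements.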
12.3 Algorithms should highlight the regions on the digitized slides that were used to determine their output, to enable visual control by the pathologist -75% (18) Strongly Agree | 25% (6) Agree.
12.4 The ability to provide feedback, i.e., to mark regions or cases as examples of notable successes/failures, is suggested. This will allow for both (a) improvement of algorithms and (b) testing of subsequent versions on real-world difficult/interesting cases -58% (14) Strongly Agree | 38% (9) Agree | 4% (1) Neutral.
12.5 Algorithms can be employed to prioritize cases within work lists or slides within cases (i.e., move those cases or slides that the AI algorithms flagged with positive findings to the top of the worklist so that they are reviewed first) -57% (17) Agree | 23% (7) Strongly Agree | 17% (5) Neutral | 3% (1) Disagree.
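As an illustrative sketch of the worklist prioritization in statement 12.5, a stable sort on the AI flag moves flagged cases to the top while preserving the original accession order within each group. The case records below are invented:

```python
# Sketch of statement 12.5: stable sort so AI-flagged cases rise to the top
# while the original accession order is preserved within each group.
cases = [
    {"id": "C1", "ai_flag": False},
    {"id": "C2", "ai_flag": True},
    {"id": "C3", "ai_flag": False},
    {"id": "C4", "ai_flag": True},
]

# Python's sorted() is stable; key False sorts before True, so flagged cases
# (not ai_flag == False) come first.
prioritized = sorted(cases, key=lambda c: not c["ai_flag"])
print([c["id"] for c in prioritized])  # → ['C2', 'C4', 'C1', 'C3']
```

Using a stable sort matters here: within the flagged and unflagged groups, accession order (and thus turnaround-time fairness) is retained.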
12.6 An indication should be provided to clearly advise when the algorithm has yet to be run, or is still running, and may still return additional results, to prevent premature sign-out of cases -50% (12) Strongly Agree | 42% (10) Agree | 8% (2) Neutral.
12.7 Documentation of where and how the results are stored should be part of the architecture design. Are they in a secured, automatically backed-up location? Are the results associated with the image itself or with the patient file? The results of the algorithm should be stored in a way that allows diagnostic decisions to be retraced, and in accordance with the legal requirements (e.g., screenshots, images of the critical regions) -54% (13) Strongly Agree | 46% (11) Agree.
12.8 Documenting expected input/output formats is important to ensure they are in "standard" formats (e.g., DICOM, CSV, XLS) that will be easy to share/re-use over the long term, avoiding the need to use vendor-specific software to access results.
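As an illustrative sketch of statements 12.7/12.8, algorithm results can be persisted in a plain, vendor-neutral format (here CSV via the standard library) alongside the identifiers needed to retrace a diagnostic decision. The field names and the result record below are hypothetical:

```python
# Sketch of statements 12.7/12.8: store algorithm results in a vendor-neutral
# format (CSV) with the identifiers needed to retrace a diagnostic decision.
# Field names and values are illustrative placeholders.
import csv
import io

results = [
    {"case_id": "S-2024-001", "slide_id": "A1",
     "algorithm": "ki67_counting_v1", "version": "1.3.0",
     "output": "Ki-67 index 22%", "roi": "hot_spot_3"},
]

buffer = io.StringIO()  # in-memory stand-in for a backed-up results file
writer = csv.DictWriter(buffer, fieldnames=list(results[0]))
writer.writeheader()
writer.writerows(results)
print(buffer.getvalue())
```

Because the export is plain CSV with an explicit header, the results remain readable without vendor-specific software, which is the point of statement 12.8.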

Fig. 1 Example of selection criteria compared in two different systems using a scoring model: the criteria should have the same scale (e.g., 1-10) and can be weighted to give more importance to, e.g., diagnostic and workflow aspects
Fig. 2 […]
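As an illustrative sketch of the scoring model described in Fig. 1, each criterion is scored on a common 1-10 scale per system and multiplied by a weight before summing. The criteria, weights, and scores below are invented placeholders, not values from the figure:

```python
# Sketch of the Fig. 1 scoring model: weighted sum of per-criterion scores
# on a shared 1-10 scale. Criteria, weights, and scores are illustrative.
WEIGHTS = {"diagnostic_quality": 3.0, "workflow_integration": 2.0,
           "cost": 1.0, "support": 1.0}

def weighted_score(scores):
    """Sum of weight * score over all criteria (scores on a 1-10 scale)."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

system_a = {"diagnostic_quality": 9, "workflow_integration": 6,
            "cost": 4, "support": 7}
system_b = {"diagnostic_quality": 7, "workflow_integration": 8,
            "cost": 8, "support": 6}
print(weighted_score(system_a), weighted_score(system_b))  # → 50.0 51.0
```

Weighting diagnostic and workflow aspects more heavily, as the caption suggests, can change which system wins even when raw scores favor the other.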

Recommendations for IT interfaces, standards, and workflow
• Financial sustainability negotiated with reimbursement agencies should cover the personnel and equipment needed under the new DP conditions.

Table 1
Levels of automation considered by this working group
• Revalidation is required whenever a significant change is made to any component of the WSI workflow.
• Known edge cases where IA/AI may not perform well should be documented.
• The pathology report should contain information about the use and regulatory status of IA/AI tools.
• Model performance of on-site validation studies may be included.
• All tissue on a glass slide should be available for computational analysis.
• Quality control measures should ensure the quality of digital images for analysis.
• User requirements and IT requirements for software operation should be clearly defined.
• ROI selection methodology should be stated and described in the diagnostic report.
• The validation process should include a representative set of slides for the intended application.
• Clear descriptions of quality control measures and validation steps should be provided.
• Reproducibility measures, such as pathologist-algorithm correlation, should be documented.
• A validation study should establish diagnostic concordance between digital and glass slides.
• Non-inferiority testing should be carried out between algorithm and pathologists.

Working group 1-scanners, quality assurance and validation of scans.
Specific statements and agreement levels.
1.1.1 Know your scope: The institute should clearly define the scope of the targeted DP workflow (i.e., intended use), for example: diagnostic biopsy workflow, special stain workflow, image analysis workflow, fluorescence workflow, organ-specific workflow -77% (20) Strongly Agree | 23% (6) Agree.
1.1.2 Required workflow analysis: The entire workflow analysis needs to be inclusive of all workflow components and should consider current and future requirements for long-term flexibility […].
13.2 A clear SOP should be in place for the management of updates, including documentation of what the updates consist of, changes to the algorithm, and requirements for re-validation -62% (15) Strongly Agree | 33% (8) Agree | 4% (1) Neutral.
13.3 The burden of update frequency should be weighed against the potential benefits and the cost of re-validating the system. Awareness of the expected algorithm update frequency is important -54% (13) Strongly Agree | 33% (8) Agree | 12% (3) Neutral.
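As an illustrative sketch of the update-management SOP in statements 13.2/13.3, one simple policy is to require re-validation for any major or minor version change of an algorithm while allowing patch-level fixes through; the policy itself is a hypothetical example, not part of the recommendations:

```python
# Sketch of statements 13.2/13.3: a minimal, hypothetical update-management
# rule requiring re-validation for any major or minor version change of an
# algorithm; patch-only changes are exempt under this example policy.
def needs_revalidation(old, new):
    """Compare 'major.minor.patch' version strings; return True when the
    major or minor component differs."""
    return old.split(".")[:2] != new.split(".")[:2]

print(needs_revalidation("1.3.0", "1.3.1"))  # → False (patch-only change)
print(needs_revalidation("1.3.1", "1.4.0"))  # → True  (minor version change)
```

Whatever the chosen rule, the SOP should document it explicitly so that the re-validation burden can be weighed against the update frequency, as statement 13.3 requires.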