Introduction

Clinical pathology is undergoing a digital transformation, wherein routinely produced glass slides are no longer read in an “analog” manner using a microscope but are instead viewed “digitally” on computer screens after digitization. This digital pathology (DP) paradigm offers a number of important advantages, many of which are now being realized in clinical routines [1]. Improvements in the streamlining of pathology practices and workflows, as well as quality-of-life enhancements for pathologists, have already been seen. For example, DP streamlines pathology practices by enabling pathologists to access and analyze slides remotely, eliminating the need for the organization and physical transportation of glass slides [2]. DP further facilitates collaboration among pathologists, enabling them to easily share and discuss cases, which can lead to more accurate and timely diagnoses [3]. Moreover, DP is positioned to reduce costs associated with slide storage and management [4], as digital images can be stored electronically and accessed when needed, potentially eliminating the need for long-term physical archives. These same repositories provide rapid retrieval of previous cases that may be of comparative interest.

Beyond these clinical improvements, substantial work demonstrates that once these images are digitized, they can be employed by computational approaches geared towards predicting the diagnosis, prognosis, and therapy response of patients [5]. These DP tools and image-based biomarkers leverage the present-day confluence of four factors for their success: (a) relatively inexpensive computational power in the form of graphics processing units (GPUs), (b) inexpensive storage of the large file sizes typically associated with DP, which can often exceed 2 GB per slide, (c) increased generation of whole slide images (WSI) via the adoption of digital slide scanners in both research and clinical use cases, with some institutions routinely producing more than 2000 whole slide images per day, and (d) new algorithms, such as deep learning [6], whose success continues to be built upon the availability of the other factors.

In contrast, the disadvantages associated with DP appear to be connected with its setup and instantiation, as opposed to long-term sustainment and usage. Challenges associated with initial cost [7], software and hardware integration, refinement of lab practices [8], and additional training requirements potentially disrupting workflow and productivity are not uncommon during the early stages of DP deployment. Secondary issues, associated with, e.g., slide scanning time, standardization of work product, and regulatory and legal compliance, are likely connected to a lack of experience and detailed planning, and thus both benefit from and motivate the sharing of points for consideration and best practices.

In spite of any limitations, the opportunities afforded by going digital appear to be driving substantial investments by academic researchers, hospitals, and industry to put in place validated DP workflows for clinical usage [9]. As a result, groups of motivated experts have formed both nationally and internationally to engage in knowledge sharing and best practices. For example, our Swiss Digital Pathology Consortium (SDiPath) was founded as a working group of the Swiss Society of Pathology (SSPath) in 2018 and now enlists over 170 members, evenly split among pathologists (defined here in the broadest sense, including board-certified pathologists, neuropathologists, dermatopathologists, residents, and trainees), computational pathology researchers, and technical experts who enable DP activities (i.e., histology technicians and information technology specialists).

A common theme, emerging both from the development of our own vision for a national DP infrastructure [10] and from surveys regarding DP usage and adoption [11, 12], is the apparent need for national recommendations for the deployment and validation of DP pipelines, workflows, and algorithms. This is in line with efforts in other countries and organizations that have produced similar recommendation documents geared towards their specific needs and regulatory environments (e.g., Germany [13], Australia [14], USA [15], UK [16]). These efforts express the importance of producing consistent work product, documenting workflows, and estimating both human and technological costs, together serving the tenet that patient safety is of paramount importance. Notably, it is a requirement that the digital transformation of DP not yield inferior performance, safety, or quality assurance compared to its microscope-based analog counterpart, as reflected by the respective CE, FDA, or IVD certifications.

Given the nascent nature of clinical DP instantiation, and the cross-domain skillset it requires, a concerted effort to agglomerate different stakeholders’ experiences and opinions is warranted. This is especially the case as digital workflows are often non-trivial to materialize and may be burdensome to upgrade or rectify if unexpected issues arise [17, 18]. There are often unforeseen challenges, for example, those associated with incorrect scope definition, as discussed in our previous work entitled “Going digital: more than just a scanner!” [17]. While claims that “Digital Pathology: The time has come” [18] are emerging, there appears to remain potential hesitancy to engage in a digital transformation without clear guidelines on expectations and deliverables [11, 12]. The recommendations presented here, similar to those produced in other countries, employed surveys and discussions with experts in their respective fields to curate experiences and thoughts. The ultimate goal is not only to provide best practices and suggestions to those at different stages of their digital transformation but also to solidify current, potentially ad hoc, approaches into common practices to the benefit of pathologists, regulators, device and algorithm manufacturers, researchers, and, above all, our patients.

Methods

To build consensus on a set of DP recommendations from SDiPath members, a Delphi process was used. Briefly, this process consists of rounds wherein (1) participants vote on their level of agreement with provided statements, (2) discordant statements are reviewed, discussed, and revised, and (3) new statements are submitted for voting again until a consensus is reached.

To facilitate this process, four working groups (WGs) were formed around major pillars associated with DP: (1) scanners, quality assurance, and validation of scans; (2) integration of WSI scanners and DP systems into the Pathology Laboratory Information System; (3) digital workflow—compliance with general quality guidelines; and (4) image analysis (IA)/artificial intelligence (AI). These working groups were led by experts in their respective areas, who were tasked with recruiting members with relevant expertise to their WGs as needed to generate a series of statements. On average, each working group consisted of approximately 10 people. Working groups were encouraged to review existing guidelines from other organizations, such as the Digital Pathology Association [19], CAP [15], Canadian [20], UK [16], German [13], Korean [21], and Australian [14] guidelines [22], and to use them to critically reflect on their own statements.

Nomenclature was defined to indicate the level of severity of proposed statements, with (a) “must” indicating an imperative, (b) “should” indicating a suggestion, and (c) “could” indicating something preferable but not required.

After the individual working groups formulated their statements, they were unified into a single document, which all working group members reviewed and provided feedback on. In total, 83 statements were created and voted upon at the WG level via Google Forms, with one form per WG, allowing participants to selectively engage with the WGs matching their expertise. Participants were asked to select between (i) strongly agree, (ii) agree, (iii) neutral, (iv) disagree, and (v) strongly disagree for each statement. The demographics of the expertise and background of the participants were recorded and are provided below. The survey was announced via various venues, including the SDiPath mailing list, in-person meetings, and direct departmental-level recruitment.

After a 1-month feedback period (May to June 2022), 14 statements were identified as needing discussion and clarifying language at the WG level. These statements were returned to the working groups, where they underwent supervised revision with the experts, who modified the statements based on comments provided by the voting members. These were then again reviewed by all working groups for approval before being submitted to the members for a second round of Delphi voting via a single unified Google Form. This round was open for 2 weeks in November 2022, after which a review of the participant votes and feedback indicated convergence. Importantly, the voting members represented a diverse set of Swiss pathology stakeholders, hailing from all over the country. The pathologists involved in the production of these guidelines are affiliated with all 5 university hospitals, as well as 4 cantonal hospitals and 2 private institutions.

Statement responses are reported in descending percentage order. Consensus was determined as being reached if > 66% of all voters “Agreed” or “Strongly Agreed.” All of the statements presented here reached that level of agreement, indicating full consensus.
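As an illustration of the consensus rule, the following minimal sketch (not part of the SDiPath tooling; the vote data are hypothetical) implements the > 66% agreement check described above:

```python
# Sketch of the Delphi consensus rule: a statement reaches consensus if
# strictly more than 66% of all voters chose "agree" or "strongly agree".
from collections import Counter

def reaches_consensus(votes, threshold=0.66):
    """votes: list of responses on the survey's 5-point scale."""
    counts = Counter(votes)
    agreeing = counts["strongly agree"] + counts["agree"]
    return agreeing / len(votes) > threshold

votes = ["strongly agree"] * 40 + ["agree"] * 30 + ["neutral"] * 20 + ["disagree"] * 10
print(reaches_consensus(votes))  # 70 of 100 agree -> True
```

Note that the threshold is a strict inequality, so exactly 66% agreement would not count as consensus under this reading.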

Participation was entirely voluntary, and there was no financial compensation for study participation and no disadvantage related to non-participation.

Results

Working group 1—scanners, quality assurance and validation of scans

This working group’s statements focus on the first part of the digital pathology pipeline: the selection, installation, and validation of whole slide image scanners. They emphasize the importance of clear workflow definition, scanner evaluation, and thorough validation processes, and discuss workflow creation and adjustment, documentation requirements, ideal scanner properties, and approaches for scanner validation. The recommendations are summarized as follows, with individual statements and agreement levels provided in Appendix 1.

Scope of the diagnostic workflow of digital pathology

  • Define the scope of the targeted digital pathology workflow, considering different types of workflows (e.g., diagnostic biopsy, special stain, image analysis).

  • Create a well-documented standard operating procedure (SOP) that describes the entire workflow, including scanning, and is in line with quality management systems and accreditation requirements.

  • Establish security settings for authorized access to workflow components.

  • Prioritize workflows based on scope, estimated case load, and turnaround time.

  • Adapt the Laboratory Information System (LIS) to work seamlessly with digital pathology workflows.

Scanner requirements

  • Evaluate different scanning systems, considering technical requirements and integration into the LIS. See abridged example in Fig. 1.

  • Ensure that scanners meet the intended purpose, including capacity, slide compatibility, and CE-IVD certification for diagnostic purposes.

  • Consider scanner maintenance costs and their impact on workflow.

Fig. 1

Example of selection criteria compared in two different systems using a scoring model: The criteria should have the same scale (e.g., 1–10) and can be weighted to give more importance to, e.g., diagnostic and workflow aspects
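The scoring model of Fig. 1 can be sketched in a few lines; the criteria, weights, and scores below are hypothetical, with each criterion scored on the same 1–10 scale and weighted to emphasize, e.g., diagnostic and workflow aspects, as described in the caption:

```python
# Illustrative weighted scoring model for comparing two scanning systems.
# Criteria, weights, and scores are invented for this example; a real
# evaluation would use the institute's own requirements catalog.
criteria_weights = {
    "image quality": 3,      # diagnostic aspects weighted most heavily
    "LIS integration": 3,    # workflow aspects weighted most heavily
    "scan speed": 2,
    "maintenance cost": 1,
}

scores = {
    "Scanner A": {"image quality": 9, "LIS integration": 8, "scan speed": 6, "maintenance cost": 5},
    "Scanner B": {"image quality": 7, "LIS integration": 6, "scan speed": 9, "maintenance cost": 8},
}

def weighted_total(scanner_scores, weights):
    """Sum of (weight x score) over all criteria."""
    return sum(weights[c] * scanner_scores[c] for c in weights)

for name, s in scores.items():
    print(name, weighted_total(s, criteria_weights))  # Scanner A 68, Scanner B 65
```

In this invented example, Scanner A wins despite slower scanning, because the weights favor diagnostic and integration criteria.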

Output formats

  • Identify ideal scanning profile settings for consistently high picture quality.

  • Define file formats for storage and sharing, with a preference for open, non-proprietary formats.

  • Specify image size, format, and archiving periods.

Scanner validation study

  • Define the scope of the validation study, including tissue sources, stains, and acceptance criteria for diagnostic purposes.

  • Establish concordance levels and define severity for non-concordance.

  • Create a validation protocol and test a representative sample for each application (e.g., see Fig. 2).

  • Generate a report summarizing the validation aim, results, conclusions, technical requirements, scanner settings, and training evaluations.

Fig. 2

Example validation protocol for comparing diagnoses from glass slides with those from digital slides

Working group 2—integration of WSI-scanners and DP systems into the Pathology Laboratory Information System

Assuming a validated scanner is in place, these recommendations outline a framework for effectively integrating WSI scanners and DP systems into a Pathology Laboratory Information System, ensuring optimal visualization, data management, and workflow efficiency. The recommendations are summarized as follows, with individual statements and agreement levels provided in Appendix 2.

Visualization (monitors)

  • Larger, high-resolution displays are preferred for better image quality.

  • Monitors should be validated by experts and chosen by pathologists.

  • Consider ergonomic factors when selecting monitor size.

  • Monitor calibration with low deviation is recommended, with documentation.

  • Minimum contrast ratio and brightness levels for readability in different lighting conditions are specified.

  • Adequate color depth and smooth navigation within viewer software are essential.

Integration of WSI scanner into Pathology Laboratory Information System (Patho-LIS) for routine diagnostics

  • The scan workflow should be integrated into the Patho-LIS, image management system (IMS), and an image archive, supporting standard communication formats.

  • Image data in open formats should be stored in a storage system for retrieval using appropriate streaming mechanisms.

  • A secondary test environment is recommended for testing modifications to digital workflow (e.g., new tools).

  • Interfaces between WSI scanner and Patho-LIS, as well as between Patho-LIS and IMS, must be established and validated.

  • Barcode and alphanumeric codes should be printed on slides for identification.

  • Communication protocols should be documented.

  • Quality control should be in place before scans are handed over to experts.

  • Compliance with research regulations is required for non-routine diagnostic slides.

Recommendations for IT interfaces, standards, and workflow

  • An integrated image viewer should communicate with Patho-LIS and the digital archive.

  • The IMS should retrieve necessary information from Patho-LIS to link scanned slides.

  • Virtual microscopes should support a comparison view.

  • Network speed should meet specific requirements, with a general recommendation of 1 Gbps per scanner.

  • High-performance storage solutions should be integrated.

  • Redundant installations and alternative workflows are recommended to handle hardware or software malfunctions.
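The 1 Gbps per-scanner recommendation above can be related to the ~2 GB per-slide file size mentioned in the introduction with a back-of-envelope sketch; the 70% usable-link efficiency is an assumption for illustration, not part of the recommendations:

```python
# Rough per-slide transfer-time estimate for a dedicated 1 Gbps scanner link.
# All figures are illustrative assumptions, not SDiPath requirements.
slide_size_gb = 2.0   # gigabytes per WSI (upper range cited in the introduction)
link_gbps = 1.0       # recommended network bandwidth per scanner
efficiency = 0.7      # assumed usable fraction after protocol overhead

seconds = slide_size_gb * 8 / (link_gbps * efficiency)  # GB -> Gbit, then divide by rate
print(f"~{seconds:.0f} s to transfer one slide")  # ~23 s
```

At roughly 23 s per slide, a scanner producing slides faster than that would saturate such a link, which is one way to motivate both the per-scanner provisioning and the high-performance storage recommendation.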

Working group 3—digital workflow—compliance with general quality guidelines

These statements are geared towards achieving conformity with current accreditation norms and traceable quality parameters that can be documented within the quality management system (QMS) of each institute. The legal framework in which DP enters the stage has many facets, ranging from adopted European regulations, such as the In Vitro Diagnostic Regulation (IVDR) and the General Data Protection Regulation (GDPR), to national legislation, such as the Human Research Act (HRA) and Human Research Law (HRL), and the medical/pathological guidelines of the SSPath.

Swiss laboratories regularly undergo accreditation [23] according to ISO 15189 [24] and ISO 17025 [25]. Relevant elements in terms of quality documentation comprise organizational, procedural, technical, and personnel aspects. For accreditation, DP is regarded as equivalent to conventional histology and thus tends to benefit from conventional quality control improvements (e.g., the usage of barcoding).

In the future, even more improvements via additional quality measurements can be expected with the deployment of DP. For instance, histological sectioning for DP requires more attention from technical personnel in terms of correct thickness and the avoidance of folds, scratches, or peripheral placement of the tissue. Contributions to round-robin tests (Quality in Pathology (Germany), European Society of Pathology (ESP), NordiQC) can be fulfilled via slide upload and in-depth calibration measurements. Consequently, DP appears well situated to facilitate the creation of new, more precise standards. The recommendations are summarized as follows, with the individual statements and agreement levels presented in Appendix 3.

Quality requirements for laboratory staff and technicians

  • Technicians should receive specific training for digital pathology, including avoiding specimen placement at slide edges and recognizing artifacts.

  • Training with scanners and high-tech equipment is recommended for advanced users.

  • All workflow steps should be documented in the quality management system’s SOP.

  • Consistent barcoding and readable information should be placed on vials, FFPE blocks, slides, and reports.

  • Compatibility with additional barcoding solutions for special stainers should be ensured.

  • Ordered stainings should quickly appear as placeholders in the digital pathology system.

  • The preparation process should be defined to encompass triage of stainings and prioritize highly urgent cases.

  • Processes should be defined to switch to regular microscopy for non-digitally compatible microscopy techniques or selection purposes.

  • A process should be in place to allow for immediate retrieval of glass slides for rescanning or non-virtual microscopy.

  • Emergency plans for severe system errors should be in place.

  • Back-up systems for individual components in case of service or maintenance are recommended.

  • A process for re-scanning should be in place and counts of re-scanning may serve as a performance test of the scanning process.

  • Deviations and problems should be reported within a quality management, or critical incidence reporting, system.

Additional quality requirements for digital workflow pathologists

  • All pathologists should receive specific training for the digital pathology system, including case management, ordering re-scans, and measurements.

  • General knowledge in digital pathology and its potential pitfalls/limitations should be incorporated into the validation process and basic training for the Swiss federal title of pathology.

  • Thumbnail images must be compared to scanned images to ensure complete tissue recognition.

  • Additional support systems may be included, e.g., tracking systems for hovered areas, annotations for teaching and discussion, and time spent on details. The use of these data should be institutionally regulated and consented by the employed pathologists.

  • A digital process for requesting re-scanning should be in place.

  • Automated tools like scripts or algorithms should be used cautiously and follow indications, validation, plausibility checks, and quality control.

IT support

  • IT personnel familiar with the complete system must be in place (in-house or as a service).

Tumor boards

  • Case presentation at tumor boards can be performed at lower resolutions and in a representative way, but amendments to diagnosis should take place in the diagnostic workstation setting.

Inter-institute tele-consulting

  • The sending institute requesting digital tele-consulting is responsible for slide selection, scanning, resolution, and representativity, with approval declared in the consulting order.

  • The receiving pathologist should ensure the diagnosis is made in an appropriate digital setup.

  • The institute performing the tele-consulting should document in its sign-out report the number of electronic slides evaluated, viewing platform, and date of access.

  • Pathologists may retain digital copies of regions of interest used for consultation.

  • Receiving tele-consulting institutes in Switzerland are recommended to validate their workflows within regular accreditation processes.

  • To outline the obligations of the asking institute, the sentence “this diagnosis was established on digitized whole slide images kindly provided by the Institute XXX” may be included.

  • The final diagnosis and legal liabilities are determined according to the SSPATH guidelines for consultation cases.

Compliance with quality management systems

  • The validation test for the established end-to-end workflow should be documented within the Quality Management system and repeated after major equipment changes.

  • SOPs should include all major components of the workflow.

  • A re-validation of the complete digital workflow must be performed if major components are replaced.

  • Other minor changes due to the modularization of the workflow are handled according to institutional QM guidelines.

  • Separate validations must be performed for specific physical measurements.

  • DP workflows are expected to increase patient safety and quality measurements, which could be covered with higher reimbursement rates.

  • Financial sustainability negotiated with reimbursement agencies should include the needed personnel and equipment under the new DP conditions.

Working group 4—image analysis (IA)/artificial intelligence (AI)

These statements aim to provide recommendations for the testing, implementation, and quality control of digital image analysis solutions in DP practice.

As part of this process, the working group attempted to concretely define specific terms that are employed within the statements:

  1. Image analysis (IA): the extraction of meaningful information from images by means of image processing techniques.

  2. Artificial intelligence (AI): the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

  3. Machine learning (ML): the use and development of computer systems that are able to learn and adapt without following explicit instructions, by using algorithms and statistical models to analyze and draw inferences from patterns in data.

  4. Deep learning (DL): a type of machine learning based on artificial neural networks in which multiple layers of processing are used to extract progressively higher-level features from data.

  5. Levels of autonomy: an appreciation that unique challenges and requirements likely apply on a per-task/algorithm basis, depending on the supervision required to safely employ each (see Table 1).

Table 1 Levels of automation considered by this working group

To scope the statements, it was discussed that the current state of technology was not sufficiently advanced to justify consideration of type 5 algorithms, with type 4 being considered only in niche roles. As such, in the statements below, AI solutions are considered to be aimed at automating repetitive and time-consuming tasks (e.g., mitosis counting, immunohistochemical (IHC) marker evaluation) and at providing a decision support system to pathologists (e.g., for ambiguous or rare cases). It was noted that the field is evolving very rapidly and, as such, attention should be paid to determining when revision of these statements is required in light of new inventions, experience, and wisdom. The recommendations are summarized as follows, with the individual statements and agreement levels presented in Appendix 4.

General considerations

  • For bioimage analyses and AI-assisted solutions intended for diagnostic use, institutes of pathology should use officially certified systems (e.g., IVD-CE certified, FDA-approved) or laboratory-developed systems that meet validation and quality control requirements.

  • The final diagnosis is the responsibility of the pathologist.

  • As the level of autonomy in AI systems rises, the interpretability of results becomes more critical.

  • Algorithms indicating germline or somatic mutation status must comply with existing laws for molecular testing.

  • All systems must fulfill Swiss regulatory requirements.

  • AI results must be reported to and reviewed by a board-certified pathologist, following the “integrative diagnosis” paradigm [26].

Implementation and validation of IA/AI solutions

  • Each Institute of Pathology must internally validate IA/AI solutions, even if officially certified systems are used. The scope of validation should be clearly defined.

  • Validation should be appropriate for the intended clinical use and clinical setting of the application.

  • Validation should involve specimen preparation types relevant to the intended use.

  • The validation study should closely emulate the real-world clinical environment.

  • Metadata associated with whole slide image (WSI) creation should be documented.

  • Diagnoses made using IA/AI should include a version number associated with the validation protocol.

  • Revalidation is required whenever a significant change is made to any component of the WSI workflow.

  • Known edge cases where IA/AI may not perform well should be documented.

  • The pathology report should contain information about the use and regulatory status of IA/AI tools.

  • Model performance from on-site validation studies may be included in the report.

  • All tissue on a glass slide should be available for computational analysis.

  • Quality control measures should ensure the quality of digital images for analysis.

  • User requirements and IT requirements for software operation should be clearly defined.

  • ROI selection methodology should be stated and described in the diagnostic report.

  • The validation process should include a representative set of slides for the intended application.

  • Clear descriptions of quality control measures and validation steps should be provided.

  • Reproducibility measures, such as pathologist-algorithm correlation, should be documented.

  • A validation study should establish diagnostic concordance between digital and glass slides.

  • Non-inferiority testing should be carried out between algorithm and pathologists.

Desirable technical properties

  • Integration into the existing digital pathology workstation environment is recommended.

  • The IA/AI system’s performance must scale with the increasing number of cases.

  • Algorithms should highlight regions on digitized slides used to determine their output.

  • The ability to provide feedback and prioritize cases or slides is suggested.

  • IA/AI can be employed to prioritize cases within work lists or slides.

  • Indications should be provided regarding the status of running algorithms.

  • Results should be stored in a secure and retrievable manner, in accordance with legal requirements.

  • Expected input/output formats should be documented to ensure long-term usability without vendor-specific software.

Maintenance

  • A clear SOP should be in place for the management of hardware and software malfunctions.

  • A clear SOP should be in place for the management of updates, including documentation and re-validation requirements.

  • The burden of update frequency should be weighed against potential benefits and re-validation costs with awareness of expected algorithm update frequency.

Conclusion

Using a Delphi process, the members of the Swiss Digital Pathology Consortium reached consensus on practical recommendations for the implementation and validation of digital pathology in clinical workflows. These recommendations focus on safe usage, aiming to maximize patient safety and benefit while minimizing overhead. As a result, we put forward these statements as best practices to be considered when adopting DP within Switzerland, while also providing another resource for our international colleagues. We are happy to report significant concordance between existing national recommendations and our own, likely due to the converging nature of what appear to be emerging best practices for DP. These recommendations integrate and update previous guidelines, providing a dedicated section on the implementation of AI and IA. This fills a niche absent from other recommendations, likely due to the nascent state of the AI/IA field at the time of their creation. Of particular note, the working groups appreciated how rapidly the field is maturing and realized that, unlike recommendations for more established technologies, these DP recommendations will likely need to undergo revisions as the technology and the implications of this paradigm shift become clearer.