Introduction

As investments in healthcare are generally large and, in the case of medical imaging technology, mission critical, the need for structured implementation, measurement approaches, and holistic evaluation methods is growing [13]. The fact that hospitals around the world are re-evaluating their picture archiving and communication system (PACS) implementations reflects this demand [4]. PACS are workflow-integrated imaging systems designed to streamline operations throughout the entire patient-care delivery process and have become an integral component of today’s healthcare delivery system [5]. However, hospitals have often failed to achieve the necessary productivity levels and operational efficiencies (e.g., cost reduction, productivity increase, and optimization of patient episode throughput time) using PACS, even though pursuing a fully filmless environment with PACS is a high-cost venture [6].

Currently, hospitals tend to re-evaluate their implemented radiology systems to overcome technical and practical limitations of operational file systems and deployed PACS database designs [7, 8]. More efficient, extensive, cost-effective, and vendor-independent PACS infrastructure solutions have become available to hospitals. This re-evaluation process is also driven by the current volumes of imaging data produced by advanced modalities such as CT and MRI, which have a major impact on the common information system and information technology (IS/IT) architecture. This poses serious challenges in terms of storing growing amounts of data, cross-enterprise document sharing, protecting patient information [9], and general alignment with evolving technologies and disruptive innovations such as serial advanced technology attachment (SATA), data grid architectures, cloud computing, scalable distributed server environments, and service-oriented architectures [10]. Consequently, hospitals are planning major updates and even large-scale replacement trajectories to other PACS vendors to overcome technical and practical limitations and improve overall system performance [4, 7].

Due to the above developments and the growth of radiology practice, strategic and prospective planning is becoming more urgent [11]. In practice, strategic planning approaches toward imaging technology, PACS, and PACS (re)deployment are lacking, both in hospital boardrooms and in the scientific literature. This makes it difficult to harmonize or “align” PACS goals, objectives, and improvement activities with the hospital’s strategic agenda. Achieving optimal alignment of PACS and pursuing its intended goals and objectives within the hospital enterprise appears to be an intricate and poorly examined process that lacks scientific grounding.

Digital radiology (management) systems are difficult to evaluate due to their evolving nature, the high number of involved departments and professionals, and hard-to-define patient outcomes [12]. Earlier, Van de Wetering et al. [3] argued that theories on business–IT alignment, organizational fit, and adoption of IS/IT can help to understand why certain key elements in clinical practice have not been realized. A method that supports the process of optimally aligning PACS and defines its synergetic effects within hospital operations would therefore be very valuable. However, such a method has not yet been developed, empirically applied, and validated.

In developing such a method, the current paper combines an existing model on PACS maturity (i.e., a continuous process of evolving from immature stages of growth/maturity toward the next level) [13] with the concept of business–IT alignment (i.e., investments made in organizational domains related to PACS should be balanced within the organization in order to obtain synergizing benefits) [14, 15]. From this, the first objective is to develop a framework and associated survey to empirically assess a hospital’s current PACS maturity and alignment with regard to different strategic directions. The second goal is to systematically examine the applicability of this framework and instrument and to cautiously explore its implications, in terms of maturity and alignment, for PACS performance at two pilot hospitals in The Netherlands. Based on the responses from these pilot hospitals, improvements will be made to the PACS integrated situational alignment (PISA) framework and the related survey.

The remainder of this paper is outlined as follows: We first review a synthesized model to measure levels of PACS maturity, subsequently address principles of business/IT alignment, and review the concepts of PACS performance. Then, a first version of the PISA framework is proposed. The “Methods and Material” section discusses the applied validation methods, after which the results are presented. Several framework improvements are presented, after which the PISA framework is completed. Finally, the paper concludes with a brief discussion of the implications of this study, identifies inherent limitations, outlines a research agenda, and presents the main conclusions.

Defining Framework Constructs

PACS Maturity Concepts

Theories on IS/IT maturity and adoption are well established in the business and IS/IT literature, going back to the early 1970s. The concept of the IS/IT maturity stage hypothesis was introduced by Nolan [16] in 1973. Later on, this model was extended, frequently discussed, and adapted [17]. In general, IS/IT maturity models provide insight into the structure of elements that represent the process effectiveness of IS/IT in organizations. They also allow organizations to define roadmaps for evolving from one level of maturity to the next [17]. Recently, a specific PACS maturity model (PMM) [14] was developed that describes PACS maturity and evolvability in the hospital enterprise. Based on a literature review of 34 scientific papers on PACS development and a subsequent meta-analysis, the PMM was built upon three general streams in PACS maturity and evolution: (1) radiological and hospital-wide process improvements, (2) integration optimization and innovation, and (3) enterprise PACS and the electronic patient record. From this, the model defines five cumulative levels of PACS maturity that hospital enterprises can achieve:

  • Level 1: PACS infrastructure

  • Level 2: PACS process

  • Level 3: clinical process capability

  • Level 4: integrated managed innovation

  • Level 5: optimized enterprise PACS chain

With progress toward the top maturity level 5, operational (workflow) efficiency, IS/IT integration, and quality of care using PACS technology increase. Although the PMM can be interpreted as a straightforward (i.e., sequential) accumulation of PACS investments, it does not explicitly define mechanisms for how hospitals can actually move from one maturity level to another. For the development of a PACS alignment framework, however, the PMM provides the foundation and underlying body of knowledge, to be complemented by a rigorous theoretical approach that includes the concept of business–IT alignment.

Concepts on Business/IT Alignment

The concept of “strategic” alignment—also called “fit” [18]—has been a top concern of executives and IT practitioners for decades and refers to applying IS/IT in an appropriate and timely way, in harmony (i.e., complementarity between activities) with business strategies, goals, and needs [19]. It is a central element of strategic planning, the process by which organizations develop and deploy a competitive, long-term strategy in which internal resources are integrated with external opportunities [20].

The classical strategic alignment model of Henderson and Venkatraman [14] is undoubtedly the most cited concept in the field and has been extended by others [21]. Their model argues that a dynamic process is needed to ensure continuous alignment between the business and IS/IT domains, to achieve “strategic fit” as well as “functional integration.” The model does, however, have its limitations; for instance, the relations in the model are neither operationalized nor clearly defined [15]. Subsequently, Turban et al. [22] developed—and extended—the model for business–IT alignment, containing new mutual relations among business dimensions that are assumed to contribute to the successful implementation and adoption of IS/IT. Another alignment model, better known as the MIT90s framework and a descendant of Leavitt’s diamond, was developed based on the idea of internal “fit” as a dynamic equilibrium of five key organizational dimensions, and of external fit as strategy formulation based on environmental trends and changes in the marketplace, using IT as an enabler. Many studies since then have used concepts incorporated in the MIT90s framework and its model. Thus, it is not surprising that recent alignment models show strong similarities. A shortcoming of most alignment models, however, is that they do not explicate how the dimensions of the model interact with and depend on each other. This shortcoming was addressed by Scheper [15], among others. Scheper starts by defining five organizational dimensions:

  I. Strategy and policy (S&P): organization of strategy and policy procedures

  II. Organization and processes (O&P): addresses processes as a basic principle for organizational development

  III. Monitoring and control (M&C): financial and non-financial management control

  IV. Information technology (IT): concerns IT management and development processes

  V. People and culture (P&C): reflects the value and significance of employees for an organization

Different from most other alignment models, Scheper subsequently developed levels of maturity to measure the development stage of each of the five dimensions, creating a multidimensional maturity matrix (cf. Sledgianowski et al. [23] for an equivalent approach). Next, he claims that alignment can be directly measured by comparing the maturity levels of all five dimensions at the same time. His alignment principle is based on the idea that organizations can mature each single dimension, but only equalization among all dimensions (i.e., alignment) will significantly improve organizational performance.

Multi-factorial PACS Performance

Evaluation methods have proven valuable in the past for assessing the consequences of changing from traditional film-based practice to a digital (radiological) workflow [24]. Since then, several scholars have argued that enterprise PACS benefits should be evaluated from different perspectives [2]. As there are many interrelated steps between PACS usage and eventual patient outcome, the imaging workflow chain and the substeps that affect the quality of imaging services and clinical outcomes are essential to the evaluation of PACS [25].

Based on the wider adoption of technology acceptance models and levels of clinical efficacy [26, 27], we define PACS performance as the multi-factorial impacts and benefits produced by the application of PACS. This is expressed in terms of hospital efficiency and clinical effectiveness with respect to the PACS workflow and the patient’s clinical journey. For the purpose of this paper, we adopted the outcomes of a meta-analytic approach that synthesized original sources on PACS performance and balanced evaluation models [3, 28]. Based on a total of 37 papers published between 2000 and 2009 that were included for review in this approach (after applying several inclusion criteria to 980 retrieved key publications and positioning papers), four performance constructs are defined. These constructs were subsequently translated into measurements that are representative elements for the measurement of PACS performance.

Reflecting this multi-factorial nature, the constructs include measures that are available for the assessment of PACS in hospitals and that capture the impacts of PACS from (a) a technical and IS/IT perspective, (b) organizational efficiency, (c) service outcomes, and (d) clinical impact. The “Appendix” also includes all adopted PACS measures. These measures are adopted into our framework since they are valuable for empirical research and likewise relevant for radiology practice.

PISA Framework

Building on the previous sections, our PISA framework combines the two pillars of (1) PACS maturity (i.e., classifying PACS according to their stage of development and evolutionary plateau of process improvement) and (2) alignment (i.e., investments made in the organizational dimensions related to PACS should be balanced within the organization in order to obtain synergizing benefits) and claims that both affect the PACS performance of the hospital enterprise. Figure 1 provides a schematic sketch of our framework and depicts PACS maturity on the horizontal axis and the organizational dimensions on the vertical axis. Of the two pillars, alignment is often the less clearly defined (if defined at all) and is frequently suggested as important without guidelines for practice. In our framework, we explicitly define alignment as the degree of leveling between the five organizational dimensions described in the “Concepts on Business/IT Alignment” section [15], i.e., alignment is measured as the degree to which the five dimensions differ with regard to their maturity levels as previously defined by the PMM (see the “PACS Maturity Concepts” section). Hence, alignment can be expressed within Fig. 1 as a line connecting all five organizational dimensions (the figure also displays two “possible” scenarios of alignment among the dimensions at a low and a high level of maturity for demonstrative purposes). Next, the framework claims that the degree of alignment and synergetic mechanisms between the several organizational dimensions (i.e., the independent variable) is directly and positively related to PACS performance (i.e., the dependent variable).

Fig. 1 PISA framework

Methods and Material

Framework Development Process

A project team applied an incremental development process that follows the design science methodology. In this approach, knowledge is produced by constructing and evaluating artifacts, which are subsequently used as input for a better awareness of the problem [29]. To ensure the quality and validity of the developed measurement instrument, we applied complementary validation methods: guidelines were followed when building the artifacts [29], their face and content validity was secured, extensive pre-tests were performed, and pilot cases were executed at two different hospitals. Figure 2 provides an overview of the five interrelated process steps that were conducted, using a process delivery diagram [30].

During the first step, the project team reviewed literature on PACS maturity and alignment and created common ground concerning the research topic. Based on this review and the project team’s own field experience, key concepts were developed. In step 2, these key concepts were critically reviewed by two recognized PACS experts (a professor of radiology and the head of a radiology department). These experts provided the project team with feedback, input, and advice on key concepts in diagnostic imaging and general radiology practice. Taking these valuable suggestions into account, an initial survey was created in step 3, which was subsequently discussed with industry consultants and a PACS R&D manager during a focus group meeting. One result of this focus group was a technical architecture for each of the PACS maturity stages of the framework, which was helpful in refining technical aspects of the survey (this architecture is available upon request). The initial survey contained 28 statements—four statements for each organizational dimension (i.e., independent variable) and eight for PACS performance (i.e., dependent variable).

In step 4, two PACS experts once again reviewed each item of this initial construct and commented on the applied scales and on the significance and importance of each item for PACS maturity and alignment. Both experts also evaluated the PACS maturity and alignment level of their own hospital. The outcomes suggested some extensions of the survey. All input was then taken into individual validation sessions (“Delphi method”) with three radiologists, a neurologist, a technologist, and a medical informatics researcher. These sessions were used to evaluate the structure of the constructed framework and to comment on the specified PACS topics and features of the operationalized survey. Using the “talk-aloud protocol,” the experts articulated their thoughts and considerations as they filled in the survey. These six experts (representing four hospitals and four different economic–geographical areas in The Netherlands) were recruited through personal and professional networks. The outcomes were used to improve our survey statements on validity (i.e., do the incorporated survey items measure what they are supposed to measure), reliability (i.e., is each question posed correctly and can radiologists, technologists, and PACS administrators answer them), and empirical applicability (e.g., size of the survey and tooling). The results were discussed within the project team. The survey was extended to a total of 45 statements, covering most intersections of the five organizational dimensions and the five PACS maturity levels. Per organizational dimension, the items were formulated in a cumulative order, i.e., one of “increasing complexity” along the maturity scale.
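To make the structure of the resulting instrument concrete, the following Python sketch shows one way the 45 statements can be organized as cells of the dimension-by-maturity-level matrix; the dimension abbreviations follow the list in the “Concepts on Business/IT Alignment” section, while the example statement texts and the exact distribution of items over cells are hypothetical placeholders rather than the piloted items.

    # Hypothetical sketch of the survey structure: statements are indexed by
    # (organizational dimension, PACS maturity level). Texts are placeholders.
    DIMENSIONS = ["S&P", "O&P", "M&C", "IT", "P&C"]
    MATURITY_LEVELS = [1, 2, 3, 4, 5]

    survey_items = {
        ("IT", 3): [
            "Q1: <statement probing clinical process capability of the imaging IT>",
            "Q2: <second, more demanding statement for the same cell>",
        ],
        # ... remaining (dimension, level) cells, 45 statements in total
    }

    # Within a dimension, items are ordered cumulatively: agreeing with a
    # level-4 statement should be "harder" than agreeing with a level-3 one.
    for (dimension, level), items in sorted(survey_items.items()):
        print(dimension, level, len(items), "statement(s)")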

Fig. 2 Framework development process

The results of the two pilots are described in the “Results” section of this paper (the piloted survey items are presented in the “Appendix”). Step 5 is the conclusion, i.e., the final questionnaire resulting from the complete validation of the framework.

Pilot Sites

In the spring of 2010, two Dutch hospitals with different characteristics were selected (see Table 1) to participate as pilot cases. The two hospitals were known within the field as actively involved in optimizing their PACS deployment. As such, the pilot could focus on applying the framework for the first time to two “advanced hospitals.” This provided a good opportunity to improve the content of the survey and the clarity of the statements where necessary.

Table 1 Background characteristics of pilot sites

Per hospital, two radiologists (including the head of the department), the head of radiological technologists, and a PACS administrator completed an online survey within a secured web environment. These informants are likely the most familiar with the subject of PACS maturity and performance, supporting intra-institutional validity. Including multiple stakeholders from the radiology department also reduces the common source variance associated with sampling from a single source [19, 31] and limits face validity issues. The respondents completed the survey separately, in order to avoid systematic bias and any peer pressure to give particular answers. It took the eight respondents approximately 20–25 min to complete the survey. Comparing the individual results within each hospital allowed us to measure the level of agreement between the respondents.

The seven-point Likert scale applied to each statement consisted of the classic values “strongly disagree,” “disagree,” “somewhat disagree,” “neutral,” “somewhat agree,” “agree,” and “strongly agree.” This scale is typically used for subjective performance evaluation and is suited to balanced assessments rather than objective measurements. In general, outcomes of assessments in complex areas such as medical processes are often qualitative and hence rely on the subjective perceptions of clinicians rather than on quantitative statistics [32]. All statements were phrased in the present tense, but respondents were asked to provide answers for both the current and the future/preferred situation of their hospital. In addition to the items measuring PACS maturity as defined by our PISA framework, the survey also contained some general questions (e.g., name, function, years of experience using PACS, etc.). Finally, PACS performance was measured using 13 performance statements on how well PACS contributes to efficiency and effectiveness (see the “Multi-factorial PACS Performance” section).

Results

Data and Measurement

Analysis of the data from the eight completed pilot questionnaires was performed in two steps. First, the results were analyzed by comparing the descriptive statistics for both hospitals: the mean, standard deviation (SD), and median (M) of all maturity items were computed for each hospital. Two assumptions underlie this analysis. First, we assume low variation in the scores per maturity level and organizational dimension, indicating agreement between the four respondents of each hospital. Second, we assume for both hospitals a decreasing mean score for each subsequent maturity level per organizational dimension, confirming the cumulative order (“difficulty”) of the maturity items as defined by the PISA framework. Table 2 sets out the results for each question (Q), where Q1 and Q2 are the two statements per maturity level (ML).
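For illustration, these two assumptions can be checked with a short script along the following lines; this is only a sketch, and the data layout, column names, and toy scores are assumptions rather than the actual pilot data.

    import pandas as pd

    # Hypothetical long-format pilot responses: one row per respondent x item,
    # restricted here to one hospital, one dimension, and two maturity levels.
    df = pd.DataFrame({
        "hospital":   ["A"] * 8,
        "respondent": [1, 1, 1, 1, 2, 2, 2, 2],
        "dimension":  ["S&P"] * 8,
        "level":      [3, 3, 4, 4, 3, 3, 4, 4],
        "item":       ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
        "score":      [6, 5, 4, 3, 7, 6, 5, 4],
    })

    # Assumption 1: low within-hospital variation -> inspect the SD per item.
    stats = (df.groupby(["hospital", "dimension", "level", "item"])["score"]
               .agg(["mean", "std", "median"]))
    print(stats)

    # Assumption 2: cumulative order -> mean scores should not increase with
    # higher maturity levels within each hospital and dimension.
    level_means = df.groupby(["hospital", "dimension", "level"])["score"].mean()
    order_ok = level_means.groupby(level=["hospital", "dimension"]).apply(
        lambda s: s.is_monotonic_decreasing)
    print(order_ok)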

Table 2 Descriptive statistics

Based on the descriptives in Table 2, several remarks can be made. Regarding the mean scores per hospital, most statements comply with the assumed cumulative order, i.e., mean scores decrease with higher maturity levels for each organizational dimension. There are a few exceptions, however; these items are shown in bold. For instance, item Q2 of maturity level 4 related to “strategy and policy” has an unexpectedly high mean score (6.38), deviating from the premise that the mean score at lower maturity levels should be higher because those items are less “difficult” for hospitals to agree with.

Likewise, item Q2 of maturity level 3 related to “organization and processes” has a lower mean score than assumed, i.e., items at higher maturity levels of this dimension show higher instead of lower mean scores. These deviations are in fact violations of our accumulation assumption but also provide essential improvement opportunities.

In the same vein, we analyzed the scores on the PACS performance items for both hospitals. PACS performance was measured through 13 questions that address the perceived benefits of PACS application in the respondents’ hospital. Table 3 shows the results.

Table 3 PACS performance descriptives

Based on the respondents’ comments and the descriptive statistics, some remarks can be made. During the pilot, it appeared that in hospital B questions C1 and C4 could only be answered by radiologists, making them apparently radiologist specific. Also, statement C2 has a relatively high mean and median score accompanied by a low SD, suggesting that the scale of this item needed modification.

Besides evaluation by analysis of the hospital descriptive statistics, all eight respondents commented on the items and provided suggestions for their elaboration and modification. In general, the respondents of the pilot concluded that the survey was useful, easy to interpret, and a valuable tool for further PACS development in their hospital.

Instrument Refinement

By critically considering the results of the pilot statistics, including the comments and recommendations of the respondents, we modified several survey statements. Table 4 lists the modifications proposed after the pilot studies. From the mean “modified” and “old” scores of the statements for each separate maturity level—see Fig. 3—it can be seen that the modified survey statements were successfully ordered by PACS maturity, i.e., in hierarchical order.

Table 4 Survey modifications
Fig. 3 Mean scores per PACS maturity level

Measuring PACS Maturity and Alignment

Now that the maturity items and measurements have been evaluated, the next step is to examine how PACS “alignment” can be measured. As suggested earlier by our PISA framework, alignment can be indicated by the differences between the maturity scores on the five organizational dimensions. Hence, overall maturity scores for each of the five organizational dimensions were calculated. To reflect the survey design, in which we defined cumulative maturity items, an algorithm was constructed in such a way that it captures this principle of accumulating maturity.

This algorithm includes the following assumptions:

  1. Mean scores (ranging from 1 to 7) were used to calculate scores for each organizational dimension (i) for maturity levels 3–5: μ_i3, μ_i4, and μ_i5.

  2. PACS maturity levels (λ) were assigned points: λ_3 = 200, λ_4 = 300, and λ_5 = 400. Thus, each λ has a total range of 100 points.

  3. A mean score of 4—on the applied Likert scale—was used as a threshold (Τ). A mean per statement of ≥4 confirms the specified λ to which the statement is assigned; a mean of <4 indicates that the preceding λ is more applicable.

  4. Because we defined Τ at 4 on our Likert scale, the remaining score space for scores ≥4 (i.e., 7 − Τ = 3) and for scores <4 (i.e., Τ − 1 = 3) is applied in the calculation of maturity, represented by γ.

Using the above assumptions, the following three-step algorithm was applied to each organizational dimension:

  I. If μ_i3 ≥ 4, then π_i3 = (μ_i3 − Τ) × (100/γ) + λ_3; else π_i3 = ((μ_i3 − 1)/γ) × λ_3.

  II. If μ_i4 ≥ 4, then π_i4 = (μ_i4 − Τ) × (100/γ) + λ_4; else, if ((μ_i4 − 1)/γ) × λ_4 ≤ π_i3, then π_i4 = π_i3; otherwise π_i4 = ((μ_i4 − 1)/γ) × λ_4.

  III. If μ_i5 ≥ 4, then π_i5 = (μ_i5 − Τ) × (100/γ) + λ_5; else, if ((μ_i5 − 1)/γ) × λ_5 ≤ π_i4, then π_i5 = (π_i3 + π_i4)/2; otherwise π_i5 = ((μ_i5 − 1)/γ) × λ_5.

The PACS maturity scores (π_i3–π_i5) calculated with this algorithm were then normalized and summed, weighting their contribution to the overall PACS maturity (see Fig. 4).
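To make the scoring concrete, a minimal Python sketch of the per-dimension algorithm is given below; the function and variable names are ours, and the closing normalization/weighting step toward an overall maturity score is only indicated, since its exact form is not spelled out above.

    def dimension_maturity(mu3, mu4, mu5,
                           lam3=200, lam4=300, lam5=400, tau=4, gamma=3):
        """Three-step maturity scoring for one organizational dimension.

        mu3..mu5 are the mean Likert scores (1-7) of the dimension's statements
        at maturity levels 3-5; the result is the tuple (pi3, pi4, pi5).
        """
        # Step I: level 3.
        if mu3 >= tau:
            pi3 = (mu3 - tau) * (100 / gamma) + lam3
        else:
            pi3 = ((mu3 - 1) / gamma) * lam3

        # Step II: level 4 (never scored below the level-3 result).
        if mu4 >= tau:
            pi4 = (mu4 - tau) * (100 / gamma) + lam4
        else:
            candidate = ((mu4 - 1) / gamma) * lam4
            pi4 = pi3 if candidate <= pi3 else candidate

        # Step III: level 5 (falls back to the mean of pi3 and pi4 if lower).
        if mu5 >= tau:
            pi5 = (mu5 - tau) * (100 / gamma) + lam5
        else:
            candidate = ((mu5 - 1) / gamma) * lam5
            pi5 = (pi3 + pi4) / 2 if candidate <= pi4 else candidate

        return pi3, pi4, pi5


    # Hypothetical example: level 3 clearly confirmed, levels 4 and 5 not (yet).
    pi3, pi4, pi5 = dimension_maturity(5.5, 3.2, 2.0)

    # One possible aggregation (an assumption, not the authors' exact formula):
    # normalize each score by the maximum attainable at its level and average.
    overall = (pi3 / 300 + pi4 / 400 + pi5 / 500) / 3
    print(pi3, pi4, pi5, round(overall, 2))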

Fig. 4 PACS maturity and alignment scores

Now that we have created monotonically cumulating maturity scales for each organizational dimension, we next define the overall alignment measure (α). Different methods are possible to calculate the differences between the maturity scores of the five organizational dimensions. In this study, we applied the difference between the maximum and minimum maturity scores of the five dimensions as the measure [33]. The smaller this difference, the better the alignment between the dimensions (see Table 5). Alternative alignment measures are the standard deviation of the five dimensional maturity scores or the minimum score (the “weakest link”). In practice, these alternative measurements for alignment strongly correlate with our initial “min–max” measurement.
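In code, the min–max alignment measure and the alternatives mentioned above reduce to a few lines; the sketch below assumes that the overall maturity scores per dimension have already been computed (e.g., with the scoring function sketched earlier), and the example values are hypothetical.

    from statistics import pstdev

    # Hypothetical overall maturity scores per organizational dimension.
    dim_scores = {"S&P": 310, "O&P": 240, "M&C": 290, "IT": 355, "P&C": 260}

    # Primary measure: the smaller the max-min spread, the better the alignment.
    alpha_min_max = max(dim_scores.values()) - min(dim_scores.values())

    # Alternative measures mentioned in the text.
    alpha_sd = pstdev(dim_scores.values())       # spread as standard deviation
    alpha_weakest = min(dim_scores.values())     # the "weakest link" dimension

    print(alpha_min_max, round(alpha_sd, 1), alpha_weakest)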

Table 5 PACS maturity and alignment scores

Table 5 shows the final and aggregated PACS maturity and alignment scores for both hospitals, including the mean score for PACS performance (β). Comparing these means allows us to explore the claim that high PACS alignment coincides with high PACS performance. Obviously, this can only be done cautiously, as we only have data on two hospitals.

From Table 5 it can be concluded that hospital B has a relatively lower α-score than hospital A, which implies that its PACS alignment (i.e., the degree of alignment and synergetic mechanisms between the several organizational dimensions) is better. Hospital B also has a higher β score than hospital A, implying better PACS performance. While hospital A has notably lower maturity scores on O&P and P&C, hospital B shows deficiencies in the M&C dimension. Hence, for hospital A, the gap between IT and O&P needs to be closed to reach optimal alignment (represented by the solid gray line in Fig. 4). Likewise, optimal alignment for hospital B is inhibited by the distance between the scores on the P&C and M&C dimensions. Items related to O&P have moderately low maturity scores, whereas the IT dimension demonstrates relatively higher maturity in both hospitals. This seems to reflect a (current) strong technology focus that might hinder a balanced perspective that also takes operational/organizational efficiencies into account.

During the pilot, we also collected qualitative information to evaluate our PISA framework and its underlying concepts/expectations. From the interviews, it became clear that both hospitals currently have a predominantly “local” (departmental) focus on PACS, rather than aiming at aligning PACS operations hospital-wide and beyond hospital boundaries. The following excerpt from a respondent from hospital B clarifies this view:

Our hospital has a clear local orientation towards operations with respect to PACS. Strategy and policy on a regional, cross-enterprise level is currently missing….

Also, the outcomes of this pilot suggest that both hospitals have improvement opportunities on several organizational dimensions. Our PISA framework supports hospitals in defining targeted investments that will improve PACS maturity and alignment.

Application in Practice

Now that we have described the basic concepts of PACS maturity and alignment, the next step is to glean some common and practical guidelines for PACS that will support hospital decision makers in deciding how to move in a certain direction (i.e., how to mature PACS), given the hospital’s specific resources and competencies and current system infrastructure. Hospital strategic planning processes are formed on the basis of internal, external, market-driven, and non-market-driven components [34] (i.e., they are thus situational). Improvement activities to realize a PACS maturity transition are likewise conditional on the given situation, such as the PACS’s current state of maturity and the specified strategic alignment direction. Thus, PACS maturity and alignment improvement programs depend on what is required to realize the transition toward higher levels of PACS maturity. As such, we believe that desired maturity levels cannot be achieved without conscientiously governing and addressing all process-focused elements and deployment activities at each of the intermediary maturity levels. This implies that careful consideration is required when moving toward a certain maturity level.

To define specific improvement activities—with accompanying investments—along the five organizational dimensions that best meet a hospital’s current and future needs, we suggest the following steps:

  • As a first step, assess the maturity, alignment, and performance of the currently deployed PACS (“as-is”) and determine the “to-be” situation using the PACS maturity model—and survey—involving multiple stakeholders (e.g., radiologists, technologists, PACS administrators, etc.).

  • The second step concerns performing a fit-gap analysis to determine whether the current PACS maturity level is a precursor to the to-be situation or whether the desired maturity level “leaps” over intermediary stages. At this stage, decision makers need to decide whether the improvement roadmap follows an incremental improvement process (stages follow one another by definition), radical change (i.e., introducing radical changes to processes without following the logic of monotonic sequential development), or a hybrid combination of the two. In terms of strategic direction, this decision determines whether the hospital structure and PACS process focus change or persist on previously chosen paths by retaining current strategies and structures (a simple sketch of such a gap analysis is given after this list).

  • In the third and final step, we suggest setting out all improvement activities and the deliberate investments required to achieve the desired level of PACS maturity and alignment. For this purpose, a set of measures can be defined and organized into projects that take into account the risks involved, investment costs, critical success factors, and benefits. During the execution of all (hospital-wide) activities, the level of alignment between the five organizational dimensions should be monitored to manage similarities, overlap, and synergy between the improvement projects in order to realize strategic objectives and optimal deployment of PACS.
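As a simple illustration of the fit-gap analysis in the second step, the sketch below compares hypothetical “as-is” and “to-be” maturity scores per organizational dimension and flags where the desired position leaps more than one maturity level ahead; all numbers, thresholds, and labels are illustrative assumptions, not prescriptions.

    # Hypothetical as-is and to-be maturity scores per organizational dimension,
    # e.g., derived from the current and future/preferred survey answers.
    as_is = {"S&P": 310, "O&P": 240, "M&C": 290, "IT": 355, "P&C": 260}
    to_be = {"S&P": 380, "O&P": 380, "M&C": 350, "IT": 420, "P&C": 380}

    POINTS_PER_LEVEL = 100  # each maturity level spans 100 points (see above)

    for dim in as_is:
        gap = to_be[dim] - as_is[dim]
        # A gap larger than one maturity level suggests a radical rather than
        # an incremental improvement path for this dimension.
        path = "radical" if gap > POINTS_PER_LEVEL else "incremental"
        print(f"{dim}: gap of {gap} points -> {path} improvement path")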

In practice, hospitals define their own improvement roadmaps as incremental, radical, or a combination of both. Based on the above steps and accompanying considerations, each optimization roadmap defines improvement projects that can be executed along the five organizational dimensions. Both the alignment and fit concepts that underpin the model imply that multi-disciplinary teams should be formed to deliver (at tactical and operational levels) the agreed objectives [2]. Subsequently, actions and results should be monitored using project management methods. Essentially, evolving toward a higher level of PACS maturity involves critically reflecting on the chosen path (i.e., direction) while continuously maintaining alignment between the business and IS/IT dimensions and improving overall PACS performance.

Discussion

Motivated by what appears to be an intricate process on which scientific sources seem scarce, this paper proposes an integrated situational framework that allows hospitals to empirically assess PACS maturity and alignment. By explicitly addressing a hierarchical order (“increasing complexity”) of survey items along the maturity scale, as well as the commonality and interrelationship of the stages of maturity, this paper avoided common pitfalls in business–IT alignment survey instruments and case research. Our study provided initial support for the basic claim that alignment and performance in the PACS domain coincide, thus providing practical mechanisms for decision makers to set goals, critically reflect on current systems, and strategically plan toward higher levels of PACS maturity and alignment.

In spite of its appeal, our framework has several limitations that call for caution in the interpretation of the findings. First, this study was based on only two hospitals, which inhibits the generalizability of the key results. However, by restricting the scope of the study, we were able to get an in-depth view of both hospitals’ operations. Second, this study did not provide any “best practices” that other hospitals can benefit from, nor does it provide extensive guidance for comprehensive strategic planning to obtain optimal PACS performance. Since the two studied hospitals differ in their respective PACS maturity and alignment scores, this could imply that the PISA framework may help elucidate best practices. Finally, there is a need to develop a more robust measure of maturity and alignment; the applied calculation may have inherent biases since it was partly based on a rule-based algorithm.

The above limitations suggest avenues for further research. First, it is our ambition to validate the PISA framework in a representative group of Dutch hospitals. This will allow the project team to examine extreme (high or low) scores and their respective impact on PACS performance by applying conventional analytical techniques, path analysis, and structural equation modeling to verify construct validity, factor reliability, and the overall goodness-of-fit of the model. Second, the underlying mechanisms through which PACS performance is achieved are by no means fully understood. Therefore, additional research is required to identify interaction effects—co-alignment [18]—of (latent) variables connecting maturity and alignment to PACS performance.

Conclusions

We argue that better PACS performance can be achieved by explicitly aligning the maturity scores on each of the five organizational dimensions, addressed simultaneously and hence from an integrated management perspective. The PISA framework demonstrates promising results, and the outcomes of this study support empirical application of the framework to hospital enterprises to gain insight into their PACS maturity and alignment. In practice, the PISA framework serves as a useful checklist to systematically identify improvement areas for hospitals in the PACS domain and is designed for further empirical research and application in clinical practice.