Impulse response function analysis of the impacts of hospital accreditations on hospital efficiency


Improving hospital efficiency is an emerging area of interest among policy makers in the global healthcare economy. To ensure accurate efficiency analyses, we consider the nonhomogeneous input/output characteristics of hospital departments, particularly the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine, which employ co-inputs to produce nonhomogeneous outputs. Specifically, we employ data envelopment analysis to evaluate the efficiency of 15 veterans hospitals in Taiwan. Empirical results show that the Department of Surgery performs better than the Department of Medicine and Department of Other Specialist Medicine. In addition, we apply another data science technique, namely, impulse response function analysis. The findings indicate that the “New Hospital Accreditation” introduced in 2007 and revised in 2011 improved the efficiency of all departments in the first year after each introduction. By contrast, the efficiencies of the Department of Surgery and Department of Other Specialist Medicine decreased in the second year after each introduction.


Increasing medical costs and an aging population have required governments and healthcare providers to pay increased attention to the productivity and quality of medical resources. The ineffective use of healthcare resources is one of the main causes of healthcare spending [1]. Therefore, enhancing the competitiveness and sustainable development of hospitals has become an important issue. Prior studies [2,3,4,5,6,7,8,9,10] have successfully used data envelopment analysis (DEA) to explore the efficiency of hospitals despite the multi-input and multi-output characteristics of hospitals. However, hospital departments have indivisible resource inputs and outputs and do not offer the same services. Consequently, a homogeneity hypothesis among decision-making units (DMUs) may not be applicable to all hospital departments.

The nonhomogeneous DMU problem can be addressed by dividing the set of DMUs into multiple groups, wherein all members of a group use the same inputs and produce the same outputs [12]. Thus, a separate DEA analysis can be performed for each group. In this study, we propose DEA-based models for assessing the relative efficiencies of a set of DMUs that belong to the same industry or organization but for which the requirement of homogeneity is relaxed. We then use these models to examine the efficiencies of hospital departments. Three structural assessment frameworks are established to measure the operating performance of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine of veterans hospitals in Taiwan via co-inputs and nonhomogeneous outputs.

These frameworks are applied to 15 veterans hospitals in Taiwan from 2002 to 2015 for the following reasons. First, the medical system of veterans hospitals has operated for more than 50 years. Over the years, the system has developed into three veterans general hospitals, 12 veterans hospitals, and 18 veterans homes of health groups. This system is among the most comprehensive and complete medical organizations in Taiwan and leads the field in medical service capacity and teaching research. Evaluation methods and standards should be improved to promote the quality of its medical services. Second, various problems related to accreditation have been raised given the rapid progress of medical treatments. Taiwan implemented its first hospital accreditation in 1978, becoming the fifth country in the world and the first in Asia to conduct a hospital evaluation. The Department of Health of Taiwan formally announced the “New Hospital Accreditation” in 2005 and implemented the evaluation of all hospitals in 2007. The assessment process was revised in 2011. Veterans hospitals have undergone accreditation for more than 10 years; however, in-depth discussion of the impact of hospital accreditation on the performance of veterans hospitals remains scarce. Therefore, this study applies the impulse response function (IRF) to examine how hospital accreditations contribute to hospital efficiencies, which are estimated using DEA.

This study makes two contributions. First, it is the first to establish three structural assessment frameworks to measure the operating performance of various departments, namely, the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine, in veterans hospitals in Taiwan. Co-inputs and nonhomogeneous outputs are simultaneously considered in a DEA performance measurement model to overcome the limitations of traditional measures caused by the indivisibility of resources and outputs among departments. This step facilitates an objective assessment of the performance of various hospital departments. Assessing the efficiencies of nonhomogeneous DMUs raises the problem of how to compare DMUs fairly because some DMUs may not have exactly the same “nature of business.” The literature on the medical industry mostly assesses the efficiency of hospitals or compares the efficiency of hospitals with different properties, but interdepartmental research on hospitals is very rare (except Li et al. [13]). Managers gain a comprehensive understanding of medical management situations after the implementation of the New Hospital Accreditation, which facilitates the improvement of operational efficiency.

Second, this study is the first to use the IRF method to identify the impacts of the New Hospital Accreditation implemented in 2007 and its revised second edition implemented in 2011. The impulse responses of the three departments of veterans hospitals were inspected separately as a reference for subsequent policy changes. Following Kapelko et al. [14], the present study uses IRF to evaluate the impulse responses of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine after the New Hospital Accreditation. Kapelko et al. [14] were the first to use IRF to determine whether the impact of policy implementation on operational efficiency is a deferred effect.

The remainder of this paper is organized as follows. Section 2 reviews related literature. Section 3 discusses the methodology. Section 4 presents the empirical tests and results. Section 5 presents the conclusion.

Literature review

Overview of the hospital accreditation system in Taiwan

According to the Taiwan Ministry of Health and Welfare, the main purpose of hospital accreditation is to ensure “patient rights” and “quality of medical services” for the proper medical care of the population. Accreditation comprises a self-evaluation by medical institutions and an external peer review process in accordance with established standards. These standards facilitate an accurate assessment of hospital performance and guide the implementation of programs to continuously improve the healthcare system. Hospital accreditation in Taiwan began to develop in 1978. The first teaching hospital evaluation by the Ministry of Education and Department of Health was completed in 1979.

The Quality Indicator Project (QIP) of the Maryland Hospital Association began in 1985. In 1992, the United Kingdom became the first non-US participant to join the project, which prompted the development of QIP into the International Quality Indicator Project (IQIP). The program regularly monitors indicators to help hospitals understand clinical results and improve the quality of medical care. Accordingly, IQIP won international recognition: more than 60 countries, including Canada, Austria, Germany, the Netherlands, Belgium, Australia, Singapore, and Taiwan, had implemented the program as of 2017.

In 1999, the Taiwan Ministry of Health and Welfare set up the Taiwan Joint Commission on Hospital Accreditation (TJCHA) with the ultimate goal of integrating and promoting a high-quality medical system. TJCHA initiated the Taiwan Quality Indicator Project (TQIP) in 2000 to pursue its mission of excellence in healthcare quality on the basis of the successful experiences and international benchmarking of IQIP and the Maryland Quality Indicator Project.

To align hospital accreditations with current requirements, the Department of Health formally announced the “New Hospital Accreditation” in 2005 and implemented this assessment in all hospitals in 2007. The new accreditation differed from past accreditations, which overemphasized the professional evaluation model: its starting point was changed to “patient centeredness,” and its evaluation criteria referenced those of the USA, Japan, and other countries. The criteria were integrated into eight chapters: 1) hospital management strategy and community function, 2) reasonable hospital management, 3) patient rights and safety, 4) complete medical system and operation, 5) appropriate medical work, 6) appropriate nursing care, 7) comfortable medical environment and care, and 8) human quality and quality promotion. The 508 accreditation items cover three major areas, namely, management, medical, and nursing. During the implementation of the New Hospital Accreditation, TJCHA continued to collect opinions and analyzed the degree to which the annual benchmark could be reached. The second edition of the hospital accreditation and teaching hospital accreditation benchmark was then formally implemented in 2011 to normalize the assessment. Fig. 1 shows the evolution of Taiwan hospital accreditation benchmarks.

Fig. 1

Evolution of the new hospital accreditation benchmark

Application of DEA in medical care

Nunamaker [15] and Sherman [16] provided early guides to DEA research in the healthcare field. Nunamaker published the first paper on the classification of nursing service efficiency. Sherman applied DEA to a group of teaching hospitals and found that the method is an effective means of identifying and measuring hospital inefficiency.

Several studies have focused on method comparison. Banker et al. [17] applied and compared DEA and translog models using a sample composed of hospitals in North Carolina. Linna [18] and Giuffrida and Gravelle [19] surveyed the efficiency of primary care services and hospitals with stochastic frontier analysis (SFA) and DEA. Worthington [20] reviewed research works on healthcare that adopted frontier efficiency measurements. Hollingsworth [21] summarized the latest development of DEA in the healthcare field.

Many studies have applied the DEA method to assess the technical and cost efficiency of medical institutions. Huang [22] measured the operating efficiency of 213 hospitals in Florida, USA, using the CCR model. Sexton et al. [23] evaluated the performance of 159 veterans hospitals. Valdmanis [24] assessed the productivity of public and private nonprofit hospitals in Michigan. Lynch and Ozcan [2] assessed the efficiency of 284 hospitals (158 Renmin Hospitals, 65 Air Force hospitals, 37 Army hospitals, and 24 Naval hospitals). White and Ozcan [25] explored the efficiency of church and nonprofit hospitals in California using the CCR model. Ozcan and McCue [3] constructed an aggregate measure of financial performance with the DEA approach from a random sample of acute care hospitals and compared this DEA measure with conventional ratio analysis. They concluded that the DEA index is an effective measure of overall financial performance. Chirikos and Sear [26] measured the efficiency of 186 outpatient hospitals in the USA.

Ferrier et al. [27] argued that the DEA method can accommodate the variety of inputs and outputs produced by hospitals. This diversity, especially in outputs, cannot be easily handled in econometric models of production. DEA requires only quantitative data, which are readily available in hospitals; hence, it is suitable for assessing hospital performance. In the DEA framework, performance is assessed with respect to an efficient frontier established by testing linear combinations of DMUs and defining the minimum input usage necessary to attain a given output level [28]. Barbetta et al. [29] adopted parametric (SFA and COLS) and DEA methods to identify behavioral differences between public and private nonprofit hospitals following the introduction of the DRG-based payment system in the Italian NHS from 1995 to 2000. Liu et al. [30] noted that the measurement of health system efficiency at various levels, from nursing homes or individual hospitals to the entire health system, has been the subject of ongoing debate in the health economics literature. The two most widely used approaches for assessing the efficiency with which various units deliver care are SFA and non-parametric DEA.

Veihola et al. [31] used DEA to analyze the relationship between discard rates and the technical efficiency of the blood collection production chain in the Blood Unit. Nayar and Ozcan [32] conducted a performance evaluation of hospitals in Virginia, USA, and found that technically efficient hospitals also exhibited good medical quality. Hollingsworth [21] pointed out that efficiency research in the healthcare field is mainly based on DEA and summarized the latest progress of DEA applications in healthcare.

Chu et al. [33] evaluated the operating efficiency of 417 private hospitals in Taiwan. The results showed that the performance of TQIP hospitals was better than that of united hospitals. The study suggested that hospital managers should consider joining a medical quality indicator program to further control costs and improve quality. Chang et al. [34] evaluated 31 regional hospitals in Taiwan that joined the medical quality indicator program (TQIP). The results show that the business performance of hospitals that joined the program was better than that of hospitals that did not.

Kounetas and Papathanassopoulos [35] used a double-bootstrap DEA approach to measure the performance of Greek hospitals and determine the factors that influence their efficiency, offering decision makers valuable input for the decision-making process. Dowd et al. [36] found that DEA provides a scalar index of “value,” suggesting that input efficiency may provide a useful scalar measure of value for a value-based payment system for physician services. The present study measures efficiency among hospital departments using nonhomogeneous DMUs to provide information about differences within hospitals. Operators can use this information as a reference for decision-making.

Ferrera et al. [37] extended the literature on evaluating efficiency in primary healthcare by jointly considering the influence of quality indicators and environmental variables in a case study and applying different extensions of the DEA methodology. Their findings support the importance of including both types of information in the analysis so that efficiency measures can be interpreted as an accurate reflection of performance. Castelli et al. [38] tested variations in hospital productivity in the English NHS using DEA and found substantial variation among English hospitals, suggesting room for productivity improvement. Gascón et al. [39] assessed the efficiency of a sample of 37 large pharmaceutical laboratories from 2008 to 2013 using a DEA method, linking laboratory transactions with R&D investment and DEA efficiency. They found that efficient laboratories make more financial business deals and that the relative size of each deal is larger.

In summary, DEA is suitable for assessing hospital performance, but the majority of the literature explores the performance of whole hospitals or compares hospitals with different properties. Only a few in-depth analyses have been conducted on individual departments. Furthermore, the resource inputs and outputs of hospital departments are inseparable, and each department has unique outputs, resulting in non-homogeneity. Therefore, traditional DEA methods cannot be directly applied.

This study adapted a DEA performance measurement model with co-inputs and nonhomogeneous outputs to overcome the limitations of traditional measures and explore the operational efficiency of various hospital departments. In addition, improving medical quality requires time and investment in the conversion of resources. However, most periods covered in the literature are short, which makes an objective assessment of efficiency changes difficult. The present study covers 14 years of data (2002 to 2015), which facilitates an objective performance assessment of various hospital departments.

Impulse response function

Existing DEA literature on operational efficiency has mostly used Tobit regression to test the impact of explanatory variables or the Malmquist productivity change index to analyze changes in productivity. Nonetheless, these approaches cannot reveal the deferred effect of a policy or system on operational efficiency. The IRF, which tracks the reaction of a system’s variables to impulses (shocks), can solve this problem. Jordà [40] showed that the impact of investment spikes in the post-spike period can be identified by specifying an IRF estimated with the local projections approach. IRF analysis has been adopted in many settings, for example, empirical macroeconomics. In particular, it has been used to examine the effects of banking and debt crises on GDP [41], of banking crises on unemployment [42], and of price responses in food supply chains [43].

Existing literature also shows that IRF has been used in many industries and is an appropriate method for assessing the impact of changes on business performance. This study introduces IRF analysis to veterans hospitals to separately investigate the impulse responses of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine after undergoing the “New Hospital Accreditation.” This study is the first to apply IRF to test operating efficiency changes after the implementation of a hospital accreditation. The goal is to provide a better understanding of the effectiveness and deferred effect of policy implementation as a reference for subsequent policies. IRF has the following advantages: “(1) they are more robust to misspecification, (2) they can be estimated by simple regression techniques, (3) they provide simple joint inference for impulse response coefficients, and (4) they can be easily extended to very nonlinear and flexible models” [14, p. 119].

Research design

Research objectives

Veterans hospitals belong to the medical institutions of the Department of Veterans Service and Assistance. The 15 institutions in the medical system are divided into the Taipei Veterans General Hospital, Taichung Veterans General Hospital, and Kaohsiung Veterans General Hospital systems. Table 1 shows the hospital systems and the corresponding names of the hospitals.

Table 1 Medical service hospital of veterans medical system

Fig. 2 depicts the conceptual research model used to achieve the objectives of this study. The model incorporates DEA and the impulse response function. As discussed, an efficiency analysis via DEA was performed first. This analysis involves two steps: evaluation of overall efficiencies and grouped evaluation by department. The IRF method was then applied to assess the impact of the New Hospital Accreditation on the efficiencies of the various hospital departments.

Fig. 2

Research conceptual model

Classification of medical sections and hospital inputs and outputs

Each medical department has unique areas, and its input resources and outputs are exclusive to it. This study is based on the classification used by the veterans general hospitals and on Chung et al. [44], who considered the Departments of Medicine and Surgery. Because this division did not classify other departments, the unclassified departments were centralized into the Department of Other Specialist Medicine in accordance with expert interviews. Data on veterans hospitals from 2002 to 2015 covered 630 DMUs. Certain divisions have a special nature, such as anesthesia, laboratory, and radiology: these sub-clinical services have no outpatients or inpatients, and the Division of Rehabilitation has only outpatients (no emergency or hospitalized patients). Because DEA requires DMUs with high homogeneity, these divisions were excluded. Table 2 shows the divisions excluded from the study.

Table 2 Classification of medical departments

The most commonly used model in medical quality measurement is the three-step structure–process–outcome framework [45], which is similar to the input–process–output framework used in manufacturing environments. Traditional medical quality assessment focuses on structural perspectives, such as equipment, hospital size, quantity of facilities, and quantity of services provided. The Health Care Financing Administration of the USA proposed the mortality rate of each hospital as a quality indicator in 1986. This outcome indicator became a new direction in medical quality assessment [46].

The present study refers to the literature on hospital efficiency assessment, such as Pilyavsky et al. [47], Chang et al. [48], Chang et al. [49], and Banker et al. [50]. Given the limitations of the available data, we chose six inputs, three co-outputs, and one individual output per department.

The input variables chosen for the analysis are the number of physicians, patient beds, other medical personnel, nurses, equipment, and floor area. The co-output variables are the number of patient days, patients receiving surgery, and net inpatient survivors. The number of outpatient visits at every department was considered the individual output variable. Table 3 lists the variables and definitions.

Table 3 Definition of hospital inputs and outputs

Cooper et al. [51] provide a rule of thumb for determining the number of DMUs: “The number of DMUs should be at least three times the total number of inputs and outputs.” Every department has 210 DMUs (15 hospitals over 14 years). The model has six inputs and four outputs: 210 > 3 × (6 + 4) = 30. Thus, the model constructed in this study satisfies this rule.
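The rule of thumb above is simple arithmetic; a minimal check with the counts taken from the text:

```python
n_hospitals, n_years = 15, 14
n_dmus = n_hospitals * n_years          # 210 DMUs per department
n_inputs, n_outputs = 6, 4
threshold = 3 * (n_inputs + n_outputs)  # Cooper et al. rule of thumb: 30
print(n_dmus, threshold, n_dmus >= threshold)  # → 210 30 True
```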

To clarify the non-homogeneity of the DMUs in the DEA, note that the main service outputs of the 15 veterans hospitals are days of stay, operations, net survivors, and visits. Departments with similar outputs were grouped into P groups, Np (p = 1, ..., P), where P = 3. As shown in Table 4, N1 is the Department of Medicine, N2 is the Department of Surgery, and N3 is the Department of Other Specialist Medicine. Rk (k = 1, ..., K) denotes a subset of outputs whose members are all outputs of exactly the same set of DMUs.

Table 4 Medical departments by DMU group outputs

Descriptive statistics

Table 5 provides the descriptive statistics of veterans hospital inputs and outputs from 2002 to 2015. In terms of the individual output, the maximum average number of outpatient visits (176,510) occurs at the Department of Medicine and the minimum (72,300) at the Department of Surgery.

Table 5 Descriptive statistics on inputs and outputs of hospitals (n = 210)


DEA model
DEA is a method for evaluating the relative efficiencies of a group of homogeneous DMUs that use the same input and output measures [11]. However, the hypothesis of homogeneity among DMUs may not hold for hospital departments, which do not all produce the same services. Cook et al. [12] first proposed DEA-based models for assessing the relative efficiencies of a group of DMUs in which the requirement of homogeneity is relaxed. These models were used in this study to evaluate the efficiencies of a set of hospital departments and thereby overcome the problem that different hospital departments provide different output services. The efficiency of a given department is determined in the following three stages:

Stage 1 Determine the split portion of input resources

Assume the hospital departments fall into P mutually exclusive groups, denoted by \( {\left\{{N}_p\right\}}_{p=1}^P \). Mutually exclusive output subgroups Rk(k = 1, ..., K) are then formed, where Rk represents the subset of outputs whose members all appear as outputs of the same set of departments. For the department under assessment, jo ∈ Npo, the portion of each input i to be allocated to each output subgroup \( {R}_k\in {L}_{N_{po}} \) must be identified. This proportion is denoted by \( {\alpha}_{i{R}_{kpo}} \). The multiplier model is expressed as the following fractional program, keeping the output level constant in the variable returns-to-scale (VRS) model:

$$ {\displaystyle \begin{array}{l}{\vartheta}_o=\min \sum \limits_{R_k\in {L}_{N_{po}}}{W}_{R_k jo}\left[\left(\sum \limits_i{v}_i{\alpha}_{i{R}_{kpo}}{x}_{ijo}+\pi \right)/\sum \limits_{r\in {R}_k}{u}_r{y}_{rjo}\right]\\ {}s.t.\end{array}} $$
$$ {\displaystyle \begin{array}{l}\sum \limits_{R_k\in {L}_{N_p}}{W}_{R_kj}\left[\left(\sum \limits_i{v}_i{\alpha}_{i{R}_{kp}}{x}_{ij}+\pi \right)/\sum \limits_{r\in {R}_k}{u}_r{y}_{rj}\right]\kern0.5em \ge 1,\\ {}\kern5em \forall j\in {N}_p,{R}_k\in {L}_{N_p},p=1,...,P,\kern0.75em \end{array}} $$
$$ {\displaystyle \begin{array}{l}\sum \limits_i{v}_i{\alpha}_{i{R}_{kp}}{x}_{ij}/\sum \limits_{r\in {R}_k}{u}_r{y}_{rj}\ge 0,\\ {}\kern5em \forall j\in {N}_p,{R}_k\in {L}_{N_p},p=1,...,P,\end{array}} $$
$$ {\displaystyle \begin{array}{l}\sum \limits_{R_k\in {L}_{N_p}}{\alpha}_{i{R}_{kp}}=1,\\ {}\kern5em \forall i,p=1,...,P,\end{array}} $$
$$ {\displaystyle \begin{array}{l}{L}_{i{R}_{kp}}\le {\alpha}_{i{R}_{kp}}\le {U}_{i{R}_{kp}},\\ {}\kern5em \forall i,{R}_k,p=1,...,P,\end{array}} $$
$$ {u}_r,{v}_i,{\alpha}_{i{R}_{kp}}\ge 0,\pi \in free. $$

Here, \( {\vartheta}_o^{\ast } \) is the optimal value of the overall efficiency of the target department, xij is the amount of input i to department j, yrj is the amount of output r to department j, vi is the weight given to input i, and ur is the weight given to output r. The basic idea of this model is to express the overall efficiency of a department as a convex combination \( \left({\sum}_{R_k\in {L}_{N_{po}}}{W}_{R_k jo}=1\right) \) of the efficiencies \( \left({\sum}_i{v}_i{\alpha}_{i{R}_{kpo}}{x}_{ijo}+\pi \right)/{\sum}_{r\in {R}_k}{u}_r{y}_{rjo} \) of individual subgroups Rk. From an accounting point of view, the proportion of inputs can be reasonably allocated to a subgroup to dictate the importance of that subgroup to the entire department. Thus, weight is defined as \( {W}_{R_k jo} \), which should be allocated to subgroup Rk as

$$ {W}_{R_k jo}={\sum}_{r\in {R}_k}{u}_r{y}_{rjo}/{\sum}_{R_k\in {L}_{N_{po}}}{\sum}_{r\in {R}_k}{u}_r{y}_{rjo}\cdot $$

Objective function (1a) shows that minimizing the weighted average of subgroup ratios is equivalent to maximizing the overall efficiency ratio of the target department. Constraint (1b) requires that the multipliers chosen for the target department, when applied to any other department, yield a corresponding ratio (of inputs to outputs) greater than or equal to unity.

Constraint (1c) shows that the resource-splitting variables \( {\alpha}_{i{R}_{kp}} \) must be chosen such that the efficiency ratio corresponding to the subset of outputs in Rk is nonnegative for the chosen multipliers vi, ur. Constraints (1b) and (1c) are similar, and one of them may be dropped. Constraint (1d) requires that the \( {\alpha}_{i{R}_{kp}} \) values allocated to the output subgroups Rk corresponding to any set p sum to unity for each input i. Finally, constraint (1e) sets lower and upper limits on the \( {\alpha}_{i{R}_{kp}} \) variables. Solving fractional program (1) yields the optimal values \( {\alpha^{\ast}}_{i{R}_{kp}} \).

Stage 2 Determine subgroup efficiencies

In stage 1, we derived the optimal proportions \( {\alpha^{\ast}}_{i{R}_{kp}} \) of each input to be allocated to the output subgroups Rk. We can then assign to subgroup Rk the corresponding amount of input xij, namely, \( {\widehat{x}}_{ijo}^k={{\alpha^{\ast}}_{i{R}_{kpo}}}^{\ast }{x}_{ijo} \). The conventional VRS DEA model can then be applied to each subgroup Rk of jo.

\( {M}_{R_k} \) is the set of all department groups that have Rk as a member, that is, \( {M}_{R_k}=\left\{{N}_p\ \mathrm{such}\ \mathrm{that}\ {R}_k\in {L}_{N_p}\right\} \). The DEA model is solved for each department jo and each subgroup Rko corresponding to the set Npo that includes jo as a component:

$$ {\displaystyle \begin{array}{l}{\vartheta}_{R_{ko jo}}=\min \sum \limits_{i=1}^m{v}_i{\widehat{x}}_{ij o}^k+\pi \\ {}s.t.\\ {}\sum \limits_{r\in {R}_{ko}}{u}_r{y}_{rj o}=1\\ {}\sum \limits_{r\in {R}_{ko}}{u}_r{y}_{rj}-\sum \limits_{i=1}^m{v}_i{\widehat{x}}_{ij}^k\le 0\\ {}\kern5.25em j\in {N}_p,\mathrm{for}\ {N}_p\in {M}_{R_{Ko}},\\ {}{u}_r,{v}_i\ge \varepsilon .\end{array}} $$
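The multiplier program above can equivalently be solved in its envelopment (dual) form, which is numerically convenient. The sketch below is a simplified illustration rather than the authors' exact implementation: it computes the conventional input-oriented VRS (BCC) efficiency of one DMU with `scipy`; in stage 2 the input matrix would hold the split inputs x̂ᵏ and the output matrix the subgroup outputs in Rk. The function name is ours.

```python
import numpy as np
from scipy.optimize import linprog

def vrs_input_efficiency(X, Y, j0):
    """Input-oriented VRS (BCC) efficiency of DMU j0 in envelopment form.

    X: (n, m) array of inputs; Y: (n, s) array of outputs.
    Returns theta in (0, 1], where 1 means efficient.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                          # sum_j lam_j x_ij <= theta * x_i,j0
        row = np.zeros(n + 1)
        row[0] = -X[j0, i]
        row[1:] = X[:, i]
        A_ub.append(row); b_ub.append(0.0)
    for r in range(s):                          # sum_j lam_j y_rj >= y_r,j0
        row = np.zeros(n + 1)
        row[1:] = -Y[:, r]
        A_ub.append(row); b_ub.append(-Y[j0, r])
    A_eq = np.zeros((1, n + 1))
    A_eq[0, 1:] = 1.0                           # VRS convexity: sum lam_j = 1
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun
```

For instance, with two single-input, single-output DMUs where the second consumes twice the input for the same output, the second DMU scores 0.5 and the first scores 1.0.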

Stage 3 Determine aggregate efficiencies

The overall efficiency score of department jo, defined in (4), is derived by taking a weighted average of the subgroup scores obtained in stage 2, using \( {W}_{R_{kjo}} \) defined in (2).

$$ T{E}_{jo}=\sum \limits_{R_k\in {L}_{N_{po}}}{W}_{R_k jo}{\vartheta}_{R_{kojo}} $$
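Stage 3 is a simple weighted average. A minimal sketch with illustrative values (not study data):

```python
import numpy as np

# Illustrative values only: the weights W_{R_k j_o} from Eq. (2) sum to 1
# over the subgroups of department j_o; theta holds the stage-2 scores.
W = np.array([0.6, 0.4])
theta = np.array([0.85, 0.92])
TE_jo = float(W @ theta)        # overall efficiency of department j_o, Eq. (4)
print(round(TE_jo, 3))          # → 0.878
```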

Impulse response function analysis

Prior studies [52, 53] examining reforms in the healthcare industry show that quality variables may or may not have an impact on efficiency. To assess the impact of a reform, an IRF analysis can be used to evaluate how the variables of interest respond to impulses of shock events. In this study, the IRF of hospital efficiencies to hospital accreditations (shock events) is defined as the difference between two predictions, consistent with prior studies [54]:

$$ \mathrm{IRF}(k)\equiv E\left[{y}_{t+k-1}|{d}_{t-1}=1,{y}_s,{d}_{s-1},s>t\right]-E\left[{y}_{t+k-1}|{d}_{t-1}=0,{y}_s,{d}_{s-1},s>t\right] $$

where k represents years after the occurrence of shock event dt; yt indicates the efficiency of a firm in year t; and E[.|.] specifies the best mean-squared-error predictor. This implies that the IRF is constructed from the best mean-squared multi-step predictions using an autoregressive estimation technique. However, the IRF estimated from Eq. (5) is sensitive to misspecification of the data-generating process and can be highly nonlinear. Therefore, following Kapelko et al. [14], this study estimated the following reduced-form ordinary least squares regression model with heteroskedasticity- and autocorrelation-robust standard errors:

$$ {y}_{t+k-1}={\delta}_{ok}+{\delta}_{ok}^{\ast }t+{\sum}_{r=1}^R{\delta}_{irk}{y}_{t-r}+{\sum}_{l=1}^L{\delta}_{2 lk}{d}_{t-l}+{\sum}_{l=1}^{k-1}{\gamma}_{2l}{d}_{t+k-1-l}+{v}_{tk}^{\ast } $$

where \( {\delta}_{ok} \) captures the firm fixed effect; \( {\delta}_{ok}^{\ast }t \) represents a time trend; \( {\sum}_{r=1}^R{\delta}_{irk}{y}_{t-r} \) is the lagged efficiency; \( {\sum}_{l=1}^L{\delta}_{2 lk}{d}_{t-l} \) indicates the lagged shock events; and \( {\sum}_{l=1}^{k-1}{\gamma}_{2l}{d}_{t+k-1-l} \) embodies shock events that occur within the forecast horizon k. The third sum corrects the aforementioned biases [54]. R is the number of lags for yt, L is the number of lags for dt, and \( {v}_{tk}^{\ast } \) represents residuals, which can be expanded as \( {v}_{tk}^{\ast }={\sum}_{m=1}^{K-1}{\gamma}_{3m}{u}_{t+k-1-m}+{u}_{t+k-1} \).
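The local-projection regression above can be sketched as follows. This is a deliberately simplified pooled OLS version (no fixed effects, no robust standard errors, no within-horizon correction term, which vanishes at horizon k = 1); the function and variable names are ours, not the authors':

```python
import numpy as np

def local_projection_irf(y, d, K=3, R=1, L=1):
    """Jorda-style local-projection IRF of efficiency y to shock dummy d.

    For each horizon k = 1..K, regress y[t+k-1] on a constant, a time
    trend, R lags of y, and L lags of d; the coefficient on d[t-1] is
    the impulse response at horizon k.
    """
    T = len(y)
    irf = []
    for k in range(1, K + 1):
        rows_X, rows_y = [], []
        for t in range(max(R, L), T - k + 1):
            lags_y = [y[t - r] for r in range(1, R + 1)]
            lags_d = [d[t - l] for l in range(1, L + 1)]
            rows_X.append([1.0, float(t)] + lags_y + lags_d)
            rows_y.append(y[t + k - 1])
        X = np.array(rows_X)
        yy = np.array(rows_y)
        beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
        irf.append(beta[2 + R])   # coefficient on d_{t-1} at horizon k
    return np.array(irf)
```

On simulated data generated as y[t] = 0.5·y[t-1] + d[t-1], the horizon-1 response recovers the shock coefficient of 1.0.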

Empirical results

The empirical analysis is divided into two parts. First, the DEA performance measurement model is used to measure the performance of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine. Second, impulse response analysis is applied to the veterans hospitals, and the impact of the New Hospital Accreditation is investigated separately through the impulse responses of the three departments.

Efficiency analysis

The study sample comprised 15 veterans hospitals, which were divided into three DMU subsets (N1 to N3) corresponding to the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine. Each subset has different outputs; for example, the DMUs in N1 produce outputs R1 and R2.

Following the method proposed in Section 3, Model (1) was adopted to solve the stage 1 problem and assign the inputs to the subunits that make up each DMU. To bound the values of α, expert interviews on the departments' relevant investment resources were conducted so that the apportioning would be representative, thereby improving the rationality and accuracy of this study. Survey data from a sample of the hospitals yielded the following ranges:

N1:(0.045,0.946), N2:(0.100,0.900), N3:(0.002,0.997).

The ranges vary by DMU subgroup Np. For example, DMU subgroup N1 contains two subunits (R1 and R2); thus, a minimum of 4.5% and a maximum of 94.6% of each input can be allocated to either subunit. Efficiency values for R1, R2, R3, and R4 were then obtained, with the weights converted in accordance with Model (2). However, this study covers 15 hospitals, three departments, and 14 years of data; the full list of DMU results is too large to report, so only the final results are presented.
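The role of the surveyed bounds can be illustrated with a minimal feasibility check: an allocation splits each co-input across a subgroup's subunits, with every share inside the subgroup's α range and the shares summing to one. Model (1) itself is not reproduced here; the candidate shares and helper function below are hypothetical:

```python
# Sketch: check whether a candidate split of one co-input across the two
# subunits (R1, R2) of DMU subgroup N1 respects the surveyed alpha bounds.
# The bounds are those reported in the text; the candidate shares are made up.
ALPHA_BOUNDS = {"N1": (0.045, 0.946), "N2": (0.100, 0.900), "N3": (0.002, 0.997)}

def is_feasible_allocation(shares, subgroup):
    """Shares must sum to 1 and each lie within the subgroup's alpha bounds."""
    lo, hi = ALPHA_BOUNDS[subgroup]
    return abs(sum(shares) - 1.0) < 1e-9 and all(lo <= s <= hi for s in shares)

print(is_feasible_allocation([0.30, 0.70], "N1"))  # True: both shares within (0.045, 0.946)
print(is_feasible_allocation([0.02, 0.98], "N1"))  # False: 0.02 falls below the lower bound
```

In the actual stage 1 problem, these bounds enter as constraints on the allocation variables rather than as an after-the-fact check.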

Table 6 shows the annual operating efficiency of R1, R2, R3, and R4, which declined from 2004. According to Chang et al. [34], this decline was due to severe acute respiratory syndrome (SARS), which originated in Hong Kong and China in 2003. In Taiwan, 664 SARS cases were reported, including 73 deaths and 346 infections. SARS had a substantial impact on healthcare and the economy in Taiwan, and people were loath to seek medical care at hospitals until 2004.

Table 6 Efficiency of hospital changes from 2002 to 2015

When the New Hospital Accreditation was implemented in 2007, the efficiency of R1, R2, R3, and R4 showed negative growth in the third and fourth years. Fig. 3 plots the trend of efficiency changes. It shows that the New Hospital Accreditation improved the efficiency of each output, especially R3 of the Department of Surgery, whose efficiency level exceeded that of the other outputs. In addition to the positive efficiency growth of R1 in 2013, R2, R3, and R4 showed positive growth from 2011 to 2012. To understand these efficiency changes, we collected and organized the changes in fixed assets and other accounts of the veterans hospitals from 2003 to 2015.

Fig. 3

Efficiency of hospital changes from 2002 to 2015

As shown in Table 7 (asset information for 2002 could not be collected), the hospitals generally began to increase investment in assets in 2004, and the amount of investment peaked in 2007. The sample also shows that the hospitals continued to appoint doctors, nurses, and other healthcare workers to serve patients under the policy, as shown in Figs. 4 and 5.

Table 7 Changes of assets (2003–2015) (unit: NT$)
Fig. 4

Changes in the number of physicians from 2002 to 2015

Fig. 5

Changes in the number of nurses and other personnel from 2002 to 2015

The "humanized" idea of the New Hospital Accreditation regards the patient as the core and calls for favorable indoor hospital environments that meet patients' physical, psychological, and social needs. The added regulation on "human resources and quality promotion" recommends counseling hospitals to recruit and train relevant medical staff. Hospitals therefore invest every year in strengthening patient-centered environments and process improvements to meet the high accreditation standards. In summary, continuous investment in hospital assets and healthcare workers raises costs, thereby reducing operating performance; this behavior is nonetheless consistent with the accreditation's emphasis on the quality of patients' medical services. Accordingly, when the "Second Edition of New Hospitals" was implemented in 2011, efficiency showed positive growth in 2012, which indicates that the New Hospital Accreditation helps improve hospital operating efficiency.

As shown in Table 8, among the individual outputs, R2 of the Department of Medicine is the most efficient, but the Department of Surgery is the most efficient in terms of R1 and overall efficiency; it can thus serve as a benchmark for the other departments. The overall efficiency ratings of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine were 0.854, 0.868, and 0.828, respectively, with the Department of Surgery showing the most balanced efficiency.

Table 8 Overall performance of various departments

As shown in Table 9, the proportions of total efficiency of the Taipei Veterans General Hospital, Taichung Veterans General Hospital, and Kaohsiung Veterans General Hospital systems are 20.54%, 9.52%, and 4.76%, respectively. The Taipei Veterans General Hospital system performs best and can serve as a benchmark for the other hospitals, with the Taichung Veterans General Hospital system ranked second. However, this result should be interpreted with caution: the gap in operating efficiency between the Taipei system and the other two systems is very large, and managers should investigate whether it stems from investment focus or system management.

Table 9 Performance of the Medical Systems

Impulse responses

The analysis treats the 2007 and 2011 introductions of the New Hospital Accreditation as shock events and examines the impulse responses of the operations of the various departments. As shown in Table 10, the operating efficiency of the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine significantly increased by 1.73%, 1.32%, and 3.69%, respectively, in the first year after the implementation of the new accreditation benchmark in 2007. The second edition of the accreditation was implemented in 2011; in the first year after its implementation, the operating efficiency of the Department of Medicine and Department of Surgery significantly increased by 1.77% and 1.32%, respectively, whereas that of the Department of Other Specialist Medicine significantly increased by 3.94%.

Table 10 Impulse responses of the three departments

The study found that the Department of Surgery and Department of Other Specialist Medicine saw significant positive efficiency improvements in the first year after each of the two accreditation reforms but significant reductions in the second year. Because of the limited data years, we cannot estimate subsequent changes in operating efficiency. Nevertheless, the results can help decision-makers determine whether performance is attributable to the implementation of the accreditation system or to hospital management. The Department of Medicine saw significant positive efficiency improvements in the first and fourth years after the two accreditations, which shows that the New Hospital Accreditation has a strong influence in improving the operational efficiency of the Department of Medicine.

These results are consistent with international experience that policy implementation has a deferred effect. Generally, the operating efficiency of the various departments improved in the long term. The results do not exactly match those of the DEA evaluation, but the inconsistencies are negligible.

Fig. 6 plots the impulse response estimates with 95% confidence intervals for the 2007 and 2011 impulses. The confidence intervals widen at long forecast horizons, indicating increasing uncertainty about the long-term impact of the accreditation shocks. The intervals also show that certain impulse responses are statistically insignificant, consistent with Table 10.
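Such bands follow the usual construction from the local-projection point estimates and their standard errors. A minimal sketch, using placeholder numbers rather than the study's reported estimates, shows how a 95% band flags (in)significance at each horizon:

```python
# Sketch: 95% confidence band for an impulse response path built from point
# estimates and HAC standard errors. The numbers below are placeholders,
# not the estimates reported in Table 10.
import numpy as np

irf_hat = np.array([0.0173, -0.0110, 0.0042, 0.0065])  # hypothetical IRF(k), k = 1..4
se_hat = np.array([0.0050, 0.0060, 0.0080, 0.0110])    # hypothetical HAC std. errors

lower = irf_hat - 1.96 * se_hat
upper = irf_hat + 1.96 * se_hat
significant = (lower > 0) | (upper < 0)  # band excludes zero -> significant at the 5% level
print(significant)
```

With these placeholder values only the first-year response is significant; the growing standard errors at longer horizons reproduce the widening bands seen in Fig. 6.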

Fig. 6

Impulse response estimates for the 2007 and 2011 impulse events, with 95% confidence intervals


Conclusions

Hospital performance evaluation in existing studies has been limited to comparisons across hospitals of different types; only a few studies have analyzed the operations of individual departments, so performance measurements of hospitals remain questionable. To evaluate hospitals and their departments accurately, this study established an evaluation structure for the Department of Medicine, Department of Surgery, and Department of Other Specialist Medicine. It adopted a DEA performance measurement model with co-inputs and nonhomogeneous outputs in place of traditional evaluation methods to measure the operating performance of the various departments of the veterans hospitals. The study thereby solves the problem of co-inputs and nonhomogeneous outputs, contributes to academic research on evaluating the performance of hospital departments, and addresses deficiencies in the literature.

The performance of the Department of Surgery is the most balanced and can serve as a model for the other departments, and the Department of Medicine outperforms the Department of Other Specialist Medicine. This study found that the asset investment of the veterans hospitals began to increase in 2004 and peaked in 2007; during this period, the hospitals also appointed additional healthcare workers. This strategy affected the performance of the various departments, but it is consistent with the "patient as the center" objective of improving the medical environment and expanding medical services. After the "Second Edition of New Hospitals" was implemented in 2011, the efficiency of all departments continued to improve.

This study is the first to apply impulse response analysis to veterans hospitals, treating the 2007 and 2011 introductions of the New Hospital Accreditation as shock events and inspecting the impulse responses of the three departments separately as a reference for subsequent policy. The efficiency of each department improved in the first year after the implementation of the new accreditation benchmark in 2007, but the Department of Surgery and Department of Other Specialist Medicine showed significant declines in the second year. In summary, the results are consistent with international implementation experience: a system or policy takes effect through an evolutionary process, so improvements in hospital efficiency have a deferred effect, and the operating efficiency of the various departments improves in the long term.

This study provides managers with a reference for policy formulation and improvement, through which gains in medical quality and operating efficiency can be achieved. The maintenance and improvement of medical quality should therefore be a focus of hospital management in pursuing competitive advantage; however, improving medical quality requires the conversion of resources. This study used long-term hospital data to analyze operational efficiency and help managers understand efficiency changes; managers should likewise take a long view and not be short-sighted in the face of efficiency changes. Future hospital research can incorporate medical "big data" to improve the analysis and prediction of medical-related information.


Notes

1. Data envelopment analysis (DEA) is a method for assessing the relative efficiencies of a set of decision-making units (DMUs) [11]. However, traditional DEA models generally assume that DMUs form a homogeneous set in which all DMUs share a common input and output bundle. This homogeneity assumption may not be applicable in other situations, especially in the healthcare industry.


References

1. Zhang X, Tone K, Lu Y (2018) Impact of the local public hospital reform on the efficiency of medium-sized hospitals in Japan: an improved slacks-based measure data envelopment analysis approach. Health Serv Res 53(2):896–918
2. Lynch JR, Ozcan YA (1994) Hospital closure: an efficiency analysis. J Healthc Manag 39(2):205
3. Ozcan YA, McCue MJ (1996) Development of a financial performance index for hospitals: DEA approach. J Oper Res Soc 47(1):18–26
4. Ozcan YA (2008) Health care benchmarking and performance evaluation. Springer, US
5. Ozcan YA, Lins ME, Lobo MSC, Da Silva ACM, Fiszman R, Pereira BB (2010) Evaluating the performance of Brazilian university hospitals. Ann Oper Res 178(1):247–261
6. Chou T-H, Ozcan YA, White KR (2012) Technical and scale efficiencies of Catholic hospitals: does a system value of stewardship matter? In: Advanced decision making methods applied to health care. Springer, pp 83–101
7. Ozcan YA, Legg JS (2014) Performance measurement for radiology providers: a national study. Int J Healthc Technol Manag 14(3):209–221
8. Narcı HÖ, Ozcan YA, Şahin İ, Tarcan M, Narcı M (2015) An examination of competition and efficiency for hospital industry in Turkey. Health Care Manag Sci 18(4):407–418
9. DePuccio MJ, Ozcan YA (2017) Exploring efficiency differences between medical home and non-medical home hospitals. Int J Healthc Manag 10(3):147–153
10. Ozcan YA, Khushalani J (2017) Assessing efficiency of public health and medical care provision in OECD countries after a decade of reform. CEJOR 25(2):325–343
11. Charnes A, Cooper WW, Rhodes E (1978) Measuring the efficiency of decision making units. Eur J Oper Res 2(6):429–444
12. Cook WD, Harrison J, Imanirad R, Rouse P, Zhu J (2013) Data envelopment analysis with nonhomogeneous DMUs. Oper Res 61(3):666–676
13. Li Y, Lei X, Morton A (2018) Performance evaluation of nonhomogeneous hospitals: the case of Hong Kong hospitals. Health Care Manag Sci:1–14
14. Kapelko M, Lansink AO, Stefanou SE (2015) Analyzing the impact of investment spikes on dynamic productivity growth. Omega 54:116–124
15. Nunamaker TR (1983) Measuring routine nursing service efficiency: a comparison of cost per patient day and data envelopment analysis models. Health Serv Res 18(2, Part 1):183
16. Sherman HD (1984) Hospital efficiency measurement and evaluation: empirical test of a new technique. Med Care 22(10):922–938
17. Banker RD, Conrad RF, Strauss RP (1986) A comparative application of data envelopment analysis and translog methods: an illustrative study of hospital production. Manag Sci 32(1):30–44
18. Linna M (1998) Measuring hospital cost efficiency with panel data models. Health Econ 7(5):415–427
19. Giuffrida A, Gravelle H (2001) Measuring performance in primary care: econometric analysis and DEA. Appl Econ 33(2):163–175
20. Worthington AC (2004) Frontier efficiency measurement in health care: a review of empirical techniques and selected applications. Med Care Res Rev 61(2):135–170
21. Hollingsworth B (2008) The measurement of efficiency and productivity of health care delivery. Health Econ 17(10):1107–1128
22. Huang Y-GL (1990) An application of data envelopment analysis: measuring the relative performance of Florida general hospitals. J Med Syst 14(4):191–196
23. Sexton TR, Leiken AM, Nolan AH, Liss S, Hogan A, Silkman RH (1989) Evaluating managerial efficiency of veterans administration medical centers using data envelopment analysis. Med Care 27(12):1175–1188
24. Valdmanis VG (1990) Ownership and technical efficiency of hospitals. Med Care 28(6):552–561
25. White KR, Ozcan YA (1996) Church ownership and hospital efficiency. J Healthc Manag 41(3):297
26. Chirikos TN, Sear AM (2000) Measuring hospital efficiency: a comparison of two approaches. Health Serv Res 34(6):1389–1408
27. Ferrier GD, Rosko MD, Valdmanis VG (2006) Analysis of uncompensated hospital care using a DEA model of output congestion. Health Care Manag Sci 9(2):181–188
28. Hua Z, Bian Y, Liang L (2007) Eco-efficiency analysis of paper mills along the Huai River: an extended DEA approach. Omega 35(5):578–587
29. Barbetta GP, Turati G, Zago AM (2007) Behavioral differences between public and private not-for-profit hospitals in the Italian national health service. Health Econ 16(1):75–96
30. Liu C, Laporte A, Ferguson BS (2008) The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations. Health Econ 17(9):1073–1087
31. Veihola M, Aroviita P, Kekomäki R, Linna M, Sintonen H (2008) Discarded cellular components and the technical efficiency of component preparation. Eur J Health Econ 9(4):325–331
32. Nayar P, Ozcan YA (2008) Data envelopment analysis comparison of hospital efficiency and quality. J Med Syst 32(3):193–199
33. Chu H-L, Wang C-C, Shiu SF (2009) Effect of participating in Taiwan quality Indicator project on hospital efficiency in Taiwan. J Health Care Finance 35(4):32–41
34. Chang S-J, Hsiao H-C, Huang L-H, Chang H (2011) Taiwan quality indicator project and hospital productivity growth. Omega 39(1):14–22
35. Kounetas K, Papathanassopoulos F (2013) How efficient are Greek hospitals? A case study using a double bootstrap DEA approach. Eur J Health Econ 14(6):979–994
36. Dowd B, Swenson T, Kane R, Parashuram S, Coulam R (2014) Can data envelopment analysis provide a scalar index of 'value'? Health Econ 23(12):1465–1480
37. Ferrera JMC, Cebada EC, Zamorano LRM (2014) The effect of quality and socio-demographic variables on efficiency measures in primary health care. Eur J Health Econ 15(3):289–302
38. Castelli A, Street A, Verzulli R, Ward P (2015) Examining variations in hospital productivity in the English NHS. Eur J Health Econ 16(3):243–254
39. Gascón F, Lozano J, Ponte B, de la Fuente D (2017) Measuring the efficiency of large pharmaceutical companies: an industry analysis. Eur J Health Econ 18(5):587–608
40. Jordà Ò (2005) Estimation and inference of impulse responses by local projections. Am Econ Rev 95(1):161–182
41. Furceri D, Zdzienicka A (2012) How costly are debt crises? J Int Money Financ 31(4):726–742
42. Bernal-Verdugo LE, Furceri D, Guillaume D (2013) Banking crises, labor reforms, and unemployment. J Comp Econ 41(4):1202–1219
43. Kuiper WE, Lansink AGO (2013) Asymmetric price transmission in food supply chains: impulse response analysis by local projections applied to US broiler and pork prices. Agribusiness 29(3):325–343
44. Chung H, Fang P, Bao C, Shih W (2008) Evaluating operative performance of clinical section in hospital using data envelopment analysis-a case of regional teaching hospital. J Healthc Manag 9(1):36–52
45. Donabedian A (1980) The definition of quality and approaches to its assessment (explorations in quality assessment and monitoring). Health Administration Press, Ann Arbor
46. Landon BE, Epstein AM (2001) For-profit and not-for-profit health plans participating in Medicaid. Health Aff 20(3):162–171
47. Pilyavsky AI, Aaronson WE, Bernet PM, Rosko MD, Valdmanis VG, Golubchikov MV (2006) East–west: does it make a difference to hospital efficiencies in Ukraine? Health Econ 15(11):1173–1186
48. Chang H, Chang W-J, Das S, Li S-H (2004) Health care regulation and the operating efficiency of hospitals: evidence from Taiwan. J Account Public Policy 23(6):483–510
49. Chang H, Cheng M-A, Das S (2004) Hospital ownership and operating efficiency: evidence from Taiwan. Eur J Oper Res 159(2):513–527
50. Banker RD, Charnes A, Cooper WW (1984) Some models for estimating technical and scale inefficiencies in data envelopment analysis. Manag Sci 30(9):1078–1092
51. Cooper WW, Li S, Seiford LM, Tone K, Thrall RM, Zhu J (2001) Sensitivity and stability analysis in DEA: some recent developments. J Prod Anal 15(3):217–246
52. de Castro Lobo MS, Ozcan YA, da Silva AC, Lins MPE, Fiszman R (2010) Financing reform and productivity change in Brazilian teaching hospitals: Malmquist approach. CEJOR 18(2):141–152
53. Kacak H, Ozcan YA, Kavuncubasi S (2014) A new examination of hospital performance after healthcare reform in Turkey: sensitivity and quality comparisons. Int J Public Policy 10(4–5):178–194
54. Teulings CN, Zubanov N (2014) Is economic recovery a myth? Robust estimation of impulse responses. J Appl Econ 29(3):497–514


Author information



Corresponding author

Correspondence to Wen-Min Lu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lin, F., Deng, Y., Lu, W. et al. Impulse response function analysis of the impacts of hospital accreditations on hospital efficiency. Health Care Manag Sci 22, 394–409 (2019).

Keywords:

  • Impulse response function
  • Data envelopment analysis
  • Hospital efficiency
  • Hospital accreditation
  • Nonhomogeneous departments

MSC Codes:

  • 68M20
  • 90C39