Among the high priority outcomes sought in reforming American health care are greater provider accountability, better processes of care and improved clinical outcomes, more satisfying care experiences for both patients and caregivers, and greater operational efficiency. Performance measurement is an essential tool for implementing strategies aimed at achieving these goals.
In this issue of the Journal, Powell and colleagues describe a number of unintended consequences of implementing performance measurement in the Department of Veterans Affairs (VA) Health Care System,1 an early adopter of system-wide performance management.2–5 While there are significant limitations in generalizing the findings of this retrospective qualitative study, it is an important contribution to the growing body of evidence documenting the complexities of health care performance measurement and a poignant reminder that performance measurement is a tool that cuts both ways.
Powell et al. report the findings of 59 semi-structured, in-person individual interviews with primary care staff members and facility leaders at four VA facilities of varying size and performance levels.1 They found that local implementation of VA’s national performance measurement system led in some instances to the provision of inappropriate clinical care, decreased provider attention to patient concerns and service, and compromised patient education and autonomy, as well as to some adverse effects on primary care team dynamics. They also observed notable variation among the facilities in how performance data were shared with front-line clinicians, in strategies to improve performance, and in the application of rules. Concerns about the burden of reporting, the clinical importance of some measures, the inflexibility of automated clinical reminders, and inequity in allocating financial rewards for improved performance were also commonly voiced in the interviews. While their study was not designed to determine the circumstances that led to the unintended consequences, they noted that in many cases the problems appeared to stem from local implementation methods rather than from the nationally determined performance measure definitions and policies. They observed both positive and negative unintended consequences, but describe only the negative effects in this report.
Performance measurement is a tool widely used in diverse industries to monitor progress toward identified goals. Its use in health care is growing but remains limited, both because health care’s infrastructure for supporting measurement processes and applying measure results is poorly developed and because the science of health care performance measurement is still nascent. Current health care reform efforts portend far greater use of performance measurement.
Multiple perils and pitfalls of performance measurement have been identified in recent years, especially in the development and selection of measures, data collection, reporting, and use of results.6–16 As with any tool, the outcomes achieved with performance measurement depend in significant part on the specific ways it is used. Prominent recurring themes in the evolving literature on health care performance measurement systems are the need for clear performance measurement objectives, assiduous attention to measure specifications (including how individual patient circumstances and preferences will be addressed), and tight linkage of measures to outcomes or to clinical processes known to be connected to outcomes.6,8–12 The critical importance of the context within which measurement results will be used has also become clear.7–11 The types of unintended consequences described by Powell et al. are not unique, and they emphasize the importance of local implementation in shaping how well broad performance measurement requirements inspire innovation and drive quality improvement.8–11,16
Measurement and public reporting of performance results were key features of the new performance management system implemented in the Veterans Health Administration (VHA) in 1995 as part of a multi-pronged change initiative.2–5 This integrated change strategy led to rapid and dramatic improvements in quality of care, service satisfaction and operational efficiency,2–5,17–23 prompting the Institute of Medicine, among others, to recommend that many VA quality management practices be broadly adopted in U.S. health care. However, the VHA’s leadership has turned over five times in the last 11 years, and the way performance measurement has been used during this time has changed. The changed context of performance measurement in the VA Health Care System is noteworthy because performance measurement drives behavior, and the specific ways measures are used will drive different behaviors.7–11
Initially, VHA’s performance measurement system sought to drive adoption of evidence-based clinical and management practices, encourage innovation in care processes for conditions not yet having best practices, and nurture an organizational culture of high reliability and continuous improvement.2–5,23 A limited number of performance measures were linked to key strategic goals explicitly specified in annual negotiated performance contracts with regional Veterans Integrated Service Network (VISN) directors, who in turn implemented performance contracts with local facility managers and clinical leadership.3–5,23 The performance contracts and the associated performance measurements were used primarily in a prospective manner that emphasized data-driven process improvement and forward-looking change. Achieving high levels of performance on the measures was a clearly desired outcome, but the measures were understood to represent a static and incomplete representation of performance in general and quality of care in particular.
Goals specified in the performance contracts were often designed to be “stretch” goals, and the VISN and facility directors were given significant autonomy to innovate and design strategies tailored to their particular needs, resources and environments of care.4,5,23 The VISNs were viewed as innovation laboratories for many areas of desired improvement.24 While challenging, the goals were completely or substantially attained more often than not.4,5 The combination of VA’s compelling mission, the clarity of performance expectations, healthy intra-organizational competition, substantial local autonomy and a sense of professional fulfillment appeared to provide a powerful incentive package that drove rapid improvement. Multiple non-financial awards and modest financial awards for managers were used to recognize accomplishments,4,5 but individual practitioners had no direct financial incentives. In this regard, it is important to note that the financial resource allocation (payment) system used by the VHA at the time did not support the objectives of the transformation strategy, and for this and other reasons a new tiered, capitation-based global payment system was implemented in 1997 to better align system finances with desired outcomes.2–5
When system-wide performance measurement was originally implemented in the VA, clinicians were widely engaged in selecting the measures and improvement strategies, and administrative and clinical managers worked closely to implement them. Illustrative of how this broadly collaborative approach was applied to a specific condition was the VHA’s participation in the Diabetes Quality Improvement Project (DQIP) that yielded marked improvements in care for diabetes.20,25,26
In recent years, performance measurement in the VHA has become more retrospectively focused on compliance with centrally promulgated national policies and performance goals. These policies and associated performance measures have often been directed by VA leadership and national program directors, sometimes in response to external demands stemming from untoward outlier events, with little or no input from front-line practitioners, coordination with other reporting requirements, or flexibility to tailor local improvement strategies. The number of performance measures has markedly increased, and there has been a tendency to use less sensitive dichotomous or composite measures.27 Congress authorized a form of pay-for-performance system for VA clinicians in 2004 that directly linked performance results with personal economic benefit, albeit modest by private sector standards.
The combination of a compliance-focused performance measurement climate and the opportunity for personal economic benefit has placed much greater emphasis on measure results, which appears to have dampened willingness to innovate and to have encouraged behaviors aimed at inflating performance measure results.28 Some VHA providers have opined that “obedience to the measures” has become the paramount objective. These and other developments have been viewed with concern by many VHA leaders, who see them as undermining the culture of continuous quality improvement that had taken root in the VA Health Care System and as signaling a return to the dysfunctional, centralized command-and-control management methods of the old VA. To what extent the changed climate of performance measurement in the VHA may have contributed to the behaviors observed by Powell et al. is unclear. Current VHA leadership has informally acknowledged these concerns and expressed a desire to re-orient the performance measurement system.
Performance measurement is an important and powerful tool that has the potential to drive marked improvements in American health care generally, as it did in the VA Health Care System in the late 1990s. However, as reported by Powell et al., in addition to producing intended positive improvement, performance measurement may also produce unintended negative consequences. Whether positive or negative effects predominate will depend especially on how performance measure results are used. Successful performance measurement systems seek to maximize desired beneficial outcomes and minimize unwanted negative effects by clearly defining performance goals, measuring what matters, ensuring stakeholder involvement at every level of the performance measurement process, coupling top-down or externally imposed measurements with flexibility to address local circumstances, and assuring an organizational climate that uses measure results to facilitate learning and continuous improvement.
References
Powell AA, White KM, Partin MR, Halek K, Christianson JB, Neil B, et al. Making the grade: unintended consequences of implementing a national performance measurement system into local practice. J Gen Intern Med. 2012. doi:10.1007/s11606-011-1906-3.
Kizer KW, Dudley RA. Extreme makeover: transformation of the Veterans Health Care System. Annu Rev Public Health. 2009;30:1–27.
Edmondson EA, Golden BR, Young GJ. Turnaround at the Veterans Health Administration. N9-607-035. Boston, MA: Harvard Business School. 2006.
Trevelyan EW. The Performance Management System of the Veterans Health Administration. Harvard School of Public Health Case Study. Cambridge, MA: Harvard School of Public Health. 2002.
Thibodeau N, Evans JH, Nagarajan NJ, Whittle J. Value creation in public enterprises: An empirical analysis of coordinated organizational changes in the Veterans Health Administration. Acc Rev. 2007;82:483–520.
Goddard M, Davies HTO, Dawson D, Mannion R, McInnes F. Clinical performance measurement: part 1 – getting the best out of it. J R Soc Med. 2002;95:508–10.
Goddard M, Davies HTO, Dawson D, Mannion R, McInnes F. Clinical performance measurement: part 2 – avoiding the pitfalls. J R Soc Med. 2002;95:549–51.
Adair CE, Simpson E, Casebeer AL, Birdsell J, Hayden KA, Lewis S. Performance measurement in healthcare: part II – State of the science findings by stage of the performance measurement process. Healthcare Pol. 2006;2:56–78.
Bevan G, Hood C. What’s measured is what matters: targets and gaming in the English public health care system. Public Service Programme Discussion Paper Series No. 0501. December 2005.
Freeman T. Using performance indicators to improve health care quality in the public sector: a review of the literature. Health Serv Manag. 2002;15:126–37.
Werner RM, Asch DA. Clinical concerns about clinical performance measurement. Ann Fam Med. 2007;5:159–63.
Kerr EA, Krein SL, Vijan S, Hofer TP, Hayward RA. Avoiding pitfalls in chronic disease quality measurement: A case for the next generation of technical quality measures. Am J Managed Care. 2001;7:1033–43.
Heidenreich PA, Sahay A, Kapoor JR, Pham MX, Massie B. Divergent trends in survival and readmission following a hospitalization for heart failure in the Veterans Affairs Health Care System 2002 to 2006. J Am Coll Cardiol. 2010;56:362–8.
Pronovost PJ, Lilford R. A road map for improving the performance of performance measures. Health Affairs. 2011;30:569–73.
Walter LC, Davidowitz NP, Heineken PA, Covinsky KE. Pitfalls of converting practice guidelines into quality measures: lessons learned from a VA performance measure. JAMA. 2004;291:2466–70.
Lester HE, Hannon KL, Campbell SM. Identifying unintended consequences of quality indicators: a qualitative study. BMJ Qual Saf. 2011;20:1057–61.
Jha A, Perlin JB, Kizer KW, Dudley RA. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. N Engl J Med. 2003;348(22):2218–27.
Trivedi A, Matula S, Miake-Lye I, Glassman PA, Shekelle P, Asch S. Systematic review: comparison of the quality of medical care in Veterans Affairs and Non-Veterans Affairs settings. Med Care. 2011;49(1):76–88.
Trivedi AN, Grebla RC. Quality and equity of care in the Veterans Affairs Health-Care System and in Medicare Advantage health plans. Med Care. 2011;49(6):560–8.
Kerr EA, Gerzoff RB, Krein SL, Selby JV, Piette JD, Curb JD, et al. Diabetes care quality in the Veterans Affairs Health Care System and commercial managed care: the TRIAD study. Ann Intern Med. 2004;141(4):272–81.
Asch SM, McGlynn EA, Hogan MM, Hayward RA, Shekelle P, Rubenstein L, et al. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Ann Intern Med. 2004;141(12):938–45.
Keating NL, Landrum MB, Lamont EB, Bozeman SR, Krasnow SH, Shulman LN, et al. Quality of care for older persons with cancer in the Veterans Health Administration versus the private sector. A cohort study. Ann Intern Med. 2011;154(6):727–36.
Baker GR, MacIntosh-Murray A, Porcellato C, Dionne L, Stelmacovich K, Born K. Veterans Affairs New England Healthcare System (Veterans Integrated Service Network 1). In: High Performing Healthcare Systems: Delivering Quality by Design. Toronto, ON, Canada: Longwoods Publishing; 2008.
Kizer KW. The “New VA”: a national laboratory for health care quality management. Am J Med Qual. 1999;14:3–20.
Sawin C, Walder D, Bross D, Pogach L. Diabetes process and outcome measures in the Department of Veterans Affairs. Diabetes Care. 2004;27(Suppl 2):b90–4.
Fleming BB, Greenfield S, Engelgau MM, Pogach LM, Clauser SB, Parrot MA. for the Diabetes Quality Improvement Project Group. The Diabetes Quality Improvement Project. Diabetes Care. 2001;24:1815–20.
Pogach LM, Rajan M, Aron DC. Comparison of weighted performance measurement and dichotomous thresholds for glycemic control in the Veterans Health Administration. Diabetes Care. 2006;29:241–6.
Jha A. Recommendations from the Under Secretary for Health's International Roundtable on Clinical Quality and Patient Safety. Washington, DC. Veterans Health Administration. April 2009.
Kizer, K.W., Kirsh, S.R. The Double Edged Sword of Performance Measurement. J GEN INTERN MED 27, 395–397 (2012). https://doi.org/10.1007/s11606-011-1981-5