
A Suggested Model for Integrating Community Indicators with Performance Measurement: Challenges and Opportunities

  • Patria de Lancer Julnes
  • Cheryle Broom
  • Soyoung Park
Case Article

Abstract

In this article, we discuss a retrospective study that examined the integration of community indicators (CI) with performance measurement (PM) systems. The Community Indicators Consortium (CIC) promoted integration through its maturity model and related projects and research endeavors. Interviews with knowledgeable individuals from organizations the CIC considered exemplars of integration suggest that political leadership and collaboration with stakeholders may act as both drivers of and challenges to integration. The information derived from CI-PM integration efforts is used primarily for external reporting and, in some cases, helps to inform budget decisions. Reported impacts of integration include greater trust and credibility, changed behavior within the agencies involved, and changes in service delivery. Though most of the integration efforts studied continue to advance, there is no guarantee of their sustainability, and at least one of the efforts has ceased. New perspectives from some interview participants suggest acknowledgment of the aspirational value of CI-PM integration. We conclude with recommendations for further exploration to assess integration's potential as a framework for citizen engagement that gives citizens greater influence over public policy while improving public service outcomes.

Keywords

Community indicators; Performance measurement; Community engagement; Sustainability

Community Indicators and Performance Measurement at Crossroads

Community indicators projects are concerned with making communities sustainable and supporting policy change (Dluhy and Swartz 2006; Innes and Booher 1999; Murphy-Greene and Blair 2004). In 1987 the United Nations’ Brundtland Commission, which focused on preventing environmental degradation caused by economic development, defined sustainability as meeting “‘the needs of the present, without compromising the ability of future generations to meet their own needs’” (Basiago 1995, p. 109). However, a broader understanding of sustainability has since emerged that takes into account the full spectrum of factors that have an impact on communities. As such, community indicator projects gather and analyze quantitative data to report on indicators that show past, current, and future trends reflecting the interplay between the social, environmental, and economic factors affecting a region’s or community’s well-being (Phillips 2003, p. 1).

A more recent movement, which some argue is superior to community sustainability efforts (Ahern 2011), is concerned with making communities resilient. Calls for resilience strategies focus on building communities’ ability to withstand and absorb sudden shocks and disturbances, and to recover and transform while maintaining basic functions and identity (Giannakis and Bruggeman 2017; Shaw 2012). These stresses and shocks may be natural or human-made disasters and events (Sharifi 2016). One key element of a resilience strategy is the capacity to make decisions about programs intended to build and support resiliency. As with community indicators, this capacity for decision-making requires information about the achievement of goals and strategies. In other words, like community well-being and policy change efforts, resilience initiatives also need data on specific indicators to help assess goal attainment and strategic success.

Performance measurement, the ongoing collection, monitoring, and analysis of quantitative data about program inputs (resources), outputs (products or activities), and outcomes (results), can help meet the information needs of community indicator projects and other community-level initiatives such as resilience efforts. Performance measurement efforts often focus on the clients that programs serve, “targeting what is to be measured, selecting the indicators [e.g., client level input, outputs, outcomes, etc.], collecting and analyzing the data, and reporting the results” (Pasha 2018, p. 215). Though they follow a similar process, community indicator projects and resilience strategies focus exclusively on community-level outcomes. In spite of this difference, Greenwood (2008) believed that bringing together performance measurement and indicators at the community level could improve the use of both by decision makers and citizens and lead to better community-level outcomes.

It was this promise of improved use and outcomes that led the Community Indicators Consortium (CIC) to start a new initiative in 2006 to understand, encourage, and support the integration of community indicators with performance measurement (CI-PM) through its integration project (CIC 2007). Created in 2004, CIC is a network of organizations and individuals engaged in the development, dissemination, and understanding of community indicators worldwide. Its work is rooted in the “belief that information sharing, collaboration, and open dialogue are key to advancement of people, the quality of life, and the sustainability of our shared environment” (Farnam and Holden 2014, p. 1063).

To promote integration, CIC (2010) developed a construct that it named a “maturity model,” described below (with further detail in Table 1), that shows the what, why, when, where, and who required to move CI and PM from separate endeavors (Stage 1) to full integration (Stage 4) (CIC 2010). For CIC, mature integration of CI and PM occurs when community indicators drive a government’s or agency’s performance measurement efforts in a transparent, results-based governance and decision-making system that is consistent with citizens’ priorities (CIC 2010).
Table 1

Suggested Community Indicators-Performance Measures (CI-PM) integration (maturity) model

What

  • Stage 1 (separate CI and PM projects):
    CI: Metrics quantifying values, community conditions, outcomes, and results important to wide-ranging residents within a community
    PM: Metrics documenting the outputs of services provided by a government or nongovernmental entity
  • Stage 2:
    CI: Visioning process involving citizens, key stakeholders, and governmental and nongovernmental entities
    PM: Linkages between strategic and annual performance planning, with metrics documenting the outputs and outcomes of services provided by a government or nongovernmental entity
  • Stage 3:
    CI: Metrics focus on community and programmatic outcomes involving decision- and policy-makers through consensus building
    PM: Citizens and other key community stakeholders participate in the development of output and outcome metrics through forums, feedback systems, or advisory bodies
  • Stage 4 (mature CI-PM integration): Citizen-driven CIs determine PM impacts linked to quantifiable and measurable results: community needs, sustainability, resource allocation, data-driven policy decisions, and next steps for decision options and priorities

Why

  • Stage 1:
    CI: Knowledge-producing story of where a community is today, in relation to where it has come from
    PM: Knowledge-producing managerial performance system to achieve efficiencies and improve costs in relation to programs and departments/divisions/agencies
  • Stage 2:
    CI: Evidence of citizen priorities as reflected by indicators, through credible and reliable data that stimulate public dialogue and debate
    PM: Improved data and performance-based budgeting and resource allocation through credible and reliable data; more effective service delivery and return on investment to citizens demonstrated in allocating limited resources at all levels
  • Stage 3:
    CI: Evaluation and public debate determine the whys of community conditions, the strategies developed and implemented, and the resources identified and committed to improve community conditions
    PM: Outcomes of programs and services demonstrated that reflect citizens’ and other key stakeholders’ priorities
  • Stage 4: Transparent results-based governance and decision making consistent with citizen priorities for positive community change, community capacity building, economic development and land use, sustainability, and reporting for citizen accountability and civic trust

When

  • Stage 1:
    CI: Historical measures and trends over time alerting the need for improvement
    PM: Annual performance measures
  • Stage 2:
    CI: Leading and lagging indicators benchmarked for measurable improvement or decline
    PM: Annual measures’ progress linked to annual budget development and decisions
  • Stage 3:
    CI: Long-term and annual goals and targets established periodically, with progress measured and publicly reported
    PM: Strategic and annual performance goals influence budget discussions and decisions; community indicators influence the strategic and annual performance goals; progress toward these goals is publicly reported
  • Stage 4: Evidence on demand (knowing where residents get their information, what their priorities are, and what information they want to know about) within defined time periods, used for regular tracking and strategic decision making

Where

  • Stage 1:
    CI: Defined community area
    PM: Local government departments/divisions within a defined geopolitical boundary
  • Stage 2:
    CI: Defined community area and demographic groups
    PM: Contributions of programs and services to changes in community conditions identified along with those of other public agencies, nongovernmental entities, and businesses
  • Stage 3:
    CI: Defined community area, demographic groups, neighborhoods and street-level data, or larger state and neighboring/regional areas crossing political boundaries; compared to other defined areas, as appropriate
    PM: Regional and intergovernmental collaboration and comparisons
  • Stage 4: Localized neighborhoods to any defined group within a geographic area, regardless of political boundaries

Who

  • Stage 1:
    CI: Community residents likely to control or influence community conditions
    PM: Government entity management
  • Stage 2:
    CI: Key community stakeholders from all sectors (diverse institutional and business leaders, civic and community groups, local government, policy- and decision-makers) likely to control or influence community conditions
    PM: Policy-makers and elected officials
  • Stage 3:
    CI: Coalitions, networks, compacts, and other community organizing efforts form to lead community change
    PM: Networks of community stakeholders including government and nongovernment entities, legislative bodies, regional partners, and coalitions
  • Stage 4: Committed accountability for improvements and collaborative advocacy, inputs, and use (diverse government, public/citizen, nonprofit, and business engagement and participation at all stages) as leadership changes over time

Prepared by the Community Indicators Consortium, March 31, 2010

CIC has recognized a number of government and nonprofit organizations in the United States and abroad for their CI-PM integration efforts. However, we do not know to what extent integration has progressed and what factors may be responsible for this progression. In this article, we describe the first systematic study, a retrospective study, of the process of integration and the results of these efforts in the entities that CIC has recognized as exemplars of integration. The study provides insights into the feasibility of integrating CI and PM and into whether integration can achieve the benefits promoted by its vocal supporters. Below we provide background on CIC and describe the maturity model. We then present the theoretical framework that guided the study, the methodology, and the findings. We conclude with suggestions for advancing the practice and understanding of CI-PM integration.

Toward Integration

As early as 2005, CIC was exploring the value of clear linkages between community indicators and performance measures. A three-year grant in 2009 from the Alfred P. Sloan Foundation provided CIC an opportunity to undertake research on CI-PM integration. CIC established a work group of experts, an awards program for exemplary practices, case studies, a repository for relevant data, and outreach and dissemination activities such as conferences and webinars (Farnam and Holden 2014). The work group initially drafted a CI-PM maturity model as a way of describing what characterized CI-PM integration efforts that were identified and what full integration might mean. To refine and better understand CI-PM integration, the draft model was discussed and presented for input at conferences and other venues and was also enhanced by analyzing the efforts of entities that were recognized as exemplary for incorporating features of CI-PM integration. The resulting model was intended to be descriptive rather than prescriptive and it remains a draft pending further research, which CIC has not undertaken.

Draft Maturity Model

CIC believed that CI and PM could not only complement each other but could also be combined into a more powerful tool for citizens and decision makers to effect policy change. The draft maturity model was a key component of CIC’s pursuit of understanding, if not advancing, CI-PM integration, and CIC was optimistic about the potential benefits of integration (CIC 2007). Development of and enhancements to the draft maturity model were a means of clarifying features and levels of integration. Moreover, in a number of conferences and workshops, the work group, conference attendees, and communities identified as practicing some integration could use the model to build common knowledge of this evolving phenomenon. As indicated in its 2007 report to Sloan (CIC 2007, p. 4), the work group identified potential benefits of linking indicators and measures, including:
  • providing evidence of government service performance improvement related to citizens’ priorities, as reflected by their indicators;

  • increasing citizens’ confidence in their local government’s progress towards societal goals reflected by the indicators and measures;

  • enhancing the use of data by citizens and public officials for public debate, decision-making, and allocation of scarce resources; and,

  • increasing the clarity of the contributions made or needing to be made by various sectors of society to improve community conditions.

To explain the movement of CI and PM toward integration, CIC’s proposed maturity model progresses on a continuum from two separate endeavors (Stage 1) to what would be considered full or mature integration (Stage 4) (CIC 2010). Through consensus, the work group framed its work-in-progress model to provide definitions that illustrated the best practices of CI and PM integration endeavors at that time. Table 1 depicts the draft model.

In Stage 1, the what consists of two separate systems of metrics: (1) community indicator metrics quantifying values, conditions, and outcomes important to residents in the community, and (2) performance measurement metrics documenting the outputs of services provided by a government entity. In Stage 4, by contrast, citizen-driven CIs determine the desired impacts that a PM system is working toward. As Greenwood (2008) reflected, how shall the twain meet?

Responding to the how, Broom and Lomax (2012) argued that an integrated CI-PM system that will ultimately help achieve a community’s desired outcomes should comprise at least five components: engagement, strategic planning, alignment, evaluating and reporting, and information use. Engagement means that citizens are equal partners and take part in setting priorities through a process such as community visioning, which is similar to strategic planning in an organizational setting (Bryson 2011). Moreover, the goals, targets, and measures developed need to be consistent with community aspirations and indicators (Martin and Kettner 2010). In addition, integration requires that government or other decision-makers align the budget with the strategic plan so that agreed-upon actions to achieve community-related goals can be successfully implemented.

Monitoring achievement of performance targets, which track back to the CIs, is a critical step, as it allows for regular data collection, analysis, and reporting. This helps to determine whether community conditions are improving and what may be contributing to those results. This component also provides accountability to the community and evidence for the public and decision-makers to use in policy-making and budget decisions. The process is iterative in that the community, policy makers, and implementers build on what they have learned through this regularly reported information.

The entities used as cases for this study seemed to have applied many or some of the components of the maturity model. They did so in ways that were unique to their circumstances, to varying degrees, and with varying results. Our study relied predominantly on interviews, available public information, and CIC’s research. At the outset, we knew these CI-PM integration efforts had been undertaken before CIC formulated the draft maturity model. Thus, the model did not drive the questions explored here; instead, the processes and outcomes of integration efforts were of interest. However, after the initial data collection we used the draft maturity model as a guide for estimating where the entities studied would fall on the continuum of CI-PM integration depicted by the model. Therefore, conclusions about the apparent level of integration are preliminary and require further investigation.

Theoretical Framework

Both CI and PM face common challenges that could derail efforts to integrate them. These include a long-standing lack of use of the information derived from these systems, methodological difficulties, and the difficulty of properly representing the values that citizens and other stakeholders hold (CIC 2007; de Lancer Julnes 2009; Innes and Booher 1999; Smith et al. 2008). Community indicators reflect “a high-level view of the world,” and performance measures reflect a “lower-level view” (Greenwood 2008, p. 59). Community indicators also tend to be highly individualized to reflect community characteristics, which can make their widespread adoption and application difficult (Njuki et al. 2008).

In addition, the integration of CI-PM creates greater levels of complexity and requires coordination. As with the levels of coordination required for performance measurement systems, integration calls for inclusion of a wider array of individuals, organizations, activities, and relationships (Ditillo et al. 2015; Miller et al. 2008).

Due to the dearth of research on the use of community indicators for decision-making to achieve community-level impact and engagement, the most appropriate literature for gaining an understanding of the drivers, challenges, and process of integration of CI-PM is provided by the performance measurement literature. Particularly relevant for understanding the barriers and opportunities that exist for CI-PM integration is the model of utilization of performance measurement developed in de Lancer Julnes and Holzer (2001), and further elaborated in de Lancer Julnes (2009). Developed from an empirical study, their model is the first to explain the factors that affect adoption and implementation of performance measurement systems (de Lancer Julnes 2009; de Lancer Julnes and Holzer 2001).

According to de Lancer Julnes and Holzer (2001), adoption means that performance measures have been developed. That is, the organization has developed a performance measurement system that contains program performance information, which provides a capacity for action. Implementation means that the knowledge derived from the system is used in some fashion: the knowledge is converted into action (de Lancer Julnes 2009). The utilization model articulates the drivers and hindrances encountered when developing information systems, such as performance measurement and community indicators, that purport to provide information for decision-making, planning, improvement, accountability, and reporting.

Two major categories of factors influence the utilization of performance measures: rational/technocratic and political/cultural factors (de Lancer Julnes and Holzer 2001). Due to space limitations, here we provide only a brief description of the relevant factors encompassing these categories.

Rational /Technocratic

In de Lancer Julnes (2009), the rational/technocratic factors represent organizational capacity. Building organizational capacity requires the availability of resources, access to training on performance measurement, an explicit focus on goal attainment, and formal internal or external requirements to develop a performance measurement system (de Lancer Julnes and Holzer 2001). Without this capacity, organizations may not be able to acquire the hardware and software required to house such systems, for example. Likewise, they may not be able to hire or train personnel with the adequate knowledge to collect, analyze, and report on performance or community indicator data. Nevertheless, while necessary, this capacity is not sufficient for implementation to occur (de Lancer Julnes 2009). Implementation, defined by de Lancer Julnes and Holzer (2001) as knowledge converted into action, requires additional organizational support. This is because of the inherent risks associated with exposing unflattering program performance information (de Lancer Julnes 2009; Nielsen and Jacobsen 2018).

Politics/Culture

The impact of politics and culture has often been neglected or only vaguely dealt with in studies on the utilization of knowledge, and particularly on the implementation of performance measurement systems. Willis et al. (2003) echoed this sentiment in a report detailing a comparative analysis of the impact of CompStat, the accountability tool used by many police departments to fight crime. The authors stated that, as with any other managerial tool, the extent of CompStat’s success depends on how the tool “interacts with the capacity, work routines, cultural values and management styles of the police departments” (Willis et al. 2003, p. v).

For de Lancer Julnes and Holzer (2001), the political and cultural factors include openness to change, as demonstrated by a positive attitude toward change and by reward systems that support risk taking. They also include internal interest groups, such as management and non-management employees, supporting performance measurement efforts. Moreover, de Lancer Julnes and Holzer (2001) found that having external interest groups, such as citizens demanding accountability and elected officials calling for performance measurement, could lead to implementation.

Relatedly, the process of leadership helps to set the political and cultural context for the utilization of performance measurement. Leaders model change and encourage the development of the motivation to change (Argyris and Schon 1996). They model the “openness, risk taking, and reflection necessary for learning” (Cummings and Worley 2009, p. 543). Organizations whose leaders exhibit these behaviors are considered learning organizations because they tend to focus “on developing and using its information and knowledge capabilities in order to create higher-valued information and knowledge, to change behaviors, and to improve bottom-line results” (King 2001, p. 14). Thus, these leadership characteristics may be particularly necessary for organizations that seek to develop and use information systems to drive change (Nielsen and Jacobsen 2018).

Methodology

The study presented here used an exploratory approach to understand the process, challenges, drivers, and achievements of a new phenomenon: the integration of CI and PM. We used a purposive sample in that we studied organizations considered exemplars by the Community Indicators Consortium (CIC). We contacted all the organizations for which the CIC had developed “real stories” (six entities) detailing their integration efforts, regardless of whether their success had been sustained. We also contacted the most recent CIC integration award winners and those with honorable mentions (three entities1,2).

We gathered data by means of in-depth telephone interviews conducted from July to September 2013 with key representatives who were responsible for doing the work related to CI-PM integration. Interviews with individuals knowledgeable about the subject in question help the researcher understand “complex phenomena within their contexts” (Baxter and Jack 2008, p. 544) and discern the “how” and “why” of the situation under study. We also culled information from the stories on the CIC website and from documentation publicly available on each of these entities’ websites. Additionally, Cheryle Broom, co-author of this manuscript, had firsthand knowledge of these integration efforts, as she served as co-chair of the CI-PM integration work group.

We conducted 11 interviews, lasting approximately one hour; there were two different interviews for two of the entities studied. Appendix Table 3 lists the participants and entities studied. The interview protocol consisted of open-ended questions. These questions were guided by the analytical framework and what we learned from CIC’s “real stories.” The questions examined several domains including: 1) drivers for integration; 2) the process of integration (actors, challenges, etc.); 3) how the system is used; 4) its impact; and 5) sustainability.

Analysis of Interview Data

Interviews were recorded, transcribed, and coded for analysis. We used NVivo (version 9.0) to identify repetitive patterns and conduct descriptive analysis. To do this, we identified key concepts based on the literature and coded each sentence or paragraph into multiple categories for in-depth analysis. A draft of the study report was submitted to the participants for their review; they agreed with our interpretation and representation of the interview data.

Findings

Based on the description outlined in the draft maturity model, we believe most of the entities we studied fell at or between Stages 2 and 3 of the continuum the model depicts. Some had a more robust community indicators or community engagement component; the Jacksonville Community Council and Truckee Meadows Tomorrow fell into this category. Formal connectivity of community priorities to government performance measurement and budgeting systems was strongest in Calgary, Albuquerque, and Virginia. King County was working on a countywide strategic planning system, which included public input, to align countywide priorities with its internal performance management program. Broward County Children’s Services’ CI-PM integration efforts, while more focused than the six examples above, achieved some integration with its process. Similarly, Whole Child Leon Healthy Infant Partnership had some positive results but limited integration. Moreover, unlike the others we studied, it was not sustained.

Table 2 presents a summary of findings from the interview data. It shows the dimensions we used for our inquiry as well as the percentage of times each theme (factors and outcomes of integration) appeared during the interviews. The drivers of CI-PM integration efforts and their sustainability highlight the importance of the cultural and political factors described in the literature (e.g., Cummings and Worley 2009; de Lancer Julnes 2009), as exemplified by the number of times leadership and collaboration were mentioned during the interviews.
Table 2

Summary of findings

Dimension analyzed: theme (number of comments, percentage)

  • Drivers: political leadership (12, 48%); collaboration (8, 32%); organization needs (5, 20%)
  • CI-PM uses: reporting and monitoring (9, 39%); linking goals and indicators (8, 35%); budget considerations (6, 26%)
  • Impact of CI-PM: changes in attitudes and behaviors (10, 56%); changes in service delivery (8, 44%)
  • Challenges and sustainability: political leadership (22, 32%); collaboration (21, 31%); process (15, 22%); resources (10, 15%)

The most critical factors are those mentioned most often during the interviews.
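The counts and percentages in Table 2 are simple descriptive tallies of coded interview comments. As an illustration only (the actual coding was done in NVivo; the data structure and function name below are hypothetical, and the tallies are seeded with the published counts rather than the raw transcripts), the per-dimension percentages can be reproduced as:

```python
from collections import Counter

# Hypothetical coded data: per dimension, how many interview comments
# were assigned to each theme (counts taken from Table 2).
coded_comments = {
    "Drivers": Counter({"Political leadership": 12, "Collaboration": 8,
                        "Organization needs": 5}),
    "CI-PM Uses": Counter({"Reporting and monitoring": 9,
                           "Linking goals and indicators": 8,
                           "Budget considerations": 6}),
}

def theme_percentages(tally):
    """Convert per-theme comment counts into whole-number percentages
    of that dimension's total comments."""
    total = sum(tally.values())
    return {theme: round(100 * n / total) for theme, n in tally.items()}

for dimension, tally in coded_comments.items():
    print(dimension, theme_percentages(tally))
```

Note that because each percentage is rounded independently, a dimension's percentages may not sum exactly to 100, although for the counts reported here they do (e.g., 39 + 35 + 26 = 100 for CI-PM uses).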

Drivers for Integration of Community Indicators and Performance Measurements

Key drivers for developing functional CI-PM systems vary, but the three most critical, based on a total of 25 comments, are political leadership (12 comments, or 48%), collaboration (8, or 32%), and organization needs (5, or 20%).

Political Leadership

Refers to the political will to support integration, including buy-in and leadership from top elected officials and key staff, such as senior managers (de Lancer Julnes and Holzer 2001). For Virginia, the only state where governors are not permitted to run for consecutive re-election, this means that after each election proponents need to secure, rather quickly, the support of the governor, who serves as chair of the Council on Virginia’s Future. This council was created by legislation at the urging of business leaders and others who wanted to see the government do a better job of tackling and measuring longer-term issues. To fulfill this need, they created a “performance leadership and accountability system for the state government – Virginia Performs” (Council on Virginia’s Future 2014). Also critical for Virginia is the support of the top leadership of the State House and Senate, who are part of the Council along with six or seven business or community leaders from around the state and two members of the Cabinet: the Secretary of Finance and the Governor’s Chief of Staff. The Council is supported by a staff of five, including a director and a deputy director.

In King County, this impetus came from some council members, one of whom later became county executive. With elected official interest in the legislative and executive branches, lead staff collaborated on developing legislation and approaches to pursue a countywide vision for addressing public priorities strategically. Moreover, other county public officials also had representatives serve on a countywide work group led by the Auditor’s Office.3 This approach provided an opportunity for participation of all elected officials’ offices in the planning process. That King County was able to accomplish such collaboration is significant given that the county has 14 elected officials in the legislative and executive branches plus 78 elected judges in the judicial branch.

Collaboration

Relates to public involvement in partnerships with diverse governmental and nongovernmental actors (e.g., nonprofits, businesses) to develop community indicators that reflect the community’s conditions and citizens’ priorities, and to use the information to drive decisions. In the City of Jacksonville, the Jacksonville Community Council Inc. (JCCI), which runs the longest-standing community-driven community indicators project in the U.S. (created in 1985), is a model of collaboration. JCCI partners with other groups in the community, such as the Chamber of Commerce, the city council, the school superintendent, the head of the transportation authority, and numerous other business and nonprofit leaders, to discuss what is important and which indicators should be measured. This type of collaboration, driven by an independent community indicators project, was also observed in Nevada.

Truckee Meadows Tomorrow (TMT) originated in a partnership with a private nonprofit formed to respond to legal mandates. The partnership was an outgrowth of a regional planning effort that began in the 1980s and expanded in 1991 to include quality-of-life indicators in the local government plans mandated by state law (Besleme et al. 1999). The goal of the effort was to guide development and to ensure an “‘orderly management of growth for the region’” (Marsh 2012, p. 5). This desire led the Truckee Meadows Regional Planning Agency to partner with the Economic Development Authority of Western Nevada’s community-based nonprofit, Truckee Meadows Tomorrow, to “Create and promote public consensus on the concept of quality of life…to assist in economic development efforts” (Besleme et al. 1999, p. 23). TMT was able to operate with funding provided by grants, especially from the Washoe Health System, a private nonprofit corporation. Since taking the lead on the community’s CIs, TMT “has developed community indicators through extensive public engagement and Washoe County has ensured the indicators are integrated into their performance measurement system,” said Karen Hruby, former executive director of TMT.

Instances of intense collaboration with multiple actors were also evident in integration projects led by government. Examples include the Council on Virginia’s Future and King County, described earlier. Another example is Albuquerque, where in 1998 the mayor introduced a resolution, unanimously supported by the council, to establish the Indicators Progress Commission (IPC). This was two years after the city had published its first progress report on community conditions and indicators, and 23 years after a revision to the City charter that “mandated a link between annual operating budgets and the city’s long-term goals” (Schnaible and Shogry 2012, p. 4). The IPC is composed of 12 city residents charged with overseeing a process for developing community goals and desired community conditions, and then measuring the City’s progress toward those ends.

In Calgary, probably the furthest along toward the CIC’s ideal integration model, the Office of Sustainability, under the imagineCalgary Plan, developed its 100-year community vision through a process of strong public engagement in which more than 18,000 citizens participated. This high-level plan was then used to help align the work of the city all the way down to departments’ business plans. To bridge the city’s three-year budget cycle with the 100-year vision plan, “We have created a 10-year plan, our 2020 Sustainability Direction, that has objectives and targets and indicators in there that are trying to help us achieve that line of sight to our community targets,” stated Carolyn Bowen, at the time manager of the city’s Office of Sustainability. This community vision was then integrated with land use and transportation plans.

Organization needs

Represents a rational/technocratic factor that refers to attaining organizational goals (de Lancer Julnes 2009). For example, Denise Carbol, senior planner in Citywide Planning and Design in the City of Calgary, felt that integration was necessary because of the need to connect population growth and financial costs. Ms. Carbol stated, “We have to bring the financial numbers forward to the Council and to the citizens of Calgary to let them understand the cost of growth.” For Calgary, a key goal for integration was balancing growth and sustainability goals from the “triple bottom line” perspective adopted by the City, which entails “an approach that considers economic, social, environmental, and smart growth and mobility implications in the decision-making processes” (City of Calgary 2014, n.p.n.).

Having a strategic plan and a mission statement also seemed to encourage integration, as a strategic plan can serve as a vehicle for integrating PM with CI by linking long-term to short-term goals. This is what the ‘imagineCalgary’ plan works toward by using both performance measures and community indicators; the system links the city’s 100-year vision to a 10-year plan (the 2020 Sustainability Direction) with related goals and targets.

Likewise, in King County, Chantal Stevens, who at the time of the interview was principal performance management analyst in the Auditor’s Office and member of the CIC Board, said that the county developed a strategic plan with mission and goals based on public input. The county had earlier established a system to track agency goals and targets (AIMs High). That system needed to be broadened to include tracking community priorities reflected in the strategic plan. King County’s Deputy Director for Performance, Michael Jacobson, outlined an approach for moving from the “desires of our community that we are trying to achieve to a set of strategies for how we are going to achieve them.” He indicated that organizationally King County needed to establish levels of plans in its management system down to the business plan tactical level. At that point, agency budgets are proposed, which are supposed to reflect that they “need to deliver this kind of program and performance,” said Mr. Jacobson.

Uses and Impact of Integration

Questions on the uses of the information from CI-PM integration yielded 23 comments. Reporting and monitoring was mentioned most often (9 comments, or 39%). The second most mentioned use was linking goals and indicators (8, or 35%). Supporting budget considerations was mentioned least (6, or 26%). The relatively small number of comments is not surprising. Previous studies have shown a tendency by government entities toward symbolic use of the information (e.g., Charbonneau and Bellavance 2015; de Lancer Julnes 2009), and critics such as Perrin (2015) often bemoan the lack of evidence that performance measurement influences decision-making.

Although participants offered little elaboration in terms of specific examples of uses or impacts, for some (JCCI, for example) monitoring provides insights on what is or is not working. In addition, reporting results publicly helps to provide transparency, accountability, and a sense of ownership. Some participants also suggested that reporting the information might increase community understanding of, and interest in participating in, community indicators projects. JCCI’s Ben Warner stated that this openness could lead to trust in the information. This trust, said Mr. Warner, is reflected in the fact that stakeholders in Jacksonville use the information produced by JCCI. For Karen Hruby from TMT, reporting the status of achieving a performance measure that is linked to an indicator may also boost the chances of the results being taken into consideration for budget allocation.

The second most mentioned use of the information refers to linking the indicators to already established goals and targets. In Albuquerque, for example, the IPC holds a goals forum every four years to confirm desired community conditions. These goals become the top tier of the city’s performance measurement and budgeting system. For example, a top tier goal is human and family development. This has the sub-goal that residents are to be literate and educated. A related community indicator is adult educational achievement rates. This is linked to performance measures for city services such as tracking circulation and visitation at libraries and involvement in reading programs. As described by Chris Payton, former Executive Budget Analyst in Albuquerque, this example shows “full integration between the performance measures and community indicators.”

Similarly, King County adopted a framework with community priorities, the driver for its strategic plan goals and AIMs High performance measurement tracking and reporting program. One of these priorities is environmental sustainability; the goal is to safeguard and enhance King County’s natural resources and environment. It has four objectives for which performance data such as reduction of greenhouse gas emissions is regularly tracked.

The fewest comments on use of information were about linking the information from the integrated system to budget requests to support budget decisions. One example is the Children’s Service Council of Broward County (CSC), a special purpose district that receives funding from property taxes. The council allocates resources based on its strategic plan. According to CSC’s president and CEO Cindy Arenberg Seltzer, the plan “provides the credible linkage between community indicators and performance measures.” In addition, this linkage helps to provide justification to support grant applications. In another example provided by the JCCI participant, Jacksonville’s sheriff used community indicators to illustrate how budget cuts and reduction in the number of sworn officers per capita, along with increased costs, led to higher response times. The sheriff used that information to propose restoration of some positions in his budget.

In most cases, however, the influence of community indicators and performance measures remains rather tenuous. In spite of current expectations for outcome-based and performance-based budgeting, participants felt that it would be very difficult to get to that point, even when CI-PM integration efforts are backed by legal mandates. Nevertheless, some of the participants said that their entities continue to attempt to link CI and PM to the budgeting process and reflect some, albeit limited, progress in making that connection (e.g., Albuquerque, Virginia, and King County). In general, most participants agreed that, understandably, budget decisions are driven primarily by political considerations, as each elected official has his or her own priorities and constituents to respond to.

The questions about the impact of CI-PM integration generated 18 comments. The reported impacts were primarily related to changes produced in attitudes and behavior of government officials and other stakeholders (10 comments), as well as changes in service delivery (8 comments).

With regard to attitudes and behavior, one of the participants from Calgary stated that the integration effort has helped staff in the agencies to understand how their projects contribute to the community as a whole. They now look at the impacts of their decisions from a broader, organization-wide perspective rather than an individual department perspective. An example given was that of the waste and recycling group, who “are not just concerned about achieving the waste target, but also concerned about how it influences the environment and broader society.”

Likewise, the participant from TMT said that before the CI-PM integration system, departments in certain agencies did not talk to each other, but now more of them pay attention to indicators and work together to examine the collective impacts of their individual department’s actions. She said:

We’re providing environmental health spraying for mosquitoes. Typically, decision-making to do that was strictly for environmental health decision-making. Now they look at it in terms of what it means for other health elements such as how it impacts people’s ability to continue to make it to their jobs instead of staying home because they are sick.

In another example, Cindy Arenberg Seltzer of CSC in Florida said that their data collection methodology, analysis, and reporting approaches, the results of which are shown in their budget document, have helped build the council’s credibility. In turn, this has translated into attracting funding for their programs and those of their providers and, thus, improved outcomes for the recipients of their programs. As an example of this she discussed the creation of the Kinship Care services. Seltzer explained:

Kinship Care is really a work of art. It is actually a really good example of community indicators turning into action… Our school board members at the time were seeing a lot of kids coming to the school system who are living in kinship care situations and have no support. We did some research to get our numbers; we held focus groups around the community to see what the needs were and what was really happening. We found that indeed there was a problem. We had a lot of grandparents taking care of grandchildren…[who] just didn’t know what services they were eligible for; they were having challenges with the legal system [determining] whether they had legal custody or not of the children…. Research showed that there wasn’t much out there that had been done…. So, we pulled pieces of what was working in different aspects and we had some very clear ideas of what we wanted to accomplish. And that’s [how kinship care was created and has] been a core service that we’ve done for 10 years now.

In Virginia, the Virginia Performs initiative has focused on root causes of social problems and helped push through legislation to alleviate them. The deputy director, Gerald Ward, said that this took some work internally, as agencies needed to take more ownership of “these things that they normally say, ‘I can’t be responsible for obesity rates, or teen pregnancy rates, or infant mortality rates, for example, because there are so many factors that impact that.’” He also added that getting to that point requires a continuous process of dialogue and education and an understanding that if “this is important for the State you need to find a way to execute this role.” This is a good example of integration, said Mr. Ward, in terms of “people lining things up…discussions are different now…. the agency is taking ownership for things like high school graduation rate, dropout rates, teen pregnancy rates, smoking and things like that.”

Dr. Barbara Markiewicz, the consultant who led the Whole Child Leon Healthy Infant Partnership, provided another example of changes resulting from integration efforts. This two-year effort was created by the Healthcare Advisory Board appointed by the county commissioners at the urging of a physician. The physician was interested in a comprehensive child health plan that would eliminate racial disparities and improve child health outcomes. Although the initiative is no longer active, Dr. Markiewicz said that the county now offers quarterly free screening for younger children, and some healthcare providers who had not talked to each other in the past are currently doing so.

We also found some documented evidence of community-level outcomes. Washoe County integrated TMT’s literate community indicator in the community’s library system (Marsh 2012). Accomplishments from the library’s program efforts include helping more than 1500 individuals meet literacy, educational and employment needs.

Challenges and Sustainability of Integration Efforts

During the interviews we learned of significant challenges to sustaining integration efforts, and some of our participants felt that full integration (as defined to date) remains elusive despite significant progress (e.g., Albuquerque, King County, and TMT). In one instance the effort could not be sustained (Whole Child Leon), and in another it was recently reenergized (TMT). The questions generated 68 comments.

As before, political leadership (22 or 32%) and collaboration with citizens and other stakeholders (21 or 31%) were the most salient factors. The process of integration itself including the lack of understanding of what drives outcomes generated 15 comments (22%). Resources to support the efforts generated only 10 comments (15%). Below we discuss the top three factors. In our estimation resources, which include having adequate financial means, staff to implement the system, and the appropriate expertise, are directly or indirectly related to the other factors (de Lancer Julnes 2009).

Political Leadership

Key challenges include lack of buy-in from officials and agencies and the term limits of elected officials. For example, in Calgary it took time to secure approval for the plan when a new mayor took office. Part of the problem with new administrations is their lack of familiarity with these efforts. Thus, they tend to ignore them or, in the worst case, oppose them. As stated by Chris Payton from Albuquerque, oftentimes new public officials and their staff “are less able to understand the goals, processes and the linking of performance information and performance data to budgets.” This is not an uncommon perception, said Jay Fountain, former assistant director of research for the Governmental Accounting Standards Board (CIC 2012). He stated that in general elected officials are not very cognizant “about CIs or PMs, much less the value of their integration. Or, if they do know about them, they are often hard-pressed to know how best to put them in context and make them truly useful” (CIC 2012, p. 2).

The sentiment expressed above was echoed by Mr. Warner, who said that implementation and sustainability of integration efforts are also vulnerable in cities that, like Jacksonville, lack a strong city manager form of government. Newly elected mayors are not likely “to know the importance of integration and typically are not knowledgeable of it,” so they may not be interested in using the system. Warner said that JCCI will always “work with mayors to help them understand the importance of performance measures and integrating them with community indicators processes.” Nevertheless, he added that the situation remains challenging because elected officials often face competing demands in terms of “how to use limited money for services they prefer.” Thus, these preferences may take precedence over the information CI-PM systems may suggest.

Collaboration with Citizens and other Stakeholders

Participants recognized that successfully integrating CI and PM and sustaining the effort required ongoing and meaningful participation from diverse groups. Without a good ongoing communication system for receiving information and giving feedback to the public, the likelihood of building commitment to the effort and accountability to the public is very low. Also, one participant reasoned, indicators will have limited value unless they reflect the needs and opinions of many community residents.

The Whole Child Leon Healthy Infant Partnership confronted many challenges related to collaborative arrangements in an environment of competing demands and values. The Healthcare Advisory Group members barely had time to come to the quarterly meetings, “no less complete an assignment or attend other meetings outside of official quarterly meetings,” said Dr. Markiewicz. She also said that these providers were underfunded and overworked, but, at the same time, “still wanting to see as many clients as they could.” Thus, while they were committed to providing the service, their lack of resources kept them from collaborating and having a greater impact through the partnership.

The literature emphasizes the need for community efforts to involve diverse groups (Mathie and Greene 1997). As suggested earlier, managing these groups can be challenging because of individual objectives and biases that may not align with those that would benefit the entire community. This was the case for the Leon County partnership. The numerous service providers that were part of the partnership were themselves competitors for funding and thus somewhat resistant to collaboration. Dr. Markiewicz said, “There was resistance to measuring success. There was resistance on the part of communities to measure outcomes as communities prefer measuring the process.” These competing interests, along with a lack of partnership leadership, contributed to the eventual discontinuation of the effort.

For other participants in this study, citizen engagement was critical to the sustainability of the integration effort. To illustrate, citizens such as young professionals and new entrepreneurs were viewed by TMT as important actors who could “exert influence through pressure on local government to pay attention to community issues.” Successful integration for Calgary requires a “system approach” where citizens are meaningfully engaged in a process that involves them as well as other groups. Indeed, the Office of Sustainability of the City of Calgary has a “mandate to embed sustainability within the corporation and accelerate community engagement around sustainability through the imagineCALGARY Partnership and other networks” (City of Calgary 2014). To that end their integration effort for sustainability involves a number of departments working on issues from transportation, to environment and smart growth. Their strategy to achieve integration is to link the measures (community indicators) derived from citizen input to the measures that departments are already using, and educating people about the importance of having sustainable communities.

Process

The process of integrating CI-PM faces technical and methodological challenges because of legacy systems, lack of agreement, and conflicting expectations. To illustrate, Chantal Stevens, from the King County Auditor’s office, referenced three indicator and measurement endeavors that involved King County: Communities Count, The Benchmark Program and the County’s Strategic Plan and Performance Measurement System. She said that although community indicators were created collaboratively with stakeholders including government, some of the indicators developed were meant to support several activities outside of county government. Therefore, “the challenge is to bring it all together into an integrated approach.”

Additionally, putting a new integrated system of CI-PM into an existing, established, and successful process can be daunting. In particular, participants talked about the difficulties in maintaining linkages between CI-PM and budgets. One participant from Calgary stated “It’s not, in many cases, so clear cut and dry. It is just not that easy to go from our communication to the actual process of how you align our work to our community indicators and our sustainability plan.”

The different units of analysis of CI and PM efforts also made it challenging to link services’ and programs’ performance to desired community outcomes. The CEO of CSC found it difficult to make the “credible link between the performance measures down at the program level up to the community indicators.” Participants from Jacksonville and Calgary also noted that this problem is compounded by the perennial difficulty of tracking outputs and impacts and obtaining reliable and credible data.

Chris Payton, from the City of Albuquerque, said that not holding agencies accountable for results makes it difficult to link the adopted goals of the Indicators Progress Commission (IPC) to the budget process. Also, without the accountability piece there is no incentive to collect, use, and report achievement data in an integrated way. Even though IPC was created by resolution, it doesn’t have the power to require agencies to integrate their program’s performance measures and the community desires as expressed by the IPC’s goals.

JCCI faced similar challenges. Although it has a fairly mature indicator tracking system and can relate indicators to performance measures, it remains difficult for JCCI to keep the PM systems managed by other entities consistently tracking and reporting data that links to the CIs.

Conclusions and Recommendations

Our analysis of interviews with knowledgeable participants of CI-PM integration efforts suggests that moving from two separate systems of CIs and PMs to an integrated system based on CIC’s draft maturity model (Stage I to Stage IV) is possible but may face significant challenges. The challenges are similar to those found in the utilization of performance measurement systems (e.g., de Lancer Julnes 2009; Carlucci et al. 2015). For example, political leadership was referenced the most as the impetus for integration, but also received the highest percentage of comments related to sustainability challenges. Lessons learned from CIC’s Real Stories reinforce the importance of political leadership to sustain the effort over time.

Collaboration scored second in the drivers and challenges categories. In successful efforts government and community organizations partner in establishing integrated CI-PM systems. They engage the community in determining the priorities and employ collaborative mechanisms to ensure those priorities are met (Ho 2007). However, maintaining collaboration over time among many groups--often with competing interests--is a significant challenge, as evidenced by TMT and Leon County. Communities, such as Broward and Jacksonville, have established and maintained infrastructures with broad-based partnerships.

The third most frequently referenced challenge to sustainability of CI-PM integration relates to process, which covers many factors. Primary among them is the actual linkage of community priorities and desires to performance measures and goals and linking both to budget allocations, which is critical to successful integration. Even those entities that appear to be the most successful at integrating CIs and PMs struggled to achieve this level of integration. In our view this illustrates that CI-PM integration is potentially doable and has value, but ensuring its sustainability and use remains at a crossroads. In particular, our interview data seem to indicate that the political environment in which these efforts operate will dictate the extent to which connecting actions to budget allocations and informing decisions based on integrated CI-PM systems will be able to occur and whether these efforts will be allowed to take hold.

In spite of the challenges, our findings indicate that in most instances integration efforts continued, as did interest in improving the performance of public services and opportunities for public engagement. JCCI and CSC had well-established CI-PM programs they were improving. Calgary and King County were endeavoring to improve linkages of government strategic plans and performance measures to community indicators. Albuquerque and Virginia had mandated systems and continued to strengthen the relationship of their public councils to their government’s budget and program plans. With an improved economy and renewed interest in linking community priorities to public services, TMT was getting underway again. The only CI-PM program not moving forward was Leon’s health partnership.

CIC was a leader in conducting research on CI-PM integration; we are not aware of any others following suit. As noted above, the entities we studied pursued integration organically, without CIC’s proposed model in mind. Subsequent to completing our primary research in 2013, the authors had follow-up discussions with several of those interviewed who were part of CIC’s integration project. Their perspectives acknowledged the aspirational value of CI-PM integration as well as the constraints and limitations we reported.4 The literature on the challenges of improving public services, satisfaction with those services, and community engagement in prioritizing those services is extensive. Further exploration of the construct CIC proposed could provide an opportunity to address those challenges and realize the benefits CIC posited in 2007.

With that in mind, we offer the following questions for further investigation: 1) If a community is interested in pursuing CI-PM integration, what factors should be taken into consideration? Our study shows that context matters; 2) How can common purposes be achieved through integration of CI-PM? As shown here, competing values remain a challenge; 3) What data are available or can be generated to demonstrate the impacts of CI-PM? Most of our current knowledge comes from anecdotal accounts; 4) What models of engagement should CI-PM integration efforts adopt to ensure success? And 5) How can we build upon CIC’s maturity model to better understand the meaning of integration? The notion of “integration” and the integration framework need further refinement. This may result in enhancing the CI-PM maturity model--or replacing it with a framework that succeeds in being more than aspirational.

Footnotes

  1. We were not able to reach representatives from the Government of Australia, a CIC award winner.

  2. In some cases, those for whom the CIC had developed “real stories” were also award winners.

  3. The County Auditor’s Office is an independent function, with the County Auditor appointed by the County Council.

  4. 2018 and 2019 discussions with Karen Hruby, Michael Jacobson, Allen Lomax (former co-chair of the CIC CI-PM integration workgroup), and Chantal Stevens (current CIC Executive Director).

Notes

Compliance with Ethical Standards

Conflict of Interest

None.

Ethical Compliance

The Institutional Review Board at the University of Baltimore approved this study. All participants signed an informed consent document.

References

  1. Ahern, J. (2011). From fail-safe to safe-to-fail: Sustainability and resilience in the new urban world. Landscape and Urban Planning, 100(4), 341–343.  https://doi.org/10.1016/j.landurbplan.2011.02.021.CrossRefGoogle Scholar
  2. Argyris, C., & Schon, D. (1996). Organizational learning: A theory of action perspectives (2nd ed.). Reading: Addison-Wesley.Google Scholar
  3. Basiago, A. D. (1995). Methods of defining 'sustainability'. Sustainable Development, 3(3), 109–119.CrossRefGoogle Scholar
  4. Baxter, P., & Jack, S. (2008). Qualitative case study methodology: Study design and implementation for novice researchers. The Qualitative Report, 13(4), 544–559.Google Scholar
  5. Besleme, K., Maser, E., & Silverstein, J. (1999) A community indicators case study: addressing the quality of life in two communities. http://rprogress.org/publications/1999/CI_CaseStudy1.pdf Accessed June 2013
  6. Broom, C., & Lomax, A. (2012). Challenges and opportunities of community indicators—Performance measures integration. Annual conference of the American Society for Public Administration, Las Vegas, NV.Google Scholar
  7. Bryson, J. (2011). Strategic planning for public and nonprofit organizations: A guide to strengthening and sustaining organizational achievement (4th ed.). Hoboken, New Jersey.Google Scholar
  8. Carlucci, D., Schiuma, G., & Sole, F. (2015). The adoption and implementation of performance measurement process in Italian public organisations: the influence of political, cultural and rational factors. Production Planning and Control, 26(5), 363–376.Google Scholar
  9. Charbonneau, É., & Bellavance, F. (2015). Performance management in a benchmarking regime: Quebec's municipal management indicators. Canadian Public Administration, 58, 110–137.CrossRefGoogle Scholar
  10. City of Calgary. (2014). Retrieved from http://www.calgary.ca/CA/cmo/Pages/Triple BottomLine.aspx Accessed September 2013
  11. Community Indicators Consortium. (2007). Creating stronger linkages between community indicator projects and government performance measurement efforts. http://www.communityindicators.net/system/medias/49/original/CIC_2007_Linkages_Final_Report.pdf?1273695674. Accessed April May 2013
  12. Community Indicators Consortium. (2010). Community Indicators-Performance Measures (CI-PM) integration descriptive (maturity)model. http://www.communityindicators.net/system/medias/64/original/CI-PM_Descriptive__Maturity__Model-draftv4.pdf?1275505793. Accessed June 2013
  13. Community Indicators Consortium. (2012). Community indicators and performance measures integration — A need or a luxury. http://www.communityindicators.net/system/medias/326/original/BriefingsNo1-final.pdf?1341586764. Accessed June 2013
  14. Council on Virginia Future. (2014). Retrieved from http://future.virginia.gov/aboutUs/council/. Accessed June 2013
  15. Cummings, T. G., & Worley, C. G. (2009). Organization development & change (9th ed.). Stamford: Cengage Learning.
  16. de Lancer Julnes, P. (2009). Performance-based management systems: Effective implementation and maintenance. Boca Raton: CRC Press.
  17. de Lancer Julnes, P., & Holzer, M. (2001). Promoting the utilization of performance measures in public organizations: An empirical study of factors affecting adoption and implementation. Public Administration Review, 61(6), 693–708. https://doi.org/10.1111/0033-3352.00140.
  18. Ditillo, A., Liguori, M., Sicilia, M., & Steccolini, I. (2015). Control patterns in contracting-out relationships: It matters what you do, not who you are. Public Administration, 93(1), 212–229.
  19. Dulhy, M., & Swartz, N. (2006). Connecting knowledge and policy: The promise of community indicators in the United States. Social Indicators Research, 79, 1–23.
  20. Farnam, J., & Holden, M. (2014). Community Indicators Consortium. In A. C. Michalos (Ed.), Encyclopedia of quality of life and well-being research. Dordrecht: Springer Netherlands.
  21. Greenwood, T. (2008). Bridging the divide between community indicators and government performance measurement. National Civic Review, 97(1), 55–59.
  22. Giannakis, E., & Bruggeman, A. (2017). Economic crisis and regional resilience: Evidence from Greece. Papers in Regional Science, 96(3), 451–476. https://doi.org/10.1111/pirs.12206.
  23. Ho, A. (2007). Engaging citizens in reporting policy results and community conditions: A manager’s guide. Washington, D.C.: IBM Center for The Business of Government.
  24. Innes, J., & Booher, D. E. (1999). Indicators for sustainable communities: A strategy building on complexity theory and distributed intelligence. Planning Theory and Practice, 1(2), 173–186.
  25. King, W. R. (2001). Strategies for creating a learning organization. Information Systems Management, 18(1), 1–20. https://doi.org/10.1201/1078/43194.18.1.20010101/31261.3.
  26. Marsh, Z. (2012). Citizen-driven performance: Truckee Meadows Tomorrow and Washoe County, Nevada. Real Stories Series No. 1, Community Indicators Consortium. http://www.communityindicators.net/system/medias/90/original/TruckeeMeadowsStory_1_.pdf?1283885397. Accessed June 2013
  27. Martin, L., & Kettner, P. (2010). Measuring the performance of human service programs. Thousand Oaks: Sage.
  28. Mathie, A., & Greene, J. (1997). Stakeholder participation in evaluation: How important is diversity? Evaluation and Program Planning, 20(3), 279–285.
  29. Miller, P., Kurunmaki, L., & O’Leary, T. (2008). Accounting, hybrids and the management of risk. Accounting, Organizations and Society, 33(7–8), 942–967.
  30. Murphy-Greene, C., & Blair, J. (2004). Binational vital signs: A quality of life indicator program for the San Diego–Tijuana metropolitan region. Review of Policy Research, 21(5), 681–697.
  31. Nielsen, P. A., & Jacobsen, C. B. (2018). Zone of acceptance under performance measurement: Does performance information affect employee acceptance of management authority? Public Administration Review, 78(5), 684–693. https://doi.org/10.1111/puar.12947.
  32. Njuki, J., Mapila, M., Kaaria, S., & Magombo, T. (2008). Using community indicators for evaluating research and development programmes: Experiences from Malawi. Development in Practice, 18(4–5), 633–642. https://doi.org/10.1080/09614520802181913.
  33. Pasha, O. (2018). Can performance management best practices help reduce crime? Public Administration Review, 78(2), 217–227. https://doi.org/10.1111/puar.12856.
  34. Perrin, B. (2015). Bringing accountability up to date with the realities of public sector management in the 21st century. Canadian Public Administration, 58, 183–203. https://doi.org/10.1111/capa.12107.
  35. Phillips, R. (2003). Community indicators. American Planning Association Report Number 517. https://www.planning.org/publications/report/9026850/. Accessed March 30, 2019
  36. Schnaible, J., & Shogry, T. (2012). Improving community conditions: A framework linking community indicators, city budgets, government performance measures and performance management in Albuquerque, New Mexico. Real Stories Series No. 3. http://www.communityindicators.net/system/medias/91/original/AlbuquerqueRealStory.pdf?1283885433. Accessed June 2013
  37. Sharifi, A. (2016). A critical review of selected tools for assessing community resilience. Ecological Indicators, 69, 629–647. https://doi.org/10.1016/j.ecolind.2016.05.023.
  38. Shaw, K. (2012). “Reframing” resilience: Challenges for planning theory and practice. Planning Theory & Practice, 13, 308–312. https://doi.org/10.1080/14649357.2012.677124.
  39. Smith, M., Busi, M., Ball, P., & Van der Meer, R. (2008). Factors influencing an organization’s ability to manage innovation: A structured literature review and conceptual model. International Journal of Innovation Management, 12(4), 655–676.
  40. Willis, J., Mastrofski, S., & Weisburd, S. (2003). Compstat in practice: An in-depth analysis of three cities. The Police Foundation. https://www.policefoundation.org/wp-content/uploads/2015/06/Willis-et-al.-2004-Compstat-in-Practice.pdf. Accessed March 30, 2019

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Patria de Lancer Julnes (email author), Marxe School of Public and International Affairs, Baruch College, City University of New York (CUNY), New York, USA
  • Cheryle Broom, King County, USA
  • Soyoung Park, Incheon National University, Incheon, South Korea
