Why Performance Measurement Is Important to Canada

Canadian governments, under greater scrutiny and facing demands for more accountability over the use of public funds, are applying increasing pressure on their public higher-education systems to measure performance. This is a trend shared by many other jurisdictions. At a loftier level, governments also recognize that the outputs of higher-education systems—the highly educated students they graduate, the research and innovation they spawn and the communities they support—are essential to a robust and vibrant society that is competitive economically and that sustains a high quality of life. This is especially true in Canada, where the higher-education system is essentially public. In contrast to many other countries, Canada has no private postsecondary education sector of the scope, magnitude or capacity to provide the higher education demanded by its citizens and required by the country. Canada is also more reliant on its higher-education institutions for its overall research activity than many other OECD countries (Science, Technology and Innovation Council 2015). Higher-education spending is a significant draw on the public purse, exceeded in Ontario only by public expenditures on health care, K-12 education, and children’s and social services. In short, it is both in the interest of governments, and an expectation placed on them, that they know how effectively the public funds allocated to their higher-education institutions are being used and how well their higher-education system and institutions are performing.

The Pragmatist Versus the Idealist

But how does one design a meaningful performance tool that is useful to the government? The pragmatist looks for a tool that can be implemented quickly to appease government’s (and the public’s) impatience with a lack of accountability and to satisfy demands for evidence of value for money. A pragmatist develops a performance tool using available data and existing measures. These are often a mixture of inputs, outputs and outcomes. They may or may not reflect jurisdictional priorities for higher education (there may be no clear jurisdictional priorities and, if there are, the fit between those priorities and the data at hand may be no more than serendipity). Aware of these shortcomings, and mindful of the inevitable pushback from higher-education institutional partners that the wrong indicators may have been selected, or that the data may be shoddy or oversimplified given the complex and unique characteristics of higher education, the pragmatist often tends toward embracing a larger and larger pool of indicators. [Footnote 1]

However, a large number of indicators may, in fact, work against being precise and articulate about what really matters to a jurisdiction and, therefore, about the key priorities on which the system must deliver. If the pragmatist’s strategy is, after a period of time, deemed no longer satisfactory, the pragmatist returns to the pool of available data and starts the exercise again in the hope of achieving a better outcome.

The idealist approaches the task of developing a performance measurement tool in a different manner. The idealist recognizes that the exercise starts first by agreeing on jurisdictional priorities for higher education. These define the outcomes the jurisdiction desires from the system. The idealist understands that to satisfy the essence of having priorities, the desired outcomes must be few in number. As Jim Collins, author of Good to Great, observed: “If you have more than three priorities, you don’t have any” (Collins 2001). Then, and only then, the idealist asks what information is required to best and most directly measure progress and achievement of these priorities. If these measures exist already, then fine. But, if not, the idealist understands the need to build the capacity and invest in what is necessary to measure these critical things. To the idealist, while part of the motivation of the performance regime may be accountability, the dominant focus is the drive for improvement—making the higher-education system even better. So, the dominant goal is not a performance tool that reports or ranks, but rather one that drives engagement, analysis, strategic investment and change.

HEQCO’s Role in Measuring Postsecondary Performance

HEQCO was established by legislation in 2005 (Government of Ontario 2005) “to assist the Minister in improving all aspects of the postsecondary education sector, including improving the quality of education provided in the sector, access to postsecondary education and accountability of postsecondary educational institutions.” HEQCO is managed by its board of directors (members of which are appointed by the government) and, as a crown agency, is independent of government. It is a relationship that the government has respected throughout the history of the agency, even when HEQCO has published research or views that may have been critical of government policies or actions. [Footnote 2]

The act grants HEQCO a very broad research mandate. For current purposes, the most relevant clause of the act is the legislated requirement that HEQCO “evaluate the postsecondary education sector, report to the Minister on the results of the evaluation and make the report available to the public.” Thus, HEQCO is in the enviable position of dancing in the space between pragmatist and idealist. We are affiliated with government yet, as an independent agency, we are neither bound nor constrained by present-day politics. We are keenly aware of and sensitive to the dominant political and policy issues of the day (we would be negligent if we were not), yet we work in a time frame that allows us to look down the road and anticipate the dominant issues and policies several years in the future. By legislation and modus operandi, our work and advice are to be based on evidence and not political expediency. Governments can either accept or reject the advice we provide. In fact, a clever government uses us strategically to do work or float ideas and policies that would be politically difficult for it to do directly.

Our most recent published performance review of the postsecondary system was in 2015 (Weingarten et al. 2015). Many earlier HEQCO research reports contributed piecemeal to this assessment. Our first comprehensive evaluation of overall postsecondary performance was delivered in twin publications: The Productivity of the Ontario Public Postsecondary System (HEQCO 2012) and Performance Indicators (HEQCO 2013a). Both these reports situated Ontario’s performance within the context of a mix of international and Canadian indicators across four domains: quality, access, productivity and social impact. Canadian Postsecondary Performance: Impact 2015 (Weingarten et al. 2015) was our second comprehensive examination of performance. Patterned after the Social Progress Index (Porter and Stern 2015), it measured 34 indicators that addressed access to the system, value to students and value to society. A more detailed description of the Canadian Postsecondary Performance: Impact 2015 report and its findings can be found in Weingarten and Hicks (forthcoming).

Development of an Improved Performance Measurement Tool: Measuring Only What Matters

The Canadian Postsecondary Performance: Impact 2015 report made some important contributions to the design of postsecondary performance tools. It emphasized outputs and outcomes, not inputs. It provided a user-friendly presentation of the exhaustive data characteristic of performance reports as well as a web-based tool that allowed customization of the analysis for the reader’s individual interests. It provided strong evidence that funding levels (at least those evident across Canada’s 10 provinces) are uncorrelated with system performance, thus focusing attention on what systems achieve with the money they receive rather than on the amount they receive. It identified important data gaps that must be filled to produce an even more informative and useful report. See Weingarten and Hicks (forthcoming) for further elaboration.

We did not abandon these design considerations in thinking through how the performance tool could be improved. In retrospect, though, we had too many indicators because we were trying to appeal to too many audiences with too many interests. If the tool were to be improved, we would have to sharpen our thinking about the motivation and purpose of a postsecondary performance tool. This would lead inevitably to a sharp decrease in the number of indicators but, at least for the intended audience, a more relevant, informative and useful set.

Given HEQCO’s mandate to provide advice that assists the government in improving its postsecondary system, our primary audience is government and the public postsecondary system it funds, supports and regulates. Thus, an improved performance tool has to allow the government to measure the effectiveness and impact of its practices and policies and to assess whether changes in policies and practices are steering the system in the right direction.

From this perspective, the obvious and most relevant issue is what matters most to the Ontario government. This is what should shape the design and details of the performance tool.

The difficulty is that in the hurly-burly of the political churn, it is not always clear what matters most to a government. We have argued before (Weingarten 2016) that it is sometimes hard to tell what matters most to the Ontario government because there are far too many goals (some of which contradict others) and their relative prominence seems to vary from one government decision to the next. But if one follows over time the various policy statements and funding announcements from government (as we do) and spends enough time with politicians and bureaucrats (as we do), then one can decipher the dominant goals the government sets for its public postsecondary system—there is no need (or appetite) for a grand visioning exercise.

In sum, as we contemplated the design of an improved performance assessment tool for Ontario, we gravitated to the following design considerations:

  • We would focus on outcomes.

  • We would measure outcomes that meaningfully address Ontario’s priorities.

  • We would articulate them in simple language.

  • We would take the time and energy to develop the required data collection machinery and measures, if these were not already available, and we would present them in an easy-to-digest format.

Postsecondary Priorities: What Matters Most to Ontario?

Equity of Access

Postsecondary access has been Canada’s and Ontario’s dominant policy priority over the past three decades, as we imagine it has been for many jurisdictions.

At its most general level, access means more students participating in postsecondary education. For many years, this was the Canadian goal and it has been largely met. Driven by demographics and a belief that a postsecondary credential is essential for success in labour markets, and enabled by enrolment-based funding mechanisms, institutions added more than 200,000 students to the Ontario system in the past decade alone. As the OECD’s Education at a Glance shows (OECD 2017), Canada is among the world-leading countries in overall postsecondary attainment rates among adults and, as we have shown (Weingarten et al. 2015), Ontario is a lead jurisdiction within Canada.

Access can take many forms. As noted above, it may simply reflect a goal of more postsecondary students. But, once such a goal has been achieved, as it has been in Ontario, access may be more specifically defined. For example, a goal of greater access may mean a desire for increased enrolment in certain programs or disciplines resulting, perhaps, from the distinctive economic needs or plans of the jurisdiction. In Ontario, the specific access goal is the desire to increase the participation rate of students who are currently underrepresented in postsecondary studies. These are the very students who derive, perhaps, the greatest value from a postsecondary education. Ontario’s goal to make access more equitable was stated most clearly in the mandate letter from the provincial premier to the minister of advanced education (Government of Ontario 2016a) that stated the need to “develop an access strategy to address the non-financial barriers to postsecondary education for underrepresented groups, including students from low-income backgrounds, students with disabilities and mature students.” This charge was accompanied by long called-for and significant policy and process reforms to the Ontario student financial-aid system in 2016, which were designed to spur greater participation in postsecondary education by currently underrepresented students.

A High-Quality Education Where Students Acquire the Knowledge and Skills Needed to Succeed

Given that a quintessential goal of higher education is to give students a meaningful and useful education that allows them to succeed in life, measures of what students learn and how well they learn these things should be central to any postsecondary performance measurement instrument.

Multiple student surveys have demonstrated consistently that the dominant (although not exclusive) reason students attend postsecondary education is to obtain a credential needed to get a good job. And for decades, the dominant reason governments have supported public higher education is to graduate students with the skills and knowledge to nurture, feed, sustain and grow a robust and competitive economy.

We see many of the quality-focused measures of the past—like graduation rates, job placements for graduates and student satisfaction—as proxy indicators for this central question: Are students learning and will they have learned the right things by the time they leave?

Our capacity (and motivation) to come up with meaningful indicators of what students learn is assisted greatly by the growing movement to articulate and measure learning outcomes—that is, what students should know and be able to do as a result of their postsecondary education. Similarly, the province has identified the quality of the student experience as a major policy priority and the use of learning-outcome measurements as a major vehicle for assessing whether this objective is being achieved (Herbert 2015). This policy and practice objective has been encouraged by many years of HEQCO research promoting the importance of articulating and measuring relevant learning outcomes in the Ontario postsecondary sector (HEQCO 2013b; Weingarten 2017c).

Sustainable Institutions

One obvious requirement of a well-functioning postsecondary system is that it and the institutions within it are sustainable and, therefore, have the financial and academic means to deliver on the expectations society and students impose on them.

In its starkest and simplest form, sustainability means that an institution’s revenues and expenditures are in balance. An obvious sign of unsustainability is when an institution runs out of cash and cannot meet an upcoming expense such as an impending loan payment or its payroll obligations. In the private sector, this is when one declares bankruptcy or insolvency. A much less obvious but more likely (and already evident) consequence of unsustainability within publicly funded, higher-education systems is that institutions make decisions that slowly erode the quality of the academic and student experience; that is, to maintain financial sustainability, academic sustainability is put at risk.

We recognize that sustainability is a condition that must be satisfied before outcomes such as more equitable access and higher quality can be pursued and achieved. Without it, the dialogue between government and institutions becomes stuck on questions of money and never moves on to questions of what we wish to collectively achieve, like more equitable access and higher quality learning.

The government has a particularly significant obligation to assure the sustainability of its public higher-education system. The quintessential roles of government are to be responsible stewards of public funds and to ensure the quality of public institutions. When the sustainability of a sector is questioned, as is the case now for the Ontario postsecondary system, the government is obliged to act. No government wishes to be in the position of having to bail out a public college or university facing financial exigency. In fact, the most recent changes to the formula by which Ontario colleges and universities are funded were designed specifically to stabilize the financial sustainability of its institutions that were at the greatest fiscal risk (MAESD 2017).

In sum, based on recent policy statements, reports and actions, what appears to matter most to Ontario is a postsecondary system that provides:

  • Better access for underrepresented groups.

  • A higher-quality education where students acquire the knowledge and skills to succeed.

  • Sustainable institutions.

If a performance measurement instrument is to measure what matters most to government, then it must contain indicators that are directly relevant to understanding the current state of the Ontario system on these three dimensions and have the capacity to measure whether changes in government policy or institutional behaviour are driving the system to improve on them.

A Performance Measurement Tool that Measures Only What Matters

Equity of Access

The Ontario government has been vocal about its commitment to increase the province’s postsecondary attainment rate to 70% by 2020, a target that has been largely achieved. As a result, the current dominant access goal has shifted to the achievement of equitable access for students in underrepresented groups. The government has identified these to be Indigenous, Francophone and first-generation students, as well as students with disabilities and those from low-income families.

To increase the participation rate of these cohorts, the government has introduced special funding envelopes and other initiatives. The most significant policy and practice initiative has been the recent (Government of Ontario 2016b) fundamental transformation of the student financial-assistance program. The revised program front-end loads more grants (as opposed to loans) to students from low-income families (thus allowing the government to legitimately claim “free tuition” for students who come from families below a certain income level).

At a very pragmatic level, equity of access will have been achieved when the participation and graduation rates of underrepresented groups, such as those students from families with low incomes, equal those of students from the most advantaged groups currently well represented within our colleges and universities.

This suggests an indicator that is simple, clear and measurable. If the target group is students from low-income families, for example, one can look for changes in the relationship between family income and postsecondary participation rates. Figure 1 shows the current relationship (Government of Ontario 2016b). We will have achieved greater equity of access when the participation rate for students from low-income families approaches and meets that of students from higher-income families, depicted as the 2025 goal in Fig. 1. [Footnote 3]

Fig. 1: Higher-education participation rates of 18–21-year-olds living at home, by parental income in Ontario, 2013
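To make the indicator concrete, here is a minimal sketch (in Python) of how such a participation-rate gap might be computed. The quintile labels and rates are invented placeholders, not the actual Ontario figures behind Fig. 1.

    # A minimal sketch of the equity-of-access indicator: the gap in
    # postsecondary participation rates between students from the highest-
    # and lowest-income families. All figures below are hypothetical.

    def participation_gap(rates_by_income: dict) -> float:
        """Percentage-point gap between the highest and lowest income quintiles."""
        return rates_by_income["Q5 (highest)"] - rates_by_income["Q1 (lowest)"]

    # Hypothetical participation rates of 18-21-year-olds by parental income quintile.
    current_rates = {
        "Q1 (lowest)": 0.40,
        "Q2": 0.47,
        "Q3": 0.54,
        "Q4": 0.60,
        "Q5 (highest)": 0.66,
    }

    gap = participation_gap(current_rates)
    print(f"Participation gap: {gap * 100:.0f} percentage points")
    # Equity of access, as defined above, is approached as this gap trends to zero.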

A Higher-Quality Education Where Students Learn the Skills Needed to Succeed

Institutions report many things to the Ontario government—for example, enrolment (counted in multiple ways in Ontario and reported differently to different parts of government), capital spending, deferred maintenance, scholarships and bursaries, faculty and staff numbers, wages, salaries and benefits, research funding and executive compensation. Ironically, despite the fact that we are dealing with an education system, there is almost no reporting about what students learn. We could not envision a postsecondary measurement tool that did not have academic quality and learning at its core.

There are different opinions about how to measure academic quality. In May 2017, we assembled a group of international experts for a two-day workshop to explore how this might be done (Weingarten 2017b). [Footnote 4] The options ranged from quality measures based on inputs (e.g., number of teaching staff, class sizes) to proxies for learning (e.g., student satisfaction and engagement, graduation and persistence rates) to institution-wide standardized tools measuring attributes like critical thinking (e.g., the Collegiate Learning Assessment).

Input measures provide no evidence about what is actually learned. Even proxy measures for learning may not reflect whether learning has happened; for example, students who indicate a high level of satisfaction with their educational experience may not have learned the skills they require. Proxy measures also raise concerns that teachers or institutions could engage in behaviours to improve these numbers while actually decreasing quality or the amount learned. For example, institutions might reduce standards to increase graduation rates, or professors might decrease course requirements and standards to improve student satisfaction ratings of the course or the professor (OCUFA 2017).

For both philosophical and pragmatic reasons, we gravitated to measures of learning that were direct, using psychometrically reliable and valid instruments. And we adopted a pragmatic approach to defining quality by focusing on a limited number of attributes and skills that we believe all would agree are things a postsecondary graduate should have acquired (Weingarten 2017a).

Some suggest that this approach is akin to searching for the holy grail of quality measurements. We are not great believers in holy grails. But, as pragmatists, we realize that governments, like all good administrators, know that you signal what is important by how you spend your time and money. If a government spends its time measuring enrolments and allocates public funds on the basis of enrolment, then people infer that what matters is enrolment and they act accordingly. This has been the experience of colleges and universities in Ontario for decades (HEQCO 2015). If we want to signal that what students learn is important, then we have to measure it and tie it to funding. Ontario is moving in that direction—for now rhetorically but presumably, with time, in practice.

All Ontario colleges and universities measure learning and report it via the traditional transcript that lists the courses students have taken and the grades they have received. These transcripts index the student’s mastery of the course material. We are interested primarily in student learning of basic skills and competencies that most people would agree should be products of a postsecondary education—such as a certain level of literacy, numeracy, critical thinking and problem-solving ability. It is not irrelevant that these skills are exactly the ones employers value the most in their future hires (World Economic Forum 2016) and the ones they indicate as most lacking in current graduates (Grant 2016).

Quality measurements of skills are not without precedent in education. Primary and secondary education systems have long emphasized measurement of basic skills like literacy, numeracy and problem solving. In Ontario, the skills of school students are assessed at regular intervals through provincial and international tests like the Education Quality and Accountability Office’s Grades 3, 6, 9 and 10 assessments of reading, writing, mathematics and literacy (EQAO n.d.), and the Programme for International Student Assessment (OECD n.d.). Canada has periodically measured similar skills in adults through international assessments like the Programme for the International Assessment of Adult Competencies (PIAAC), but never for the specific purpose of determining what, or how much, is learned in our postsecondary institutions.

Direct measurement of skills is not something that is done routinely in the Ontario higher-education system although we have promoted the benefits of doing so (HEQCO 2013b). This is where the luxury of being the idealist kicks in, especially for an organization that has research at its core.

We initiated a large trial to see whether the direct measurement of these skills was possible and whether these measurements could be incorporated into a postsecondary performance measurement tool.

We issued an Expression of Interest to recruit colleges and universities to participate in this trial. Although we targeted postsecondary institutions in Ontario, we received expressions of interest, and eventual participation, from institutions outside the province. The number of institutions interested in participating far exceeded our expectations. In the end, 11 colleges and nine universities participated in the trial. Their motivation was a genuine interest in being involved in an exercise that directly measures the skills and competencies postsecondary leaders hope their graduates will acquire.

HEQCO provided administrative support and managed the trial. Each institution was responsible for obtaining ethics approval for the research project. We tailored the specifics of testing to the particulars and interests of each institution, as long as decisions conformed to the general pilot requirements and retained our ability to aggregate the data.

The pilot, which we termed the Essential Adult Skills Initiative (EASI), used the commercial version of the assessment tool employed by PIAAC to measure literacy, numeracy and problem solving in technology-rich environments in colleges and universities (OECD n.d.). We selected Education & Skills Online (ESO), a product of the OECD, because, like PIAAC, the ESO has undergone an extensive and rigorous validation process and is available to test takers in English and French. Its development has been supported by the European Commission and, in Canada, by the Council of Ministers of Education. It is currently administered by the Educational Testing Service. This means that assessment data collected by EASI can be compared to national and international norms.

Rather than focusing simply on mastery of the mechanics of vocabulary or arithmetic operations, this instrument assesses the real-world application of literacy, numeracy and problem solving in technology-rich environments, providing a picture of a student’s ability to use these skills to navigate tasks and respond effectively. In addition to its international reputation as one of the best measures of adult skills, the ESO is useful to students. The ESO is an adaptive assessment tool, so questions become progressively easier or more difficult depending on the test taker’s performance. Unlike comparable assessment tools, the ESO shares this vital information with each test taker, providing personalized scores that can be downloaded after each of the three major test components is completed.
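As an illustration of the adaptive mechanism only, the sketch below raises or lowers item difficulty based on a simulated test taker’s responses. It is a toy model; the ESO’s actual adaptivity rests on validated psychometric methods, and nothing here reflects its real item bank or scoring.

    # Toy illustration of adaptive testing: difficulty rises after a correct
    # answer and falls after an incorrect one. Real instruments such as the
    # ESO use validated psychometric models; this only shows the basic idea.

    import random

    def simulate_adaptive_test(n_items: int = 10, start_level: int = 3) -> list:
        """Return the sequence of difficulty levels (1-5) presented to a test taker."""
        level = start_level
        trajectory = []
        for _ in range(n_items):
            trajectory.append(level)
            # Stand-in respondent: harder items are less likely to be answered correctly.
            answered_correctly = random.random() < (1.0 - level / 6.0)
            level = min(5, level + 1) if answered_correctly else max(1, level - 1)
        return trajectory

    print(simulate_adaptive_test())  # e.g., [3, 2, 3, 4, 3, 3, 2, 3, 4, 5]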

When a testing window opened within an institution, students in the programs to be tested were given three weeks to complete the assessment. Students were recruited through email invitations, and each week those who had not yet completed the assessment received a reminder to participate. The pilot included an evaluation of the effectiveness of various forms of incentives to motivate students to take the test.

EASI was designed with a value-added perspective, measuring the skill levels of students in the first year and in the final year of selected programs. We recognize that different institutions accept students with varying levels of skill. We are not interested in the absolute levels of these skills so much as in the improvement of these skills over the course of a postsecondary education. [Footnote 5]
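The value-added logic can be stated compactly, as in the sketch below: the skill gain for a program is estimated by comparing the mean score of an entering cohort with that of a graduating cohort. The scores are invented for illustration (PIAAC-family instruments report on a 0-500 scale), and the comparison is cross-sectional, two different cohorts rather than the same students tracked over time.

    # A minimal sketch of the value-added comparison described above.
    # Scores are hypothetical; PIAAC-family instruments report on a 0-500 scale.

    from statistics import mean

    def cohort_value_added(first_year: list, final_year: list) -> float:
        """Mean final-year score minus mean first-year score for a program.
        This compares two different cohorts, so differences in entering skill
        between cohorts are a caveat on interpretation."""
        return mean(final_year) - mean(first_year)

    # Hypothetical literacy scores for one program.
    entering_cohort = [248, 261, 255, 270, 243]
    graduating_cohort = [272, 281, 265, 290, 277]

    gain = cohort_value_added(entering_cohort, graduating_cohort)
    print(f"Estimated literacy gain: {gain:.1f} points")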

There are other value-added skills assessments underway, such as the Wabash National Study of Liberal Arts Education (Wabash 2017) and the Collegiate Learning Assessment Plus (Council for Aid to Education 2017).

EASI is a multi-year pilot project. Testing began in fall 2016 and was completed by the end of 2017. The results will be reported publicly in 2018.

Important lessons have been learned from the testing conducted already. Most importantly, the pilot has revealed that it is possible to administer system-wide assessments that provide meaningful data about the learning that takes place in our institutions. Even though we worked with 20 different institutions, all of the logistical and methodological issues (e.g., privacy of data, students taking the test on different IT platforms) were satisfactorily resolved. At this point, the pilot has demonstrated that this type of assessment could readily be scaled up to a full system level, provincial or national. We are also encouraged that some of the institutions that participated in this trial are enthusiastic and motivated to push this pilot further and to a larger scale.

Sustainable Institutions

We are in the midst of releasing a series of publications that speak to the sustainability challenges of public postsecondary systems and how they can be addressed. The series includes a framework paper addressing what we mean by sustainability in higher education and its most important components (Weingarten et al. 2016), followed by additional papers that illuminate the revenue side of the sustainability challenge (Weingarten et al. 2017a, b). In early 2018, we released an additional paper illuminating the expenditure side of the sustainability challenge (Weingarten et al. 2018b), followed by a capstone paper that presents options to government and institutions for improving the sustainability of the institutions and the system (Weingarten et al. 2018).

These publications reveal a number of useful measures of sustainability, many of which, by agreement, are reported by Ontario colleges and universities to the government. We refer the reader to these papers for details of these possible indicators and why evaluation of the sustainability of a system and the institutions within it is important.

What Is Left to Be Done?

This paper describes the thinking leading to the development of an improved measurement tool to assess the performance of the Ontario postsecondary system. The proposed instrument will give the government a useful tool to assess the impact and outcomes of the policy and practice changes it has instituted to steer the postsecondary system toward its highest-priority goals. The tool, therefore, has a limited number of indicators that measure the outcomes and goals that matter most to the government. The presentation of the information will be simple and accessible—a dashboard.

For Ontario, the outcomes that matter the most are equity of access, a quality education where students acquire the skills needed to succeed and sustainable institutions. We present a version of this dashboard in Fig. 2.

Fig. 2: Ontario’s performance dashboard: measuring only what matters
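To suggest how compact such a dashboard can be, the sketch below represents its structure as data: three domains, each carrying a small number of indicators. The domain names come from the text; the individual indicator names and values are assumptions for illustration, not the contents of Fig. 2.

    # A sketch of the dashboard structure: three domains, each with a small
    # number of indicators. Indicator names and values are illustrative only.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Indicator:
        name: str
        value: Optional[float] = None   # None until the measure has been built
        target: Optional[float] = None

    @dataclass
    class Dashboard:
        domains: dict = field(default_factory=dict)

    ontario_dashboard = Dashboard(domains={
        "Equity of access": [
            Indicator("Participation-rate gap by parental income", value=26.0, target=0.0),
        ],
        "Quality of learning": [
            Indicator("Value-added skill gain (EASI pilot)"),  # measure under development
        ],
        "Sustainable institutions": [
            Indicator("Candidate sustainability measures"),  # to be selected after consultation
        ],
    })

    for domain, indicators in ontario_dashboard.domains.items():
        print(domain, "->", [i.name for i in indicators])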

Next on our to-do list is to identify the detailed measures that will make up each indicator. We are at various stages on this journey. For equity of access, we are focusing primarily on one measure—participation in higher education by income—for which the relevant data is readily available. For quality of learning, we have described the ongoing pilot project we have launched with 20 partner institutions. There is considerable work remaining to move Ontario to the point where the direct measurement of learning is scaled up across the province. With regard to sustainable institutions, at the conclusion of our ongoing research and publication series, we will have canvassed a rich set of relevant concepts and data. We will then consult institutional and government partners to identify which of these can best reveal a forward-looking picture of institutional sustainability.

Over the longer term, we will need to consider one additional domain of higher-education outcomes: research. Our primary focus is on the learning mission of public higher education, so we have omitted research outcome and impact measures for now. Arguably, even a learning-mission focus ought to examine research, inasmuch as research is, at the very least, a competing focal point and expenditure pressure for higher-education institutions that may have an impact (positive or negative) on learner-focused outcomes like access and learning. We acknowledge the importance of research outcomes to Ontario and will add research considerations to our dashboard in the second generation of its development.

The power of our approach lies in using a deliberately limited number of indicators that are tied to a limited number of jurisdictional priorities, and in taking the time to build direct and robust data to measure these. This approach is, in our opinion, valuable and translatable to any jurisdiction. Our specific priorities and measures are, of course, Ontario-centric. Yours ought to be and will be relevant to your jurisdiction.