1 Introduction

Our objective is to trace some of the brief history of efficiency analysis. In so doing we hope to rekindle some memories, and to help readers draw lessons from events that have transpired. We touch on four frequently overlapping developments in the field (although the JEL Classification Codes Guide does not yet recognize it as a field). These developments involve the significant roles played by alternative modes of invention, graduate students, international travel, and influential but under-appreciated scholars.

Michael Farrell started it all in the year of Sputnik, but it was two decades before the development of stochastic frontier analysis (SFA) and data envelopment analysis (DEA). During that period Farrell amassed two citations per year (not including four noteworthy papers that escaped the SSCI net).Footnote 1 We call this period the era of missed opportunities. We were reading and talking with the likes of Sir John Hicks, Tjalling Koopmans, Herbert Simon, R. M. Cyert and J. G. March, T. W. Schultz, Armen Alchian, W. A. Niskanen, Harvey Leibenstein, Oliver Williamson and W. J. Baumol (count the Nobel Prize winners). These authors were writing about causes and consequences of failure to optimize, or what might be called departures from Chicago equilibrium. We were exposed to managerial choice, efficient points and facets, the quiet life, bounded rationality, satisficing, rules of thumb, expense preference, bureaucracy, agency problems, X-efficiency and the like. Schultz (1964) even stated what he called the economic efficiency hypothesis as “There are comparatively few significant inefficiencies in the allocation of the factors of production in traditional agriculture.” A year later Hopper (1965), reporting on field work he conducted in 1954, developed a methodology for testing the economic efficiency hypothesis and conducted an empirical test of it on a sample of Indian farms. Like Tax (1953) before him, Hopper found farmers to be “poor but efficient” because the penalty for inefficiency was so large. With the benefit of hindsight, it is clear that we were exposed to the possibility of (in)efficient allocations, but we were not thinking outside the box about converting these compelling tales into an analytical model that could be implemented empirically. And when we did develop such models, we failed to gain much inspiration from our predecessors.Footnote 2

We call the decade immediately following the development of SFA and DEA the era of the railroad tracks. We were travelling in the same direction, without converging or exploiting synergies. One paradigm was stochastic but parametric, the other non-parametric but deterministic. One group populated departments of economics, the other schools of business, public policy and engineering; the two groups published in different journals and attended different conferences. With few exceptions never the twain did meet.Footnote 3 This state of affairs began to change in the late 1980s, thanks to the vision and efforts of a pair of unsung heroes whose contributions we extol in Sect. 5. We call the past quarter century the era of enlightenment. Convergence has proceeded as we have begun to read, and publish in, each other’s journals and attend each other’s conferences. Indeed both journals and international conferences devoted, at least in part, to efficiency and productivity analysis are now proudly multi-disciplinary. The distinction between “stochastic but parametric” and “non-parametric but deterministic” has blurred almost to the point of extinction. Restrictive features of both paradigms have been relaxed and new paradigms have been developed.

The remainder of the paper unfolds as follows. In Sect. 2 we discuss the contrasting modes of development of SFA and DEA. In Sect. 3 we discuss the contributions of graduate students to the development of the field. In Sect. 4 we relate some unintended consequences of travel. In Sect. 5 we praise some unsung heroes, without whose inspired contributions our field would not be where it is today. The paper concludes with some observations on what a long strange trip it’s been.Footnote 4 We emphasize that our discussion throughout is anecdotal and personal, although we believe these anecdotes illustrate important truths. Other scholars will have their own, hopefully complementary, personal anecdotes.

2 Invention

The development of theory generally follows from what we observe around us. However, the development of SFA followed a very different path than that of DEA.

The developers of SFA observed a theoretical definition of a production function as a maximum concept (Koopmans 1957; Essay I). They also observed econometric estimates of production functions that intersected data sets, leaving some producers above, and others below, what was in theory a maximum function. The contradiction grated, and eventually drove the development of SFA. Data sets came later, usually (but not always) showing that “it worked,” although serious empirical applications were slow in coming.
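The specification that eventually resolved the contradiction is worth sketching here; this is a minimal textbook form, and the notation is ours rather than a reproduction of any particular paper:

$$
\ln y_i = x_i'\beta + v_i - u_i, \qquad v_i \sim N(0, \sigma_v^2), \qquad u_i \ge 0,
$$

where the symmetric noise component $v_i$ allows observations to lie on either side of the deterministic kernel $x_i'\beta$, while the one-sided component $u_i$ keeps each observation on or below its own stochastic frontier $x_i'\beta + v_i$. The frontier remains a maximum concept, yet the estimated function no longer has to pass above every data point.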

The developers of DEA followed the opposite path, using serious industrial applications to inspire the creation of analytical frameworks that eventually led to DEA. They call it “applications driven theory,” by which they mean “one starts with an actual application and sees it through to a successful conclusion. This guarantees relevance. One then generalizes what was done and publishes the result, as a test of its contribution to scientific knowledge.” This strategy, developed by Charnes and Cooper through the years, has an interesting history.

Bob Mellon, a former student of Cooper’s, went to work with the engineers at the Philadelphia refinery of the Gulf Oil Company, then the world’s largest producer of aviation grade gasolines. The engineers were seeking to improve their methods for producing gasolines; Mellon came to Cooper for help, and Cooper asked Charnes to join the team. This led to the first industrial application of linear programming, and it proved so successful that the engineers developed a computer (in that pre-computer age) and extended these developments to other refineries and other products at Gulf. The generalization then took form in Charnes et al. (1952).

Cooper was surprised and pleased by the subsequent developments, which took the form of numerous phone calls and letters from both sides of the then existing “iron curtain.” These inquiries were not restricted to oil companies. One of them, for instance, led to an application in the production of ball bearings at the Philadelphia plant of SKF Industries. This, too, came in response to an inquiry from one of Cooper’s former students, Bob Ferguson, who was employed by a consulting firm in Pittsburgh. The generalization was published as Charnes et al. (1953), which produced similar reactions and resulted in the birth of goal programming, which in turn led to important developments in DEA. This occurred as follows: Bob Ferguson and the consulting firm with which he was associated were also retained by the industrial appliances division of the General Electric Company to develop an executive compensation plan that would help GE retain younger members of its staff against offers from competing companies. The consulting firm had collected extensive data to which it applied statistical regression techniques to obtain a formula for setting these salaries. However, the results were unsatisfactory.

Ferguson came again to Charnes and Cooper, and the three of them developed goal programming, initially called “inequality constrained regression” for the following reasons. Although precise estimates of competitor salary offers were not available, upper and lower bounds could be estimated and represented as inequalities. Constraints could also be introduced to reflect the organization structure of this division of GE, so that the salaries of subordinates would not exceed the salaries of their superiors. All of this was accomplished by requiring the estimates to satisfy the corresponding systems of linear inequality constraints. Goal programming replaces least squares with the minimization of a sum of absolute deviations, which yields median rather than mean estimates, so it is not troubled by outliers and similar problems. It gives rise, however, to a nonlinear objective function, which the team was able to transform into an equivalent linear form, so the entire problem could be solved as a linear programming problem, as sketched below. This generalization was published as Charnes et al. (1955).
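To make the transformation concrete, here is a minimal sketch of how an absolute-deviations regression with side constraints becomes a linear program; the notation is ours, and the actual GE formulation contained additional application-specific constraints. Writing each residual $y_i - x_i'\beta$ as the difference of two nonnegative variables $e_i^{+} - e_i^{-}$,

$$
\min_{\beta,\; e^{+},\, e^{-}} \ \sum_{i} \bigl( e_i^{+} + e_i^{-} \bigr)
\quad \text{s.t.} \quad
x_i'\beta + e_i^{+} - e_i^{-} = y_i, \qquad e_i^{+},\, e_i^{-} \ge 0, \qquad A\beta \le b,
$$

where the linear system $A\beta \le b$ collects the bounds on competitor offers and the organizational hierarchy restrictions described above. At an optimum at most one of $e_i^{+}$ and $e_i^{-}$ is positive, so their sum equals $|y_i - x_i'\beta|$, and the nonlinear absolute-value objective has been replaced by a linear one.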

The effects of applications driven theory took additional forms. For instance, Charnes and Cooper were approached by Symonds, who was in charge of refinery research for the Standard Oil Company of New Jersey, then the largest producer of heating oil, a high-risk product the company described as “charged with a public interest.” The response to this problem was the development of chance constrained programming, which led to the publication of Charnes et al. (1958).

Thus far the applications had been industrial and instigated primarily by former students, and they led to the first industrial application of linear programming and to important extensions of it. Everything was leading to the final step from linear programming to DEA. However, the final application came 20 years later, was educational rather than industrial, and was instigated by a current, rather than a former, student. It built on the modeling developments described above to lead directly to the development of DEA. Because the student involved was current at the time, we defer discussion of this final step in the development of DEA to Sect. 3.2.

Lesson #1:

Necessity may be the mother of invention, but necessity took very different forms in the development of SFA (a theoretical contradiction) and DEA (a sequence of real world challenges).

3 Students

Graduate students can learn by doing as our research assistants, and eventually they provide an important conduit through which the gospel is spread. Farrell had none, and his influence spread slowly, while the developers of SFA and DEA were blessed, and their influence spread quickly.Footnote 5 However, graduate students can play a far more significant role than just spreading the gospel. We provide two examples in which graduate students have helped create the gospel.

3.1 Chapel Hill

Sydney Afriat (1972) published an influential (but too often overlooked) contribution to the field of efficiency analysis. With the benefit of hindsight we think of his paper as establishing a link between Farrell and both SFA and DEA. He specified linear programs that extended Farrell’s rudimentary program and anticipated those of CCR and BCC (Banker et al. 1984). He also introduced a beta distribution for technical efficiency, noting that “…a production function…, together with a probability distribution … of efficiency, is constructed so that the derived efficiencies … have maximum likelihood,” thereby anticipating SFA. He developed linear programs for the estimation of cost efficiency, and took note of its technical and allocative components. It took years for the rest of us to catch up with Afriat.Footnote 6

But Afriat was not working in a vacuum. In 1971 he was supervising two talented graduate students, Charles Geiss and Robert Dugger, at the University of North Carolina. Geiss had introduced Afriat to Farrell’s work, and developed computer programs to implement Farrell’s production model and extensions of it, and to test alternative restrictions on production models with their corresponding efficiencies. Both Geiss and Afriat presented their work at the 1971 Summer Meeting of the Econometric Society.Footnote 7

Soon thereafter Afriat left Chapel Hill for the cooler climes of Canada, bequeathing Dugger to Lovell, a gift for which they will be eternally grateful, and Geiss to another professor. Inheriting Dugger’s dissertation at about midpoint, Lovell struggled to hang on, but eventually the dissertation was approved. Along the way Dugger introduced Lovell to Farrell’s work, and the work of several subsequent pioneers ranging alphabetically from Afriat and Aigner to Sitorus and Timmer. In his dissertation Dugger wrote mathematical programs for DEA and free disposal hull (FDH) analysis (Deprins et al. 1984), and applied them to a panel data set having multiple inputs and multiple outputs. He estimated technical, allocative and cost efficiency, he tested hypotheses on the structure of technology, and he developed a sophisticated outlier detection technique. All this in 1974! It was quite a ride. We still wonder how Geiss and Dugger were familiar with Farrell’s work when Afriat and Lovell were not.

Shortly after Geiss and Dugger followed Afriat out of Chapel Hill, Peter Schmidt and Lovell were discussing Dugger’s dissertation. Lovell was thinking about its connection to the previous literature we mention in Sect. 1, and Schmidt was pondering its deterministic nature. Lovell suggested that he could clarify the link, and Schmidt conjectured that he could convert Dugger’s frontiers to stochastic frontiers. Both objectives were accomplished, and Schmidt and Lovell wrote a paper. In December 1975 Schmidt presented this paper at the ASSA meetings in Dallas, where he met Dennis Aigner, who was also presenting a paper. The two papers were nearly identical, and had been submitted to different journals. Both were withdrawn, easily merged, and resubmitted as what became the ALS paper. Imagine what might have happened had Afriat stayed in Chapel Hill, or had Aigner and Schmidt not met in Dallas.

Lesson #2:

Graduate students have made critical contributions to frontier analysis, and professors have reaped some of the accolades.

3.2 Pittsburgh

We return to the development of DEA, a tale that also involves a graduate student. Events unfolding in Pittsburgh just a few years later were eerily similar to those that occurred in Chapel Hill. Charnes and Cooper had met Farrell when he visited Carnegie Institute of Technology in the early 1950s, before he published his efficiency paper. Two decades later at Carnegie Mellon University, Cooper was the Dean of the School of Urban and Public Affairs (now the Heinz III School). He was approached by a student, Edwardo Rhodes, who wanted to write a dissertation based on the activities of Program Follow Through, a large federal government study aimed at extending Program Head Start, which was directed to educating disadvantaged students through grade three of elementary school. Rhodes and his committee had tried a wide variety of statistical methods with unsatisfactory results. Assuming the job of supervisor of the dissertation, Cooper had similar experiences.

One day Rhodes introduced Cooper to Farrell’s paper, which was based on Koopmans’ (1951) activity analysis and involved the use of numerous matrix inversions. Charnes and Cooper (1961) had previously shown that activity analysis could be formulated as a linear programming problem, which provided a much more efficient computational method. Cooper invited Charnes to join the committee, and Charnes showed how the model could be reformulated as a fractional programming model, which increased the interpretive power of the model by showing how it generalized the customary output-to-input definition of efficiency used in science and engineering. Here, too, Charnes and Cooper had shown that the generalization of this ratio form could be transformed into an equivalent linear programming problem, as sketched below. Finally, of course, this transformation provided access to the duality theory of linear programming. This collaboration led to a dissertation and the CCR paper, the most highly cited paper in the field, and inaugurated DEA. Once again, the student enlightened the supervisor(s). How Rhodes knew Farrell’s work when Charnes and Cooper did not remains a mystery.
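For concreteness, here is a minimal sketch of the ratio form and its linearization, using the standard notation of the later DEA literature rather than a reproduction of the dissertation. For the unit under evaluation (subscript 0), with outputs $y_{rj}$ and inputs $x_{ij}$ for units $j = 1, \dots, n$, the fractional program is

$$
\max_{u,\,v} \ \frac{\sum_r u_r y_{r0}}{\sum_i v_i x_{i0}}
\quad \text{s.t.} \quad
\frac{\sum_r u_r y_{rj}}{\sum_i v_i x_{ij}} \le 1 \ \ (j = 1,\dots,n), \qquad u_r,\, v_i \ge 0.
$$

The Charnes–Cooper transformation $t = (\sum_i v_i x_{i0})^{-1}$, $\mu = t u$, $\nu = t v$ converts this into the equivalent linear program

$$
\max_{\mu,\,\nu} \ \sum_r \mu_r y_{r0}
\quad \text{s.t.} \quad
\sum_i \nu_i x_{i0} = 1, \qquad
\sum_r \mu_r y_{rj} - \sum_i \nu_i x_{ij} \le 0 \ \ (j = 1,\dots,n), \qquad \mu_r,\, \nu_i \ge 0,
$$

whose dual is the envelopment form, which is where the duality theory mentioned above enters.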

Lesson #3:

see Lesson #2.

4 International travel

Travel is rewarding in many ways; we visit interesting destinations, we encounter new cultures, we can be tourists as well as scholars, and we make new acquaintances. Of more relevance in this context, travel, especially of the international sort, provides a second important conduit through which the gospel is spread, or even created. We provide three anecdotes in which international travel has helped create and spread the gospel. Once again we begin with Farrell.

4.1 Farrell

Early in his career Farrell travelled widely, most significantly to the Cowles Commission, then at the University of Chicago, to the Carnegie Institute of Technology, and to the University of California at Berkeley. Farrell accumulated human capital at Cowles and disseminated it at Berkeley. At Cowles Farrell met Gerard Debreu (and his dead loss function and his coefficient of resource utilization) and Koopmans (and his activity analysis and his efficient allocation of resources).Footnote 8 We have noted that Farrell’s impact in Pittsburgh took nearly 20 years to germinate, and we do not know what impact Pittsburgh had on Farrell. As for Berkeley, again we know no details, but Farrell’s imprint was all over the San Francisco Bay area in the late 1960s. We do not know exactly how Farrell’s imprint spread, but it clearly went through the Giannini Foundation of Agricultural Economics, and it made its first public appearance at the 39th annual meeting of the Western Farm Economics Association, published the following year as Boles (1967), Bressler (1967), Seitz (1967) and Sitorus (1967). Farrell’s model provided the backdrop for each paper. Boles discussed computational issues. Bressler discussed cost efficiency and its technical and allocative components. Both discussed relaxing Farrell’s constant returns to scale restriction. Seitz applied the model to electricity generation, incorporating technical change, and discussed how to incorporate characteristics of the operating environment into the analysis. Sitorus applied the model to traditional agriculture, incorporating weak disposability of “redundant factors of production,” thereby anticipating a large and growing literature that uses frontier techniques to incorporate environmental disamenities into productivity and shadow pricing analyses. Shortly thereafter the Ford Foundation Program for Research in University Administration at Berkeley published reports summarizing the results of applying Farrell’s model, as extended and implemented by the Berkeley agricultural economists, to a large sample of higher education institutions; a good example is Carlson (1972), who relies heavily on the work of Boles, Bressler, Seitz and Sitorus.Footnote 9

Lesson #4:

Travel allows us to amass human capital, and to disseminate its fruits.

4.2 Schmidt and Lovell

On a trip to Washington DC in September 1980 Schmidt visited the Center for Naval Analyses just across the Potomac. In a lecture on frontiers Schmidt said that in SFA we can estimate the sample mean inefficiency E(u), but not inefficiency for individual firms E(u_i). James Jondrow was in the audience, and suggested calculating the conditional mean E(u_i | v_i + u_i) as an estimator of individual firm inefficiency. Exactly 12 days later, on a somewhat longer trip to already snowbound Moscow, Lovell visited the Central Economics and Mathematics Institute of the Soviet Academy of Sciences. He gave a lecture on frontiers in which he said the same thing: in SFA we can estimate the sample mean inefficiency, and although it certainly was desirable to be able to estimate the inefficiency of each observation in the sample, it was not possible to do so. After the lecture a student, Ivan Materov, approached him and said it was indeed possible, and showed him a derivation of E(u_i | v_i + u_i), and a simpler derivation of the conditional mode, M(u_i | v_i + u_i), which he preferred because of its appealing interpretation as a maximum likelihood estimator. When Lovell returned to the US, Materov’s computer printout in tow, he called Schmidt. Not long thereafter Jondrow et al. (1982) appeared, a giant leap forward for SFA. Who knows how long it would have taken them to see the light had they not travelled and had their erroneous claims corrected?
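For readers who want the formulas, here is a minimal sketch of the two estimators under the normal–half normal assumptions of ALS, written with the production-frontier composed error $\varepsilon_i = v_i - u_i$ (the cost-frontier convention simply reverses the sign of $\varepsilon_i$). With $\sigma^2 = \sigma_u^2 + \sigma_v^2$, $\sigma_*^2 = \sigma_u^2 \sigma_v^2 / \sigma^2$ and $\mu_{*i} = -\sigma_u^2 \varepsilon_i / \sigma^2$, the distribution of $u_i$ conditional on $\varepsilon_i$ is normal truncated at zero, and

$$
E(u_i \mid \varepsilon_i) = \mu_{*i} + \sigma_* \, \frac{\phi(\mu_{*i}/\sigma_*)}{\Phi(\mu_{*i}/\sigma_*)},
\qquad
M(u_i \mid \varepsilon_i) = \max\{0,\ \mu_{*i}\},
$$

where $\phi$ and $\Phi$ denote the standard normal density and distribution functions. The conditional mode is the estimator Materov preferred for its maximum likelihood interpretation.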

Lesson #5:

See lessons #2 and #4.

4.3 The Grossherzog Friedrich roundtable

The third travel tale is less significant, but more colorful, than the first two. While attending an international conference in Karlsruhe, West Germany, in 1980, Lovell witnessed two economists arguing in a bar, in the evening after the formal sessions had concluded and the real work had just begun. A young American, Raymond Kopp, with a freshly-minted PhD, was trying to convince his older Canadian colleague, Erwin Diewert, of a proposition concerning the decomposition of cost efficiency obtained from a translog cost frontier into its technical and allocative components. Diewert explained that this proposition was false. The argument lasted a week, in the presence of a bevy of witnesses, and was documented with scribbled diagrams and equations on beer-soaked bar napkins. The eventual outcome was the publication of the (correct) proposition in Kopp and Diewert (1982), and the proposition has become influential.

Lesson #6:

See lesson #4.

5 Unsung heroes

In the half century since Farrell’s original contribution, the field has grown immensely, thanks to the contributions of a number of scholars who as a result have become well known. However, the field also has a number of unsung heroes, whose contributions have not been sufficiently recognized. We mention two in particular.

5.1 Doğramaci

Ali Doğramaci (with the assistance of Nabil Adam) organized a biennial series of “Conferences on Current Issues in Productivity” at the Rutgers University Business School in Newark, NJ, beginning in 1979 and concluding in 1991. The “Current Issues” part of the title suggests, accurately, an objective of including speakers from government and industry as well as academe. Indeed this conference, though relatively small, was as diverse as any we can recall attending. Participants at the conferences we managed to attend included a winner of the Nobel Prize in Economic Sciences (Wassily Leontief), a future Chairman of the Board of Governors of the US Federal Reserve System (Ben Bernanke), distinguished productivity scholars such as Solomon Fabricant, Irving Kravis, Bela Gold, Yair Mundlak, Zvi Griliches, Dale Jorgenson, Sydney Afriat and Paul Romer, representatives from government (USDA, GAO, Department of Commerce, BLS, Census Bureau, various Federal Reserve Banks) and industry (AT&T, International Paper Co.), scholars from overseas (Canada, West Germany, France, Belgium, the Netherlands, Sweden, Norway, Poland, Chile, Australia, New Zealand, Japan, China) and, last but not least, graduate students. These conferences also brought together practitioners of the SFA and DEA arts, marking the first large-scale crossings of our paths. Just imagine the networking and subsequent collaborations, not to mention the ubiquitous Portuguese cuisine.

In addition to bringing us together, Doğramaci made a seminal contribution during these conferences. At the 1987 conference (and perhaps 1985 as well) he convened informal gatherings (including Zachary Rolnik of Kluwer Academic Publishers, Julien van den Broeck of the University of Antwerp, and a few others) with the objective of establishing a journal devoted to productivity analysis. You are reading that journal, the first issue of which appeared in April 1989 with Doğramaci as Editor-in-Chief.Footnote 10

5.2 Lewin

Arie Lewin made two suggestions to Lovell during the 1980s. Both had lasting impacts on his career, and one had a far wider impact. In 1980 Lewin suggested to Lovell that he present an SFA paper in a DEA track at the 1981 TIMS/ORSA meetings in Toronto. Lovell accepted, and there he met Charnes and Cooper for the first time, the first of whom mounted a vociferous attack on his presentation, as he was known to do. Lovell’s introduction to the world of DEA and TIMS/ORSA (now INFORMS) was memorable. The discomfort notwithstanding, that experience was the beginning of two lasting friendships. A few years later Lewin suggested to Lovell that he host a conference on “Parametric and Nonparametric Approaches to Frontier Analysis” at the University of North Carolina. Lewin secured funding from the US National Science Foundation, and he and Lovell organized the conference that brought together about 50 participants from the SFA and DEA research communities. The conference, held in 1988, was a big success, and we have been talking to each other ever since. The contributed papers were refereed, and the survivors were published as Lewin and Lovell (1990).Footnote 11

5.3 Others

Both Doğramaci and Lewin have funded, organized and hosted large international conferences. So have those behind the European Workshop on Efficiency and Productivity Analysis, the North American Productivity Workshop, and the Asia–Pacific Productivity Conference, and each of their predecessors in Louvain-la-Neuve, Athens and Taipei. Wolfgang Eichhorn hosted a series of endurance affairs at the University of Karlsruhe, and George Kozmetsky hosted conferences at the IC2 Institute at the University of Texas at Austin. Every one of these conferences has generated enormous benefits, most of which have been external to the hosts, who are unsung heroes.

Lesson #7:

We have been recipients of externalities generated by conference organizers and journal editors willing to provide forums for the exchange and development of human capital.

6 What a long strange trip it’s been

It has indeed been a long, and occasionally strange, trip. We have looked back on the trip with the unusual objective of highlighting the importance of modes of invention, and the contributions of graduate students, international travel, and some unsung heroes. Along the way we have noted what, in our opinion, have been the pivotal people, events and publications. Our genealogy appears in Fig. 1, in which, unlike nature’s trees, our family tree starts at the top rather than the bottom. Farrell acknowledged the influence of Koopmans (1951) (and a similarity of his work with that of Debreu (1951)), and the Giannini writers acknowledged the influence of Farrell, although almost no one has acknowledged the influence of the Giannini writers. Farrell’s influence on Afriat and Charnes and Cooper was indirect, being routed through graduate students in each case. Afriat’s influence on ALS was also indirect, the link again provided by graduate students. Finally, M&B were influenced by Afriat, but again indirectly, through Richmond (1974). We leave to the reader the challenge of creating the next row (branch? ring?) of our family tree.

Fig. 1 Genealogy

We have succeeded in spreading the gospel; efficiency papers have appeared in all of the top journals within our field. However, we are far more impressed by the spread of the gospel far beyond our field. Efficiency papers have appeared in outlets as varied as Journal of Grey System, Veterinarni Medicina, New Medit, Waste Management, Mitigation and Adaptation Strategies for Global Change, Anatolia, Evaluation and Program Planning, Journal of Applied Electrochemistry, International Journal of Environment and Pollution, Journal of Applied Animal Research and Austral Ecology, to mention just a few. Researchers are using our gospel to analyze topics such as Asian brown clouds, the tapir in Honduras, the moment of inertia and the radius of gyration, crown shyness for lodgepole pines, allometry of reproduction, and spousal influence on time–space prism vertices. But the gospel is also being used to address such important policy issues as discriminatory hiring, water pricing, multidimensional poverty, environmental degradation, corruption, HIV immunology, and the regulation of providers of public services such as health care, education, transport, and various utilities. Such widespread diffusion of the gospel is gratifying.

Having provided seven lessons to be learned from our brief anecdotal history, we conclude with a pair of empirical observations that generate two lessons yet to be learned.

Lesson #8:

Citations to DEA foundation papers outnumber citations to SFA foundation papers by a ratio of 2:1, in both SSCI and Google Scholar. What lesson is to be learned from this fact, and by whom?

Lesson #9:

The BLS (2009) and the OECD (2001) both state, unequivocally, that productivity growth has several drivers, one of which is efficiency improvements. Yet one area to which the gospel has not spread is what, for lack of a better term, we call conventional growth accounting and index number productivity analysis. The widespread praise for, and equally widespread reliance on, superlative indexes illustrates our point. A superlative index provides a “good” approximation to its theoretical counterpart, but only under certain conditions, one of which is the efficient allocation of resources in each time period. Yet this requirement is rarely noted and, to the best of our knowledge, has not been tested in the productivity index field, despite the fact that it is routinely rejected in efficiency studies. What lesson is to be learned from this fact, and by whom?
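As an illustration of the requirement, consider the Törnqvist quantity index, a standard example of a superlative index (the notation here is ours):

$$
\ln\frac{Q_1}{Q_0} = \sum_{n} \tfrac{1}{2}\bigl(s_{n0} + s_{n1}\bigr)\ln\frac{q_{n1}}{q_{n0}},
\qquad s_{nt} = \frac{p_{nt}\, q_{nt}}{\sum_m p_{mt}\, q_{mt}}.
$$

The exactness result that makes this index superlative equates the observed value shares $s_{nt}$ with the optimal shares implied by a translog aggregator; if resources are allocated inefficiently in either period, that equality fails and the approximation is no longer guaranteed to be “good.”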