Optimization models were placed on a high pedestal in finance with the seminal thesis of Harry Markowitz on mean-variance portfolio optimization; the thesis was written in 1950–1951 and published as a monograph in 1959 (Markowitz 1959). Markowitz’s work provided the basis for a descriptive theory of portfolio choice: how investors make decisions. This led to further research in financial economics, with the development of a theory of price formation for financial assets (by William Sharpe) and of corporate finance, covering taxation, bankruptcy, and dividend policies (by Merton Miller). These descriptive contributions on the behavior of financial agents were recognized by a joint Nobel Prize in 1990. The prescriptive part of the theory, how investors should make optimal decisions, was also acclaimed by practitioners, and mean-variance models proliferated. The book by Zenios (2007) gives the state of the art in practical portfolio optimization models developed over the fifty years since Markowitz’s contribution.

During this period, problems surfaced: mean-variance portfolio optimization was found to be sensitive to perturbations of the input data (Best and Grauer 1991). Since the estimation of market parameters is error-prone, the models are severely handicapped: in theory they produce well-diversified portfolios, but in practice they generate portfolios biased toward estimation errors. With advances in financial engineering, variance was replaced by more sophisticated risk measures. Value-at-risk (VaR) became an industry standard, written into the Basel II and III accords for calculating capital adequacy and used to set insurance premiums and margin requirements. Although VaR is computationally intractable to optimize, practitioners ignored this handicap since regulators do not require optimization of the risk metric. In a seminal paper, Artzner et al. (1999) provided an axiomatic characterization of coherent risk measures, and conditional value-at-risk (CVaR) emerged as one such measure. CVaR rose to prominence with the work of Rockafellar and Uryasev (2000), who showed that it can be minimized as a linear program. CVaR optimization emerged as a credible successor to mean-variance models: it is coherent and computationally tractable, and it has found numerous practical applications.
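To illustrate the tractability result, the Rockafellar–Uryasev formulation minimizes CVaR over return scenarios as a linear program, with one auxiliary variable per scenario. The sketch below is a minimal illustration, not any cited author's implementation: the simulated returns, the 95% confidence level, and the long-only budget constraint are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
S, n = 500, 4                                   # scenarios, assets
returns = rng.normal(0.001, 0.02, size=(S, n))  # synthetic scenario returns
beta = 0.95                                     # CVaR confidence level

# Decision vector z = [x (n weights), alpha (VaR level), u (S auxiliaries)].
# Objective: alpha + 1/((1-beta)*S) * sum_s u_s
c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1 - beta) * S))])

# Linearized constraints u_s >= loss_s - alpha, with loss_s = -r_s . x:
# -r_s . x - alpha - u_s <= 0
A_ub = np.hstack([-returns, -np.ones((S, 1)), -np.eye(S)])
b_ub = np.zeros(S)

# Budget constraint: portfolio weights sum to one
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)
b_eq = np.array([1.0])

# Long-only weights, free alpha, nonnegative auxiliaries
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights = res.x[:n]
print("CVaR-optimal weights:", np.round(weights, 3), "CVaR:", round(res.fun, 4))
```

The auxiliary variables u_s capture the scenario losses exceeding the threshold alpha, which is what turns the otherwise nonsmooth CVaR objective into a linear program.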

With these developments, the idea of optimizing some risk measure to obtain well-diversified portfolios remains as much at the core of financial engineering now as it was in the 1950s. But as risk measures evolve and data (high-frequency, real-time, structured, or unstructured big data) proliferate, so does the need for optimization models. Research focuses on developing optimization models for the appropriate risk measure, devising efficient algorithms that exploit the typically sparse structure of the application, and reducing dimensionality. Integrating the optimization model with data estimation procedures and dealing with data ambiguity through robust optimization, using the frameworks of either Ben-Tal et al. (2009) or Mulvey et al. (1995), is also fertile ground for research with both theoretical rigor and practical relevance. We can think of these applications as micro-financial engineering: dealing with a single portfolio.

Macro-financial engineering focuses on enterprise risk optimization, i.e., optimizing the risk exposure of an enterprise as a whole instead of one portfolio at a time. This approach requires multiple portfolios to be optimized simultaneously, in conjunction with the associated liabilities, and in the context of the policy and regulatory constraints imposed on the institution. The resulting large-scale optimization models need to balance abstraction with realism, to demonstrate their practical viability, to generate new insights into enterprise risk management practices, and to be efficient. Optimization models for planning under uncertainty, dating back to the pioneering work of Dantzig (1963), have played a key role in such applications. The two-volume handbook by Zenios and Ziemba (2007) gives a good overview of the state of the art and highlights its significance to practitioners and scholars alike.
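As a deliberately minimal illustration of planning under uncertainty in this spirit, a two-stage stochastic program can allocate a budget across assets today and penalize the expected shortfall against a known liability across tomorrow's return scenarios. All numbers below are synthetic, and real asset-liability models are multistage with far richer constraints.

```python
import numpy as np
from scipy.optimize import linprog

budget, liability = 100.0, 104.0
scenario_returns = np.array([[0.08, 0.02],    # asset returns per scenario
                             [0.01, 0.03],
                             [-0.05, 0.03]])
probs = np.array([0.3, 0.4, 0.3])             # scenario probabilities
S, n = scenario_returns.shape

# Decision vector z = [x (n allocations), u (S shortfall variables)]
c = np.concatenate([np.zeros(n), probs])      # minimize expected shortfall

# Shortfall definition u_s >= liability - sum_i (1 + r_si) x_i,
# rewritten as -(1 + r_s) . x - u_s <= -liability
A_ub = np.hstack([-(1.0 + scenario_returns), -np.eye(S)])
b_ub = np.full(S, -liability)

# First-stage budget constraint: allocations sum to the budget
A_eq = np.concatenate([np.ones(n), np.zeros(S)]).reshape(1, -1)
b_eq = np.array([budget])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n + S))
alloc = res.x[:n]
print("allocations:", np.round(alloc, 2),
      "expected shortfall:", round(res.fun, 4))
```

The scenario-indexed recourse variables u_s are what make this a (two-stage) stochastic program rather than a deterministic one; multistage models extend the same idea over a scenario tree.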

But the use of optimization models is not restricted to portfolio optimization at the micro level or enterprise risk management at the macro level. Another broad class of problems deals with the use of such models to design novel financial instruments and price them in incomplete markets, or as tools for calibrating model parameters to observed market data. For example, the calibration of term structure models is crucial in financial engineering. Calibration is often cast as an optimization problem, where the challenge is to develop models that provide good predictions of relevant quantities, such as future interest rates, while agreeing with current market rates.
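As a minimal sketch of such a calibration, the example below fits the Nelson–Siegel yield-curve model to hypothetical market yields by least squares. The maturities and yields are illustrative inventions, and the decay parameter is held fixed to keep the problem linear; in practice it is optimized as well, making the calibration a nonlinear least-squares problem.

```python
import numpy as np

# Hypothetical observed market yields at given maturities (years)
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 20, 30])
market_yields = np.array([0.010, 0.012, 0.015, 0.019, 0.022,
                          0.026, 0.028, 0.030, 0.032, 0.033])

lam = 1.8  # fixed decay parameter (assumed; normally calibrated too)

# Nelson-Siegel basis functions: level, slope, curvature
t = maturities / lam
slope = (1 - np.exp(-t)) / t
curvature = slope - np.exp(-t)
X = np.column_stack([np.ones_like(t), slope, curvature])

# Calibration as least-squares optimization: fit betas to market data
betas, *_ = np.linalg.lstsq(X, market_yields, rcond=None)
fitted = X @ betas
rmse = np.sqrt(np.mean((fitted - market_yields) ** 2))
print("betas:", np.round(betas, 4), "RMSE:", round(rmse, 5))
```

A small in-sample error indicates agreement with current market rates; the harder part of the challenge described above, good out-of-sample prediction, is not tested by a fit like this.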

Finally, optimization models have been attracting attention in the context of sovereign debt issues. We can think of this as a macro approach as well, viewing a sovereign as an enterprise with assets and liabilities. The Bank of Canada has been a pioneer in sovereign debt management, and it has been calling for an optimization framework (Bolder and Rubin 2011). Wright (2012) discusses sovereign debt restructuring for crisis countries. There is much ongoing work in this area, and it has intensified in the aftermath of the sovereign debt crisis in the eurozone. The economists and legal experts studying this problem often refer to “optimization” in quotes because it is not clear at the outset what needs to be optimized or even if there is a consensus optimization criterion among policy-makers. In this context, optimization models can be viewed more as a framework to structure the decision-making process than as a decision-making algorithm per se.

With these developments in mind, we are launching with this special issue the new area of Financial Engineering for Optimization and Engineering. It will include high-quality theoretical and applied papers on financial engineering with an optimization engine. Theoretical papers may deal with the development of optimization-based methods for financial engineering needs, innovations in optimization theory motivated by financial engineering problems, engineering advances made possible using optimization theory, or investigations into the performance of markets using optimization tools. Applied papers may deal with the use of optimization methods to address problems of broad practical significance faced by commercial banks, investment banks, insurance agencies, hedge funds, corporations, and sovereigns. Case studies may deal with the deployment of optimization-based decision-support systems in the finance industry, or the use of optimization models and theory in a practical setting to advance the practice of financial engineering.

We also welcome papers from practitioners and researchers that explain the needs of financial engineering to the optimization community. Such papers must address broad problem areas and not focus on narrow or well-understood problems.

Recognizing that financial engineering has produced great success stories in the development of finance but is also held responsible for some of its greatest excesses, we will publish case studies of both success and failure that provide deep insights into the benefits and pitfalls of optimization in this area.

All submissions to this special issue went through OPTE’s usual refereeing process, being independently evaluated for their relevance to financial engineering and their rigorous use of mathematical optimization. The resulting collection, we are pleased to say, gives a broad representation of the topics outlined above.

In A Multistage Stochastic Programming Asset-Liability Management Model: An Application to the Brazilian Pension Fund Industry, de Oliveira, Filomena, Perlin, Lejeune, and Ribeiro de Macedo develop a stochastic programming model to generate dynamic investment decisions to meet liabilities over time arising from defined benefit pensions in Brazil. The authors consider solvency constraints in the form of a probabilistic constraint that enforces a VaR requirement. Multiple scenario trees are generated using appropriate stochastic differential equations, and resampling is used to finalize investment decisions. In low-interest-rate environments the models suggest that pension fund managers will have to increase investments in risky assets or increase contributions from pension members.

The problem of dynamic portfolio optimization is also considered in Dynamic Portfolio Choice: A Simulation-and-Regression Approach by Delage, Denault, and Simonato. A discrete-time approximate dynamic programming approach is used, where simulation and regression of the portfolio weights are considered instead of exclusively using regression for state variables such as the expected returns of assets. The proposed framework helps to mitigate the curse of dimensionality and is able to handle some important cases of nondifferentiable utility such as CVaR.

Index tracking is a well-known and important investment strategy that aims to closely follow a given benchmark portfolio. In weakly efficient markets, enhanced indexing may achieve better returns than passively holding an index. Sharma, Agrawal, and Mehra in Enhanced Indexing for Risk Averse Investors Using Relaxed Second Order Stochastic Dominance consider a linear programming formulation based on relaxed second-order stochastic dominance constraints, employing underachievement and overachievement variables to achieve higher expected returns than an index. The model produces portfolios with higher expected excess returns and better efficiency than the strategies of minimizing the maximum underdeviation from an index, almost-second-order stochastic dominance models, and equally weighted portfolios.

Kwon and Wu in Factor-Based Robust Index Tracking consider a cardinality-constrained index-tracking model that seeks to maximize expected return subject to limits on risk and tracking error. The model is an extension of mean-variance optimization. To mitigate sensitivity to the parameter estimation, a robust factor model is developed whereby uncertainty sets for the expected returns and the factor-loading matrix are generated. The resulting robust model is a second-order conic integer program. Computational results based on tracking the S&P 100 show the advantages of the robust-factor-based model over the nonrobust approach.

Robust optimization can generate portfolios that are immune to noise and uncertainty in the parameters. However, such portfolios can be overly conservative. In Adjusted Robust Mean-Value-at-Risk Model: Less Conservative Robust Portfolios, Lotfi, Salahi, and Mehrdoust consider mean-VaR portfolio optimization using parametric methods to compute the VaR. To reduce estimation error, the authors formulate tractable adjustable robust versions of the problem that are shown to be robust in terms of both solution and structure without being too conservative.

Smirnov, Lapshin, and Kurbangaleev in Deriving Implied Risk-Free Interest Rates from Bond and CDS Quotes: A Model-Independent Approach consider a broad framework for generating a market-consistent term structure of risk-free interest rates (and the term structure of issuer-specific hazard rates) motivated by the need to avoid the use of credit agency ratings. Sovereign bond and credit default swap data are used, and the consistency of the data is checked through a sequence of linear optimization problems that aim to uphold no-arbitrage-like conditions. An advantage of the linear optimization approach is that the framework becomes model-independent. Once the data are consistent, a term structure of risk-free rates can be fitted using any valid model, such as the Nelson–Siegel model. The authors illustrate the advantages of their framework in constructing the eurozone risk-free curve.

Finally, Consiglio and Zenios in Stochastic Debt Sustainability Analysis for Sovereigns and the Scope for Optimization Modeling discuss debt sustainability analysis for sovereigns and outline the potential for optimization. The authors draw upon their experience with EU treasury departments, central banks, public debt management offices, and supranational institutions to discuss the challenges and opportunities in modeling problems of sovereign debt management and restructuring in crisis countries. Using a Bank of Canada database, they show that up to half of the world’s sovereigns have been afflicted by debt crises, with the amounts involved in the hundreds of billions of USD. They discuss the well-documented failures of existing methods of analysis. Optimization modeling can address many angles of the problem, and several open issues remain to challenge researchers.