# Options (New Perspectives)

**DOI:** https://doi.org/10.1057/978-1-349-95189-5_2553

## Abstract

This article provides an overview of risk-neutral valuation methodology and presents historical milestones in the development of quantitative finance. It also discusses current challenges and new perspectives in model choice, pricing and hedging.

## Keywords

Arbitrage; Bachelier, L.; Brace–Gatarek–Musiela model; Barrier options; Call options; Continuous-time models; Copulas; Collateralized debt obligations; Credit default swaps; Credit risk; Derivatives; Exotics; Heath–Jarrow–Morton model; Hedging; Incomplete markets; Libor; Martingales; Model calibration; Model specification; Option valuation; Options; Real options; Reduced-form models of default; Risk measures; Risk-neutral pricing; Risk-neutral valuation; Stochastic integration theory; Structural models of default; Swap market models; Term structure models; Value at risk; Vanilla options; Volatility smile; Yield curve

## JEL Classifications

G1

In 1973, Black, Scholes and Merton developed a method for the valuation of a European option based on the idea of perfect replication of its payoff. Their approach demonstrates how to act in an uncertain environment so that relevant risks are controlled. Around the same time, trading of options on common stocks started on the Chicago Board Options Exchange. Theory met practice, and an exciting and fruitful journey started at the crossroads of economics, finance and mathematics. Its impact was phenomenal in both academia and industry. New areas of research were created, and numerous educational and training activities were established. The derivatives market grew at an unprecedented rate and influenced the development of other markets. Complex mathematical modelling and technical sophistication, predominant elements in theory and applications in engineering and natural sciences, now entered the theory and practice of finance. This was not the first time that stochastic modelling touched finance. At the beginning of the twentieth century, in his pioneering doctoral work, Bachelier (1900) proposed a stochastic model for stock prices based on normality assumptions on their returns. In many aspects, however, his work was ahead of its time and had no impact for years to come.

Black and Scholes (1973) considered a European call option written on a stock, with a given strike, *K*, and a given maturity, *T*. Their model, powerful and simple, assumed a liquid market environment consisting of a non-defaultable bond and a stock. The bond yields a constant interest rate, *r*, while the stock price, *S*_{t}, is modelled as a log-normal diffusion process having constant mean rate of return, *μ*, and volatility parameter, *σ*. Applying Itô’s formula – a fundamental result of modern stochastic calculus – they were able to build a dynamic self-financing portfolio, (*α*_{t}, *β*_{t}), 0 ≤ *t* ≤ *T*, that replicates the option payoff, that is, for which *α*_{T} + *β*_{T} = (*S*_{T} − *K*)^{+}. For all *t*, the option price, *ν*_{t}, is then given by the current portfolio value, *ν*_{t} = *α*_{t} + *β*_{t}. Stochastic and differential arguments yield the price process representation *ν*_{t} = *C*(*S*_{t}, *t*), with the function *C* satisfying the partial differential equation

$$ \frac{\partial C}{\partial t}+\frac{1}{2}{\sigma}^2{S}^2\frac{\partial^2 C}{\partial {S}^2}+ rS\frac{\partial C}{\partial S}- rC=0, $$

with terminal condition *C*(*S*, *T*) = (*S* − *K*)^{+}. The components of the replicating portfolio turn out to be *α*_{t} = *S*_{t}*C*_{S}(*S*_{t}, *t*) and *β*_{t} = *C*(*S*_{t}, *t*) − *α*_{t}, representing the amounts invested, respectively, in the stock and bond.

The construction of the price and hedging policies, as well as the specification of various sensitivity indices (greeks), thus amount to solving linear partial differential equations, a relatively easy task given the existing technical body in mathematical analysis.
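For the call payoff, the pricing equation admits the well-known closed-form solution, the Black–Scholes formula. A minimal sketch of the price and of the hedge ratio *C*_{S} (the parameter values are illustrative, not from the article):

```python
# Black–Scholes call price C(S, t) and its delta C_S, the hedge ratio that
# determines the stock holding of the replicating portfolio.
from math import log, sqrt, exp
from statistics import NormalDist

N = NormalDist().cdf  # standard normal cumulative distribution function

def bs_call(spot, strike, r, sigma, tau):
    """Black–Scholes price of a European call; tau = T - t is time to maturity."""
    d1 = (log(spot / strike) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    return spot * N(d1) - strike * exp(-r * tau) * N(d2)

def bs_delta(spot, strike, r, sigma, tau):
    """C_S: number of shares held in the replicating portfolio."""
    d1 = (log(spot / strike) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    return N(d1)

price = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)   # ≈ 10.45
delta = bs_delta(100.0, 100.0, 0.05, 0.2, 1.0)
```

The dollar amount *α*_{t} = *S*_{t}*C*_{S} invested in the stock is then `spot * delta`.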

The industry rapidly adopted the Black and Scholes model as a standard for the valuation of simple (vanilla) options. Soon after, more complex products were created and traded, like options on fixed-income securities, currencies, indices and commodities. Gradually, the options market experienced great growth and its liquidity reached very high levels (for a concise exposition see, for example, Musiela and Rutkowski 2005).

In parallel, substantial advances in research took place. In 1979, Harrison and Kreps laid the foundations for the development of the risk-neutral pricing theory. They created a direct link between derivative valuation and martingale theory. For a finite number of traded securities and under general assumptions on their price processes and related payoffs, they established that the price of a replicable contingent claim corresponds to the expected value, calculated under the risk-neutral probability, of the (discounted) claim’s payoff. These results were further developed and presented by Harrison and Pliska (1981). In the years that followed, the theory was extended and a model-independent approach for pricing and risk management emerged. In a generic derivatives model, the (discounted) prices of primary assets are represented by a vector-valued semi-martingale \( {\boldsymbol{S}}_s=\left({S}_s^1,...,{S}_s^m\right) \), defined on a probability space (Ω, $\mathrm{F}$, (${\mathrm{F}}_{t}$), ℙ) where ℙ is the historical measure. The (discounted) payoff, *C*_{T}, is taken to be an ${\mathrm{F}}_{T}$-measurable random variable.

The model is free of arbitrage if there exists a probability measure ℚ, equivalent to ℙ, under which **S** is a martingale, that is, *E*_{ℚ}(*S*_{s}|${\mathrm{F}}_{t}$) = *S*_{t}, *t* ≤ *s* ≤ *T*. The price, *ν*_{t}, of a derivative with payoff *C*_{T} is then given by the conditional expectation

$$ {\nu}_t={E}_{\mathrm{\mathbb{Q}}}\left({C}_T\kern0.1em |\kern0.1em {\mathrm{F}}_t\right).\kern2em (2) $$

The derivative prices, themselves martingales under ℚ, are linear with respect to their payoffs, time and numeraire consistent, and independent of their holder’s risk preferences.

Fundamental questions in risk-neutral valuation are related to existence and uniqueness of the derivative price. Uniqueness turns out to be equivalent to the replicability of all claims in the market. Such a market is classified as complete. Stochastic integration theory was used to establish that market completeness is equivalent to uniqueness of the risk-neutral martingale measure ℚ. In this case, the price is given by (2) and, thus, exists and is unique. If, however, the market is not complete there is multiplicity of equivalent martingale measures. In this case, perfect replication is abandoned and absence of arbitrage becomes the key requirement for price specification and model choice. In an arbitrage-free model, a judicious choice of the pricing measure is made and the price is still represented as in (2). In many aspects, market completeness and absence of arbitrage are complementary concepts. Their relationship has been extensively studied with the use of martingale theory and functional analysis. Important results in this direction are formulated in the First and Second Fundamental Theorems of Asset Pricing (see, among others, Bjork 2004; Delbaen and Schachermayer 2006).
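Once the dynamics under ℚ are specified – say, the log-normal diffusion of the Black and Scholes model – the expectation in (2) can be evaluated by Monte Carlo simulation. A minimal sketch, with illustrative parameter values:

```python
# Risk-neutral Monte Carlo valuation of a European call under geometric
# Brownian motion: under Q, S_T = S_0 exp((r - sigma^2/2)T + sigma sqrt(T) Z).
import numpy as np

rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n = 1_000_000

Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
price = np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

# Martingale check: the discounted terminal price averages back to S_0.
discounted_mean = np.exp(-r * T) * ST.mean()
```

The estimate converges to the Black–Scholes price as the number of paths grows; the martingale check illustrates the defining property of ℚ.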

The risk-neutral valuation theory, built on a surprising fit between stochastic calculus and quantitative needs, revolutionized the derivatives industry. But its impact did not stop there. Because the theory provides a universal approach to price and manage risks, the option pricing methodology has been applied in an array of applications. Indeed, corporate and non-corporate agreements have been analysed from an options perspective. Option techniques have also been applied to the valuation of pension funds, government loan guarantees and insurance plans. In a different direction, applications of the theory resulted in a substantial growth of the fields of real options and decision analysis. Complex issues related, for example, to operational efficiency, financial flexibility, contracting, and initiation and execution of research and development projects were revisited and analysed using derivative valuation arguments (see the review article of Merton 1998).

Since the 1970s, theoretical developments, technological advances, modelling innovations and creation of new derivatives products have been proceeding at a remarkable rate. During this period, theory and practice have been shaping each other in a unique challenging and intense interaction. The rest of the article is, mainly, dedicated to this dimension.

## Theory and Practice in Derivatives Markets

The Black and Scholes model included various assumptions that are not valid in practice. Interest rates and volatilities are not constant, trading is not continuous, defaults occur and information is not complete. How did academic research and industry reality react to and handle these issues? Although priorities, needs and goals are very distinct, the shortcomings of the theory not only did not limit its applicability but prompted remarkable progress in both the theoretical and the applied worlds. Models were developed and innovative computational techniques were invented, and used in practice, for new complex products (exotics). Progress did not occur simultaneously. While theory developed mostly in bursts, practice continued to use basic models which often involved self-contradictory assumptions. However, despite internal modelling inconsistencies, industry applications offered valuable intuition and feedback to the abstract theoretical developments.

Consider, for example, the modelling of the term structure of interest rates. Heath, Jarrow and Morton (1992) developed an arbitrage-free framework for the dynamics of the instantaneous forward rates, *f*(*t*, *T*), defined by

$$ f\left(t,T\right)=-\frac{\partial }{\partial T}\log B\left(t,T\right), $$

where *B*(*t*, *T*) represents the price, at time *t*, of a zero-coupon discount bond with maturity *T*. To facilitate the analysis of the forward curve, Musiela (1993) introduced an alternative parametrization, namely, *r*(*t*, *x*) = *f*(*t*, *t* + *x*), which exhibited the importance of infinite dimensional diffusions and stochastic partial differential equations in finance. This helped to answer a number of practical questions related to the yield curve dynamics. Indeed, the issue of consistency between the yield curve construction and its evolution was resolved. Additionally, the support of the yield curve distribution has been studied, and the mean reversion – or, more mathematically, the stationarity – of the entire yield curve dynamics has been addressed.
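As a small numerical illustration of the forward-rate definition, *f*(0, *T*) can be recovered from a zero-coupon bond curve by differencing log prices. The flat 3% curve below is a stand-in assumption, not a market example:

```python
# Instantaneous forward rate f(0, T) = -d/dT log B(0, T), approximated by a
# central finite difference on a hypothetical discount-bond curve.
from math import exp, log

def B(T, y=0.03):
    """Stand-in zero-coupon bond curve: flat continuously compounded yield y."""
    return exp(-y * T)

def forward(T, h=1e-5):
    """Finite-difference approximation of f(0, T)."""
    return -(log(B(T + h)) - log(B(T - h))) / (2 * h)

f5 = forward(5.0)   # for the flat curve every forward rate equals the yield, 0.03
```

The Musiela parametrization *r*(0, *x*) is the same quantity read at a fixed time-to-maturity *x* rather than a fixed maturity date *T*.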

Clearly, the infinite dimensional analysis was useful in a study of the dynamics of the forward rates for all maturities. There was, however, still a problem that needed to be looked at, namely, that the forward rates *f*(*t*, *T*) are not traded in the market, whereas the Libor and swap rates are, together with options on them. Moreover, information contained in these option prices should be taken into account in the specification of the yield curve dynamics. Because the market trades caps and swaptions in terms of their Black and Scholes volatilities, it would be advantageous to develop a term structure model that is consistent with such practice, a task seen by many academics at that time as impossible for its apparent internal inconsistency.

In a series of papers by Miltersen, Sandmann, Sondermann, Brace, Gatarek, Musiela, Rutkowski and Jamshidian (see Part II of Musiela and Rutkowski 2005, for a detailed exposition of these works), a new modelling framework for term structure dynamics was put in place. The so-called Libor, also known as BGM (Brace–Gatarek–Musiela), and swap market models resolved the outstanding issue of the link between the traded instruments and the mathematical description of their dynamics. In essence, they provided a model-independent framework for the analysis of the interest rates dynamics when coupled with the advances – taking place in parallel – in the modelling of volatility smile dynamics. The latter issue is discussed next.

The Black and Scholes model assumes constant volatility and hence, within this model, a call option with arbitrary strike is priced with the same volatility. However, call options of different strikes are priced differently by the market which ‘allocates’ into the Black and Scholes formula a strike-dependent volatility generating the so-called volatility smile. This is clearly inconsistent with the assumption of the model. It turns out, however, that a complete collection of call prices, for all strikes and maturities, uniquely determines the one-dimensional distributions of the underlying forward price process, under a probability measure which should be interpreted as a forward measure to the option maturity. In a series of papers, Dupire (1993) shows how to construct martingale diffusions with a given set of one-dimensional distributions, demonstrating, once more, that the market practice is theoretically sound and internally consistent when analysed from the perspective of the appropriate model. The Black and Scholes model is used only to convert the quoted volatility into a price and it is no longer used for the pricing of vanilla options. Moreover, there are many ways of constructing martingales with a given set of one-dimensional marginals, and the question is not so much how to construct one but, rather, which one to choose and under which criteria. The important message here is that, again, one can now look at the problem in a completely model-independent way, provided all objects – namely, the underlying assets, the associated probability measures and the relevant market information – are correctly interpreted.

Obviously, the theory and practice, at least in the equity, foreign exchange and interest rates derivatives markets, have moved to a different level and reached a certain degree of maturity. Of course, important challenges remain but experience since the 1970s defines clearly a path to follow.

## Current Challenges and Perspectives

### Credit Risk

A fundamental assumption of the Black and Scholes model is that the underlying securities do not default. However, default is a realistic element of financial contracts and very relevant to any firm’s performance. Credit-linked instruments have, by now, become a central feature in derivatives markets. These are financial products that pay their holders amounts contingent on the occurrence of a default event, ranging from bankruptcy of a firm to failure to honour a financial agreement. Examples include, among others, credit default swaps (CDSs), collateralized debt obligations (CDOs) and tranches of indices. Their market has grown more than eightfold in recent years and, undoubtedly, credit risk is, today, one of the most active and challenging areas in quantitative finance.

There are various issues that make the problems in credit risk difficult, from both the modelling and the implementation point of view. The first challenge is how to model the time of default. In academic research, there are two well-established approaches, the structural and the reduced-form. In the structural models, it is postulated that the uncertainty related to default is exclusively generated by the firm’s value. Modelling default then amounts to building a good model for the company’s assets and determining when the latter will fall below existing liabilities. However, such default times are typically predictable, which is not only unrealistic but also difficult to implement due to limited public information about the firm’s prospects. At the other extreme, in the reduced-form models, the default time is associated with a point process with an exogenously given stochastic jump intensity. The intensity essentially measures the instantaneous likelihood of default. Reduced-form models are more tractable for pricing and calibration, but their default times are completely sudden (totally inaccessible), an unrealistic feature. Recently, efforts have been made to bridge the two approaches by incorporating the limited information investors might have about the firm’s value. This information-based approach is gradually emerging, but a number of serious modelling and technical issues remain to be tackled. See, among others, Bielecki and Rutkowski (2002) and Schönbucher (2003).

Even though the above models are theoretically sound, their practical implementation is so difficult that they are, effectively, inapplicable. The main problem stems from the high dimensionality and the inability to develop computational methods that track ‘name by name’ the valuation outputs. For this reason, the focus in the industry has shifted to an alternative direction centred on modelling the joint distribution of default times. An important development in this direction is the use of a copula function, a concept introduced in statistics by Sklar (1959). The aim is to define the joint distribution of a family of random variables when their individual marginal distributions are known. Such marginal distributions may frequently be recovered from the market, as is the case with CDSs, which yield implicit information on the underlying name’s default time. Today, the most widely used copula is the one-factor Gaussian copula, proposed by Li (2000). Its popularity lies in the ability to obtain the sensitivity, and thus information on hedging, of the derivative price in a name-by-name correspondence.
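A sketch of the one-factor Gaussian copula applied to default times. The exponential marginals and the parameter values are illustrative assumptions; only the copula construction itself follows the approach attributed above to Li (2000):

```python
# One-factor Gaussian copula for joint default times: each name's latent
# variable mixes a common (systemic) factor M with an idiosyncratic shock.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_names, n_sims, rho, lam = 5, 200_000, 0.3, 0.02

M = rng.standard_normal((n_sims, 1))              # common factor
eps = rng.standard_normal((n_sims, n_names))      # idiosyncratic factors
X = np.sqrt(rho) * M + np.sqrt(1 - rho) * eps     # correlated Gaussians
U = norm.cdf(X)                                   # uniforms coupled by the copula
tau = -np.log1p(-U) / lam                         # exponential default times

# Each marginal is untouched by the coupling: P(tau_i <= 5) = 1 - exp(-5*lam).
p5 = (tau[:, 0] <= 5.0).mean()
```

Dependence enters only through the correlation parameter `rho`; the CDS-implied marginals would replace the exponential assumption in practice.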

### Model Specification

As has been mentioned earlier, the theory has long departed from perfect replication, and practice never relied on it. Absence of arbitrage is the underlying pricing criterion in the derivatives market. However, a plethora of pricing issues and model specifications arise every day. Derivatives markets have been growing very rapidly, and high liquidity in vanilla options on a large number of underlyings including, among others, single stocks and equity indices, interest rates, foreign exchange and commodities, has been achieved. The users benefit from competitive prices, quoted at very tight spreads, for the protection they need. This, in itself, brings another challenge to the providers of such services and products, namely, the models that are currently under development need to reflect this liquidity before they can be used for the pricing of less liquid products. This process is known in the industry as model calibration. To a large extent, one can assume that the market gives the prices for simple derivatives like calls and puts and, hence, pricing considerations dissolve. However, more exotic options need to be priced and this must be done in a way consistent with the basic products (vanilla).
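The simplest instance of calibration is recovering the volatility that the market ‘allocates’ into the Black and Scholes formula for a quoted call price. A sketch using a root-finder; the quote below is synthetic, for illustration only:

```python
# Implied volatility: invert the Black–Scholes formula for a quoted call price.
from math import log, sqrt, exp
from statistics import NormalDist
from scipy.optimize import brentq

N = NormalDist().cdf

def bs_call(spot, strike, r, sigma, tau):
    d1 = (log(spot / strike) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    return spot * N(d1) - strike * exp(-r * tau) * N(d1 - sigma * sqrt(tau))

def implied_vol(price, spot, strike, r, tau):
    """The sigma for which the Black–Scholes price matches the quote."""
    return brentq(lambda s: bs_call(spot, strike, r, s, tau) - price, 1e-6, 5.0)

quote = bs_call(100.0, 110.0, 0.05, 0.25, 1.0)    # synthetic market quote
iv = implied_vol(quote, 100.0, 110.0, 0.05, 1.0)  # round-trips to 0.25
```

Repeating this across strikes produces the volatility smile the section discusses; a full calibration fits a model to the whole surface at once.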

To provide some intuition, consider the case of a so-called first-generation exotic, namely, a down-and-out call option. This is a barrier option that reduces to a simple call option when the likelihood of crossing the barrier is very small. Consequently, a model to price such an option must return the market price of a call in such a scenario. Call prices will be liquid for all strikes up until a certain maturity, say, 18 months or two years for currency options. However, there may be a need to price products with embedded currency options of very long maturity, up to 50 years in the dollar–yen exchange rate. In this case, a suitable model needs to be developed that accommodates short- and long-term issues. On one hand, the model must fit the short-dated foreign exchange (FX) calls and puts. On the other, it has to be consistent with the interest rate volatilities and must capture correctly the dependence structure between the dollar and yen interest rate curves, their volatilities and the spot FX.
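The claim that a down-and-out call collapses to a vanilla call when the barrier is unlikely to be hit can be checked by simulation; the barrier level and the other parameters below are illustrative:

```python
# Monte Carlo prices of a down-and-out call and a vanilla call on the same
# simulated paths; with a remote barrier the two should nearly coincide.
import numpy as np

rng = np.random.default_rng(2)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
barrier, n_paths, n_steps = 50.0, 50_000, 50
dt = T / n_steps

Z = rng.standard_normal((n_paths, n_steps))
increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
S = np.exp(np.log(S0) + np.cumsum(increments, axis=1))  # GBM paths

alive = S.min(axis=1) > barrier                 # paths that never hit the barrier
payoff = np.where(alive, np.maximum(S[:, -1] - K, 0.0), 0.0)
do_price = np.exp(-r * T) * payoff.mean()       # down-and-out call
vanilla = np.exp(-r * T) * np.maximum(S[:, -1] - K, 0.0).mean()
```

Moving the barrier close to the spot drives the two prices apart, which is exactly the regime where model choice starts to matter.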

A standard approach for solving such problems consists of writing a continuous-time model and trying to fit it to the liquid prices. This task is often very difficult to complete. Indeed, the more market information must be put into a model, the more complicated the model gets, the more difficult and time consuming the calibration procedure becomes, and the more time it takes to produce accurate prices and stable sensitivity reports. To a large extent, model calibration is identical to the specification of the one-dimensional distributions of the underlying process. Model specification, on the other hand, can be identified with the specification of an infinite dimensional copula function defining the joint distribution of the entire path, given the marginal distributions that can be deduced from the call prices. At this point, it is important to recall that option payoffs often depend solely on a finite dimensional distribution of the underlying process. Consequently, the need to specify the continuous-time dynamics remains valid only if one wants to link the concept of price with perfect replication of the payoff, a requirement that is, in any case, not met in practice.

Seen from this perspective, a new modelling path emerges, namely, one can take the marginals as given by the call prices and choose a copula function in such a way that the joint distribution is consistent with an arbitrage-free model. For example, if one wants to price a forward start option, the distributions of the underlying asset at two different dates are given. Then, only the joint distribution needs to be specified but in such a way that the martingale property is preserved. Clearly, there is an infinite number of ways to build such a martingale, and the choice should be based on additional information – for example, not on the smile as seen today but on the assumptions one might want to make about the smile dynamics.

### Risk Measures

As was previously discussed, absence of arbitrage is the fundamental ingredient in derivative pricing. Absence of perfect replication remains, however, a major issue and dictates the creation of financial reserves. To this effect, regulatory policies have been in place for a few years now.

These requirements prompted the axiomatic analysis of the so-called risk measures, which are nonlinear indices yielding the capital requirement of financial positions. The theory of coherent risk measures was proposed by Artzner et al. (1999). A popular risk measure is the ‘value at risk’, which, despite its widespread use, neither promotes diversification nor measures large losses accurately. Since the mid-1990s a substantial research effort has been invested in further developing the theory. Relaxing a scaling assumption in the coherent case has led to the development of convex risk measures. The next step has been the axiomatic construction of dynamic risk measures that are time consistent, an indispensable property of any pricing system.
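The criticism of value at risk can be made concrete: VaR is a quantile and is blind to how severe losses are beyond it, whereas expected shortfall (a coherent alternative) averages the tail. A sketch on a simulated heavy-tailed loss distribution (the Student-t losses are an illustrative assumption):

```python
# Value at risk vs. expected shortfall on heavy-tailed losses: VaR reports the
# 99% quantile only; ES averages everything beyond it and so sees the tail.
import numpy as np

rng = np.random.default_rng(3)
losses = rng.standard_t(df=3, size=500_000)     # heavy-tailed loss sample

alpha = 0.99
var = np.quantile(losses, alpha)                # 99% value at risk
es = losses[losses >= var].mean()               # 99% expected shortfall
```

For heavy tails the gap between the two is large, which is one reason the axiomatic literature moved beyond VaR.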

## Bibliography

- Artzner, P., F. Delbaen, J.M. Eber, and D. Heath. 1999. Coherent measures of risk. *Mathematical Finance* 9: 203–228.
- Bachelier, L. 1900. Théorie de la spéculation. Ph.D. dissertation, L’École Normale Supérieure. English translation in *The Random Character of Stock Market Prices*, ed. P.H. Cootner. Cambridge, MA: MIT Press.
- Bielecki, T.R., and M. Rutkowski. 2002. *Credit risk: Modeling, valuation and hedging*. Berlin: Springer.
- Bjork, T. 2004. *Arbitrage theory in continuous time*. 2nd ed. Oxford: Oxford University Press.
- Black, F., and M. Scholes. 1973. The pricing of options and corporate liabilities. *Journal of Political Economy* 81: 637–654.
- Delbaen, F., and W. Schachermayer. 2006. *The mathematics of arbitrage*. Berlin: Springer.
- Dupire, B. 1993. Pricing and hedging with a smile. *Journées Internationales de Finance*. La Baule: IGR–AFFI.
- Harrison, M., and D.M. Kreps. 1979. Martingales and arbitrage in multiperiod securities markets. *Journal of Economic Theory* 20: 381–408.
- Harrison, M., and S. Pliska. 1981. Martingales and stochastic integrals in the theory of continuous trading. *Stochastic Processes and their Applications* 11: 215–260.
- Heath, D., R.A. Jarrow, and A. Morton. 1992. Bond pricing and the term structure of interest rates: A new methodology for contingent claim valuation. *Econometrica* 60: 77–105.
- Li, D.X. 2000. On default correlation: A copula function approach. *Journal of Fixed Income* 9: 43–54.
- Merton, R. 1973. Theory of rational option pricing. *Bell Journal of Economics and Management Science* 4: 141–183.
- Merton, R. 1998. Applications of option-pricing theory: Twenty-five years later. *American Economic Review* 88: 323–349.
- Musiela, M. 1993. Stochastic PDEs and term structure models. *Journées Internationales de Finance*. La Baule: IGR–AFFI.
- Musiela, M., and M. Rutkowski. 2005. *Martingale methods in financial modelling*. 2nd ed. Berlin: Springer.
- Schönbucher, P.J. 2003. *Credit derivatives pricing models: Model, pricing and implementation*. Chichester: Wiley.
- Sklar, A. 1959. Fonctions de répartition à n dimensions et leurs marges. *Publications de l’Institut de Statistique de l’Université de Paris* 8: 229–231.