A generalization of the mutual fund theorem

Abstract.

A generalization of the continuous-time mutual fund theorem is given, with no assumptions made on the investors' utility functions for consumption and final wealth, except that they are time-additive and non-decreasing. The extension rests on a new mathematical approach that uses only elementary properties of diffusion processes and standard linear algebra. The results are given for complete as well as certain incomplete markets. Moreover, optimal investment strategies that are known only for complete markets with a single risky asset are automatically extended to complete and incomplete markets with multiple risky assets. An example is given.
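For context, the classical constant-coefficient version of the theorem that this result generalizes can be stated in a few lines; the symbols below ($r$, $\mu$, $\sigma$, $\Sigma$, $\pi_t$) are standard notation introduced here for illustration and are not taken from the paper itself. With a risk-free asset and risky assets following
\[
dS^0_t = r S^0_t\,dt, \qquad dS_t = \operatorname{diag}(S_t)\bigl(\mu\,dt + \sigma\,dW_t\bigr), \qquad \Sigma = \sigma\sigma^{\top},
\]
an investor maximizing expected utility holds an optimal vector of wealth fractions in the risky assets of the form
\[
\pi_t = \lambda_t\,\Sigma^{-1}(\mu - r\mathbf{1}),
\]
where $\lambda_t$ is a scalar depending only on the investor's preferences. Since only $\lambda_t$ differs across investors, every investor combines the risk-free asset with the same "mutual fund" whose risky-asset weights are proportional to $\Sigma^{-1}(\mu - r\mathbf{1})$.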



Additional information

Manuscript received: September 1997; final version received: April 1998



Cite this article

Khanna, A., Kulldorff, M. A generalization of the mutual fund theorem. Finance Stochast 3, 167–185 (1999). https://doi.org/10.1007/s007800050056


  • Key words: portfolio selection, continuous time, separation theorem, reduction method, incomplete markets
  • JEL Classification: D52, D81, G11
  • Mathematics Subject Classification (1991): 49K45, 60G15, 90A09, 93E20