Minimax Approach

Chapter in: Bandit problems

Part of the book series: Monographs on Statistics and Applied Probability (MSAP)

Abstract

A bandit problem is interesting only if there are arms with unknown characteristics. To choose among the available arms, a decision maker must first decide how to handle this uncertainty. In the first eight chapters of this monograph the approach used is to average the payoff over the unknown characteristics with respect to a specified prior distribution; in statistical parlance, this is a Bayesian approach.
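
As a rough illustration of the contrast this chapter takes up, and not of any procedure from the book, the sketch below compares the two criteria on a two-armed Bernoulli bandit in which one arm's success probability is known and the other's is unknown. The strategy (a simple "explore then commit" rule), the horizon, and the numerical values are all assumptions made for the example: the Bayesian figure averages the strategy's expected payoff over a uniform prior on the unknown parameter, while the worst-case figure, in the minimax spirit, takes the minimum over the same parameter grid.

```python
# Illustrative sketch only (not from Berry and Fristedt): Bayesian versus
# worst-case evaluation of a fixed strategy on a two-armed Bernoulli bandit.
# One arm has known success probability known_p; the other has unknown
# success probability theta. All parameter values are assumptions.

import random

random.seed(0)  # reproducible Monte Carlo estimates

def expected_payoff(theta, known_p=0.5, horizon=20, explore_pulls=5, trials=2000):
    """Monte Carlo estimate of the total expected payoff of a simple
    'pull the unknown arm a few times, then commit' strategy, given the
    unknown arm's true success probability theta."""
    total = 0.0
    for _ in range(trials):
        payoff = 0
        successes = 0
        # Exploration: pull the unknown arm a fixed number of times.
        for _ in range(explore_pulls):
            if random.random() < theta:
                payoff += 1
                successes += 1
        # Commitment: keep the unknown arm if its observed rate beats the
        # known arm, otherwise switch to the known arm for the rest.
        committed_p = theta if successes / explore_pulls >= known_p else known_p
        for _ in range(horizon - explore_pulls):
            if random.random() < committed_p:
                payoff += 1
        total += payoff
    return total / trials

# Evaluate the strategy on a grid of possible values of the unknown parameter.
thetas = [i / 20 for i in range(21)]
values = [expected_payoff(t) for t in thetas]

bayes_value = sum(values) / len(values)  # average payoff w.r.t. a uniform prior
worst_case = min(values)                 # worst case over the parameter grid

print(f"Bayes value under a uniform prior: {bayes_value:.2f}")
print(f"Worst-case value over the grid:    {worst_case:.2f}")
```

A strategy chosen to make the first number large need not do well on the second, which is the tension between averaging over a prior and guarding against the worst case that motivates a minimax treatment.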

Copyright information

© 1985 D. A. Berry and B. Fristedt

About this chapter

Cite this chapter

Berry, D.A., Fristedt, B. (1985). Minimax Approach. In: Bandit problems. Monographs on Statistics and Applied Probability. Springer, Dordrecht. https://doi.org/10.1007/978-94-015-3711-7_9

  • DOI: https://doi.org/10.1007/978-94-015-3711-7_9

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-015-3713-1

  • Online ISBN: 978-94-015-3711-7
