
A note on strategies for bandit problems with infinitely many arms


Abstract.

A bandit problem consisting of a sequence of n choices (n→∞) from infinitely many Bernoulli arms is considered. The parameters of the Bernoulli arms are independent and identically distributed random variables from a common distribution F on the interval [0,1], where F is continuous with F(0)=0 and F(1)=1. The goal is to investigate the asymptotic expected failure rates of k-failure strategies and to obtain a lower bound for the expected failure proportion over all strategies, as presented in Berry et al. (1997). We show that the asymptotic expected failure rates of k-failure strategies when 0<b≤1, as well as this lower bound, can be evaluated whenever the limit of the ratio (F(1)−F(t))/(1−t)^b exists as t→1 for some b>0.
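To make the setting concrete, here is a minimal simulation sketch; it is not taken from the paper. It assumes the usual reading of a k-failure strategy in Berry et al. (1997): the current arm is played until it has produced k failures and is then discarded in favor of a fresh arm, with no recall. The default choice of F is the uniform distribution on [0,1]; the function name simulate_k_failure and its parameters are illustrative only.

import random

def simulate_k_failure(n, k, sample_param=random.random, rng=random):
    # Simulate n pulls of a k-failure strategy on infinitely many Bernoulli
    # arms whose success parameters are i.i.d. draws from F (sample_param;
    # the default corresponds to F uniform on [0, 1]).  The current arm is
    # played until it has produced k failures, then a fresh arm is drawn.
    failures_total = 0
    p = sample_param()      # success probability of the current arm
    arm_failures = 0        # failures observed on the current arm
    for _ in range(n):
        if rng.random() >= p:           # a failure occurs
            failures_total += 1
            arm_failures += 1
            if arm_failures == k:       # discard the arm after its k-th failure
                p = sample_param()
                arm_failures = 0
    return failures_total / n           # observed failure proportion

# Example: estimate the expected failure proportion of the 1-failure
# strategy with F uniform on [0, 1] and n = 10,000 pulls.
estimate = sum(simulate_k_failure(10_000, k=1) for _ in range(20)) / 20
print(f"estimated expected failure proportion: {estimate:.4f}")

Other choices of F can be supplied through sample_param; for instance, sample_param=lambda: random.random() ** 0.5 draws arm parameters from F(t) = t^2, for which (F(1)−F(t))/(1−t) → 2 as t→1, i.e. the abstract's condition holds with b = 1.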



Cite this article

Chen, K.Y., Lin, C.T. A note on strategies for bandit problems with infinitely many arms. Metrika 59, 193–203 (2004). https://doi.org/10.1007/s001840300279


