A generalization of binary search
- A family of deterministic algorithms that minimizes the worst-case number of function evaluations needed to solve the (n, m)-problem;
- A deterministic algorithm that comes within one step of minimizing the worst-case number of parallel steps required to solve the (n, m)-problem, where a given number p of concurrent function evaluations may be performed in each parallel step; this result requires that p ≤ m;
- A deterministic algorithm that minimizes the expected number of function evaluations when the function f is drawn from a probability distribution satisfying a natural symmetry property;
- A randomized algorithm that minimizes the worst-case expected number of function evaluations required to solve the (n, 1)-problem;
- Lower and upper bounds on the worst-case expected number of function evaluations required by a randomized algorithm to solve the (n, m)-problem for m > 1.
All the algorithms presented in the paper are extremely simple.
The (n, m)-problem is equivalent to the following natural search problem: given a table consisting of n entries in increasing order, and given keys x1 < x2 < ... < xm, determine which of the given keys lie in the table. It is easily seen that the worst-case number of table entries that must be inspected in the search problem is equal to the worst-case number of function evaluations needed to solve the (n, m)-problem.
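The search-problem formulation can be illustrated by a simple divide-and-conquer generalization of binary search: probe the middle table entry, split the sorted keys around the probed value, and recurse on each half. This is a hedged sketch for intuition only, not the worst-case-optimal algorithm of the paper; the function name `keys_in_table` and its structure are illustrative assumptions.

```python
import bisect

def keys_in_table(table, keys):
    """Return the subset of sorted `keys` that appear in the sorted
    `table`, probing table entries one at a time.  Illustrative
    divide-and-conquer sketch, not the paper's optimal algorithm."""
    result = set()

    def search(lo, hi, ks):
        # Invariant: every key in ks, if present, lies in table[lo:hi].
        if not ks or lo >= hi:
            return
        mid = (lo + hi) // 2
        probe = table[mid]                  # one table inspection
        split = bisect.bisect_left(ks, probe)
        if split < len(ks) and ks[split] == probe:
            result.add(probe)               # key found at the probe
            search(lo, mid, ks[:split])
            search(mid + 1, hi, ks[split + 1:])
        else:
            search(lo, mid, ks[:split])
            search(mid + 1, hi, ks[split:])

    search(0, len(table), sorted(keys))
    return result
```

Each recursive call inspects one table entry and partitions the remaining keys between the two subtables, mirroring how one probe of the table narrows the candidate range for every key simultaneously.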