
The randomized complexity of maintaining the minimum

  • Gerth Stølting Brodal
  • Shiva Chaudhuri
  • Jaikumar Radhakrishnan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1097)

Abstract

The complexity of maintaining a set under the operations Insert, Delete and FindMin is considered. In the comparison model it is shown that any randomized algorithm with expected amortized cost t comparisons per Insert and Delete has expected cost at least n/(e·2^(2t)) − 1 comparisons for FindMin. If FindMin is replaced by a weaker operation, FindAny, then a randomized algorithm with constant expected cost per operation exists, whereas no deterministic algorithm can achieve this. Finally, a deterministic algorithm with constant amortized cost per operation is given for an offline version of the problem.
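To see what the trade-off says concretely: with t = O(1) comparisons per update the bound forces FindMin to cost Ω(n) expected comparisons, while a binary heap spends t = Θ(log n) comparisons per update and answers FindMin in constant time. The sketch below is not from the paper; it only illustrates these two classical endpoints of the trade-off, and the class names LazySet and HeapSet are hypothetical.

    # Illustrative sketch only (not the paper's algorithms): two standard
    # comparison-based structures at opposite ends of the trade-off that the
    # lower bound n/(e*2^(2t)) - 1 quantifies.
    import heapq

    class LazySet:
        """No comparisons on Insert/Delete (t = O(1)); FindMin scans all n elements."""
        def __init__(self):
            self.items = set()
        def insert(self, x):
            self.items.add(x)       # no element comparisons
        def delete(self, x):
            self.items.discard(x)   # no element comparisons
        def find_min(self):
            return min(self.items)  # Theta(n) comparisons, matching the bound for constant t

    class HeapSet:
        """Binary heap: Theta(log n) amortized comparisons per update, O(1) FindMin.
        Assumes distinct keys; deletions are handled lazily."""
        def __init__(self):
            self.heap = []
            self.deleted = set()
        def insert(self, x):
            heapq.heappush(self.heap, x)
        def delete(self, x):
            self.deleted.add(x)     # lazy deletion; cost is charged to later pops
        def find_min(self):
            while self.heap and self.heap[0] in self.deleted:
                self.deleted.discard(heapq.heappop(self.heap))
            return self.heap[0] if self.heap else None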

Keywords

Deterministic Algorithm, Coin Toss, Operation Insert, Occupancy Tree, Adversary Strategy



Copyright information

© Springer-Verlag Berlin Heidelberg 1996

Authors and Affiliations

  • Gerth Stølting Brodal: BRICS, Computer Science Department, Aarhus University, Århus C, Denmark
  • Shiva Chaudhuri: Max-Planck-Institut für Informatik, Im Stadtwald, Saarbrücken, Germany
  • Jaikumar Radhakrishnan: Tata Institute of Fundamental Research, Mumbai, India
