Online Compression Caching

Conference paper

Algorithm Theory – SWAT 2008 (SWAT 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5124)


Abstract

Motivated by the possibility of storing a file in a compressed format, we formulate the following class of compression caching problems. We are given a cache with a specified capacity, a certain number of compression/uncompression algorithms, and a set of files, each of which can be cached as it is or by applying one of the compression algorithms. Each compressed format of a file is specified by three parameters: encode cost, decode cost, and size. The miss penalty of a file is the cost of accessing the file if the file or any compressed format of the file is not present in the cache. The goal of a compression caching algorithm is to minimize the total cost of executing a given sequence of requests for files. We say an online algorithm is resource competitive if the algorithm is constant competitive with a constant factor resource advantage. A well-known result in the framework of competitive analysis states that the least-recently used (LRU) algorithm is resource competitive for the traditional paging problem. Since compression caching generalizes the traditional paging problem, it is natural to ask whether a resource competitive online algorithm exists for compression caching. In this work, we address three problems in the class of compression caching. The first problem assumes that the encode cost and decode cost associated with any format of a file are equal. For this problem we present a resource competitive online algorithm. To explore the existence of resource competitive online algorithms for compression caching with arbitrary encode costs and decode costs, we address two other natural problems in the aforementioned class, and for each of these problems, we show that there exists a non-constant lower bound on the competitive ratio of any online algorithm, even if the algorithm is given an arbitrary factor capacity blowup. Thus, we establish that there is no resource competitive algorithm for compression caching in its full generality.
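As a reading aid (not part of the paper), the problem instance and cost accounting sketched in the abstract can be modeled roughly as follows. The Python below is a minimal sketch; the names Format, File, total_cost, and policy, as well as the exact moments at which encode and decode costs are charged, are assumptions made for illustration and may differ from the paper's formal model.

```python
# Illustrative model of the compression caching problem described in the
# abstract. The precise cost accounting (when encode and decode costs are
# charged, and what the cache may rearrange after a request) is an assumed
# reading of the abstract, not the paper's formal definition.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass(frozen=True)
class Format:
    """One way to store a file: uncompressed, or via one compression algorithm."""
    size: int         # cache space this format occupies
    encode_cost: int  # cost to produce this format (0 for the uncompressed copy)
    decode_cost: int  # cost to serve a request from this format (0 if uncompressed)


@dataclass(frozen=True)
class File:
    name: str
    miss_penalty: int      # cost of accessing the file when no format of it is cached
    formats: List[Format]  # formats[0] is taken to be the uncompressed copy


def total_cost(requests: List[str],
               files: Dict[str, File],
               capacity: int,
               policy: Callable[[Dict[str, int], File], Dict[str, int]]) -> int:
    """Replay a request sequence and tally the cost incurred by an online policy.

    `policy(cache, requested_file)` is any online rule: given the current cache
    contents (file name -> index of the cached format), it returns the new
    cache contents after serving the request.
    """
    cache: Dict[str, int] = {}
    cost = 0
    for name in requests:
        f = files[name]
        if name in cache:
            cost += f.formats[cache[name]].decode_cost   # hit on some cached format
        else:
            cost += f.miss_penalty                       # miss: access the file outside the cache
        old = dict(cache)
        cache = policy(old, f)
        # Charge the encode cost whenever a file's cached format changes
        # (including when a file is first brought into the cache).
        for n, idx in cache.items():
            if old.get(n) != idx:
                cost += files[n].formats[idx].encode_cost
        used = sum(files[n].formats[i].size for n, i in cache.items())
        assert used <= capacity, "policy exceeded the cache capacity"
    return cost
```

With a single uncompressed format per file, unit sizes, and uniform miss penalties, this collapses to the classical paging problem, which is why LRU's resource competitiveness (constant competitive given a constant-factor capacity advantage) is the natural benchmark invoked in the abstract. The paper's positive result concerns the special case in which each format's encode and decode costs are equal; the lower bounds concern the general case with arbitrary encode and decode costs.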

Author information

Authors

C.G. Plaxton, Y. Sun, M. Tiwari, H. Vin
Editor information

Joachim Gudmundsson

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Plaxton, C.G., Sun, Y., Tiwari, M., Vin, H. (2008). Online Compression Caching. In: Gudmundsson, J. (eds) Algorithm Theory – SWAT 2008. SWAT 2008. Lecture Notes in Computer Science, vol 5124. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69903-3_37

  • DOI: https://doi.org/10.1007/978-3-540-69903-3_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69900-2

  • Online ISBN: 978-3-540-69903-3

  • eBook Packages: Computer Science, Computer Science (R0)
