Modified Huffman Code for Bandwidth Optimization Through Lossless Compression

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 738)

Abstract

In the interest of minimizing bandwidth usage, a modified Huffman code structure is proposed, together with an accompanying algorithm, to achieve strong lossless compression ratios while keeping compression and decompression fast. This matters because internet bandwidth usage grows substantially each year, and existing compression models are either too slow or not efficient enough. We then implement this data structure and algorithm for English text compression and discuss its application to other data types. We conclude that if this algorithm were adopted by browsers and web servers, bandwidth usage could be reduced significantly, resulting in lower costs and a faster internet.
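For context, the sketch below shows the classic Huffman construction (frequency-driven priority queue, prefix codes) applied to English text, which is the baseline the paper modifies. It is a minimal illustrative Python implementation only; the modified code structure and algorithm proposed in the paper are not reproduced here, and all function names are our own.

```python
import heapq
from collections import Counter

def build_huffman_codes(text):
    """Build a prefix-code table from symbol frequencies (classic Huffman)."""
    # Each heap entry is (frequency, tie_breaker, subtree); a subtree is either
    # a single symbol or a (left, right) pair. The tie_breaker keeps tuple
    # comparison well defined when frequencies are equal.
    heap = [(freq, i, sym) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf symbol
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

def decode(bits, codes):
    # Prefix-free codes allow greedy left-to-right decoding.
    inverse = {v: k for k, v in codes.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

if __name__ == "__main__":
    sample = "this is an example of huffman coding"
    table = build_huffman_codes(sample)
    bits = encode(sample, table)
    assert decode(bits, table) == sample
    print(f"{len(sample) * 8} bits raw -> {len(bits)} bits encoded")
```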

Keywords

Compression · Bandwidth · Internet · Lossless · Space optimization

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Trinity University, San Antonio, USA