Large HPC installations today also include large data storage installations. Data compression can significantly reduce the amount of data to be stored, and our first goal was to find out how much compression can achieve for climate data. The price of compression is, of course, the need for additional computational resources, so our second goal was to relate the savings achieved by compression to the costs it incurs.
In this paper we present the results of our analysis of typical climate data. Based on these insights, we develop a lossless compression algorithm and compare its compression ratio to that of standard compression tools. As it turns out, the algorithm is general enough to be useful for a large class of scientific data, which is why we refer to MAFISC as a method for scientific data compression. We also identify a numeric problem affecting lossless compression of scientific data and propose a possible solution. Finally, we discuss the economics of data compression in HPC environments using the example of the German Climate Computing Center.
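To illustrate why preprocessing filters matter for lossless compression of scientific data, here is a minimal sketch of one well-known filter of this kind: byte shuffling (as in HDF5's shuffle filter), which regroups the bytes of an array of floating-point values so that a general-purpose compressor sees long runs of similar bytes. This is an illustrative example on synthetic smooth data, not the MAFISC algorithm itself.

```python
import math
import struct
import zlib

def shuffle(data: bytes, width: int) -> bytes:
    """Byte-transpose an array of fixed-width values: gather byte 0 of
    every value, then byte 1 of every value, and so on. Smoothly varying
    data then yields long runs of near-identical high-order bytes."""
    return bytes(data[i + j]
                 for j in range(width)
                 for i in range(0, len(data), width))

# Synthetic "climate-like" field: a smooth signal stored as IEEE-754 doubles.
values = [math.sin(i / 50.0) for i in range(10000)]
raw = struct.pack(f"{len(values)}d", *values)

# Compression ratio = original size / compressed size (higher is better).
plain = len(raw) / len(zlib.compress(raw, 9))
shuf = len(raw) / len(zlib.compress(shuffle(raw, 8), 9))
print(f"ratio without shuffle: {plain:.2f}")
print(f"ratio with shuffle:    {shuf:.2f}")
```

On data like this, the shuffled stream compresses noticeably better than the raw byte stream, because the sign, exponent, and high mantissa bytes of neighboring values are nearly identical once they are grouped together.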
Keywords: Data compression · NetCDF · HDF5