Repository contents:

- Error-Controlled_Lossy_Compression_Optimized_for_High_Compression_Ratios_of_Scientific_Datasets.pdf
- README.md
- Significantly_Improving_Lossy_Compression_for_Scientific_Data_Sets_Based_on_Multidimensional_Prediction_and_Error-Controlled_Quantization.pdf
- data-compression.pdf
- fast_error_bounded_Lossy_hpc_data_compression_with_sz.pdf
- fixed-rate_compressed_floating_point_arrays.pdf
- fpc_a_high_speed_compressor_for_double_precision_floating_point_data.pdf
This paper surveys a variety of data compression methods spanning almost 40 years of research, from the work of Shannon, Fano, and Huffman in the 1940s to a technique developed in 1986.
Scientific Data Compression
This is the first version of SZ. In this paper, SZ achieves data reduction by predicting each data point from its preceding values with best-fit curve-fitting models and encoding the residual within a user-specified error bound.
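The core mechanism can be illustrated with a minimal sketch (not SZ's actual implementation): predict each value from its already-reconstructed predecessor, then quantize the residual so the reconstruction error never exceeds the user's absolute error bound. The function names here are assumptions for illustration.

```python
def quantize_1d(data, abs_err):
    """Return integer codes; reconstruction error stays within abs_err."""
    codes = []
    prev = 0.0  # predictor: the last *reconstructed* value, which keeps the error bounded
    for x in data:
        residual = x - prev
        # round the residual to the nearest multiple of 2*abs_err
        code = round(residual / (2 * abs_err))
        codes.append(code)
        prev = prev + code * 2 * abs_err  # mirror what the decompressor will do
    return codes

def dequantize_1d(codes, abs_err):
    """Reconstruct the data from the integer codes."""
    out = []
    prev = 0.0
    for code in codes:
        prev = prev + code * 2 * abs_err
        out.append(prev)
    return out
```

The compact integer codes can then be entropy-coded; because the predictor uses reconstructed values, compressor and decompressor stay in lockstep and the absolute error bound holds for every point.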
This work is known as SZ-1.4. Here SZ adopts multi-dimensional data prediction, so data with more than one dimension are no longer linearized into a one-dimensional array before compression. This preserves more spatial locality and thus improves the compression ratio.
This work is known as SZ-2.0. The authors propose an online selection mechanism between two predictors: the mean-integrated Lorenzo predictor and a linear-regression-based predictor. For each data block, the predictor with the higher expected prediction accuracy, and hence the larger compression ratio, is selected.
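Per-block predictor selection can be sketched as follows (a simplified illustration with assumed names, not SZ-2.0's actual API or its sampling strategy): estimate each candidate's total prediction error on a block and keep the predictor that errs least.

```python
def select_predictor(block, predictors):
    """Return the name of the predictor with the lowest total |error| on the block."""
    best_name, best_err = None, float("inf")
    for name, predict in predictors.items():
        err = sum(abs(x - predict(i, block)) for i, x in enumerate(block))
        if err < best_err:
            best_name, best_err = name, err
    return best_name

# Two toy candidate predictors (for illustration only):
predictors = {
    # 1D "Lorenzo": predict each value from its predecessor
    "lorenzo": lambda i, block: block[i - 1] if i > 0 else 0.0,
    # block mean: predict every value with the block average
    "mean": lambda i, block: sum(block) / len(block),
}
```

A linearly increasing block favors the neighbor-based predictor, while a near-constant block favors the mean-based one, mirroring why SZ-2.0 switches between Lorenzo and regression per region.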