
Lossy Scientific Data Compression With SPERR

<p>Much of the research in lossy data compression has focused on minimizing the average error for a given storage budget. For scientific applications, the maximum point-wise error is often of greater interest than the average error. This paper introduces an algorithm that encodes outliers—data points exceeding a specified point-wise error tolerance—produced by a lossy compression algorithm optimized for minimizing average error. These outliers can then be corrected to be within the error tolerance when decoding. We pair this outlier coding algorithm with an in-house implementation of SPECK, a lossy compression algorithm based on wavelets that exhibits excellent rate-distortion performance (where distortion is measured by the average error), and introduce a new lossy compression product that we call SPERR. Compared to two leading scientific data compressors, SPERR uses less storage to guarantee an error bound and produces better overall rate-distortion curves at a moderate cost of added computation. Finally, SPERR facilitates interactive data exploration by exploiting the multiresolution properties of wavelets and their ability to reconstruct coarsened data volumes on the fly.</p>
Copernicus GmbH
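The outlier-coding idea in the abstract can be illustrated with a minimal sketch: run any lossy compressor, find the points whose reconstruction error exceeds the tolerance, and store corrections for just those points so the decoder can bring them back within bound. This is an assumption-laden toy, not SPERR's actual SPECK-based coder; the quantizer below merely stands in for the wavelet stage, and for simplicity the sketch stores full residuals (making outliers exact), whereas SPERR only needs to correct them to within the tolerance.

```python
import numpy as np

def encode_with_outliers(data, lossy_round_trip, tol):
    """Run a lossy compress/decompress round trip, then record
    corrections for every point whose error exceeds `tol`."""
    approx = lossy_round_trip(data)
    residual = data - approx
    idx = np.flatnonzero(np.abs(residual) > tol)
    # Store only the outliers: flat indices plus their residuals.
    corrections = (idx, residual[idx])
    return approx, corrections

def decode_with_outliers(approx, corrections):
    out = approx.copy()
    idx, resid = corrections
    out[idx] += resid  # outliers now reconstructed exactly
    return out

# Toy lossy stage: coarse uniform quantization. In SPERR this role
# is played by the wavelet-based SPECK coder (hypothetical stand-in).
def quantize(x, step=0.25):
    return np.round(x / step) * step

rng = np.random.default_rng(0)
data = rng.normal(size=1000)
tol = 0.1
approx, corr = encode_with_outliers(data, quantize, tol)
recon = decode_with_outliers(approx, corr)
assert np.max(np.abs(data - recon)) <= tol  # point-wise bound holds
```

Note the storage trade-off this sketch makes explicit: the lossy stage is tuned for average error, and the guaranteed point-wise bound costs only the few (index, correction) pairs for points that exceed the tolerance.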

Related Results

Practical notes on lossy compression of scientific data
<p>Lossy compression methods are extremely efficient in terms of space and performance and allow for reduction of network bandwidth and disk space needed to store dat...
Survey on Various Image Compression Techniques Used in Image Processing to Improve the Quality of Image
This paper presents a study of assorted lossy compression techniques. The two techniques are Wavelet Difference Reduction (WDR)-based compression and Singular Value Decomposition (SVD)...
Differential Diagnosis of Neurogenic Thoracic Outlet Syndrome: A Review
Abstract Thoracic outlet syndrome (TOS) is a complex and often overlooked condition caused by the compression of neurovascular structures as they pass through the thoracic outlet. ...
A note on precision-preserving compression of scientific data
Abstract. Lossy compression of scientific data arrays is a powerful tool to save network bandwidth and storage space. Properly applied lossy compression can reduce the size of a da...
Continuous Leakage Resilient Lossy Trapdoor Functions
Lossy trapdoor functions (LTFs) were first introduced by Peikert and Waters (STOC’08). Since their introduction, lossy trapdoor functions have found numerous applications. They can...
Improving the performance of 3D image model compression based on optimized DEFLATE algorithm
Abstract. This study focuses on optimizing and designing the Delayed-Fix-Later Awaiting Transmission Encoding (DEFLATE) algorithm to enhance its compression performance and reduce th...
Lossless Compression Method for Medical Image Sequences Using Super-Spatial Structure Prediction and Inter-frame Coding
Space research organizations, hospitals and military air surveillance activities, among others, produce a huge amount of data in the form of images hence a large storage space is r...
Millimetre‐wave broadband waveguide‐based power combiner using lossy planar lines
A lossy waveguide‐based power combiner is proposed for broadband low‐loss symmetrically combining with good port matching and isolation at millimetre‐wave frequencies. In the combi...
