Optimal prefetching via data compression
Caching and prefetching are important mechanisms for speeding up access time to data on secondary storage. Recent work in competitive online algorithms has uncovered several promising new algorithms for caching. In this paper, we apply a form of the competitive philosophy for the first time to the problem of prefetching to develop an optimal universal prefetcher in terms of fault rate, with particular applications to large-scale databases and hypertext systems. Our prediction algorithms for prefetching are novel in that they are based on data compression techniques that are both theoretically optimal and good in practice. Intuitively, in order to compress data effectively, one has to be able to predict future data well, and thus good data compressors should also predict well for purposes of prefetching. We show for powerful models such as Markov sources and mth-order Markov sources that the page fault rate incurred by our prefetching algorithms is optimal in the limit for almost all sequences of page requests.
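The core intuition above is that a model good at predicting the next symbol for compression is equally usable for prefetching. As an illustrative sketch only (not the paper's algorithm), the following first-order Markov predictor counts page-to-page transitions in the request stream and prefetches the k most likely successors of the current page; the names `MarkovPrefetcher`, `access`, and `predict` are hypothetical:

```python
from collections import defaultdict, Counter

class MarkovPrefetcher:
    """First-order Markov predictor: counts observed page-to-page
    transitions and prefetches the k most frequent successors of
    the most recently requested page."""

    def __init__(self, k=1):
        self.k = k
        self.transitions = defaultdict(Counter)  # page -> Counter of next pages
        self.prev = None                         # last page requested

    def predict(self):
        """Return the set of up to k pages to prefetch next."""
        if self.prev is None:
            return set()
        return {p for p, _ in self.transitions[self.prev].most_common(self.k)}

    def access(self, page):
        """Process one page request. Returns True if the request was
        covered by the prefetch set predicted before it was seen
        (i.e., no page fault under this toy model)."""
        hit = page in self.predict()
        if self.prev is not None:
            self.transitions[self.prev][page] += 1
        self.prev = page
        return hit

# A repeating request pattern: once the model has seen each transition,
# every subsequent request is anticipated and the fault rate drops.
stream = [1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]
pf = MarkovPrefetcher(k=1)
hits = [pf.access(p) for p in stream]
```

On this stream the first pass through the pattern faults on every request; afterwards every request is predicted, matching the paper's theme that the fault rate converges toward optimal as the model learns the source.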