### Optimal prefetching via data compression

Access Restriction: Subscribed

Author: Vitter, Jeffrey Scott ♦ Krishnan, P.
Source: ACM Digital Library
Content Type: Text
Publisher: Association for Computing Machinery (ACM)
File Format: PDF
Copyright Year: ©1996
Language: English
Subject Domain (in DDC): Computer science, information & general works ♦ Data processing & computer science
Subject Keywords: Markov source ♦ Caching ♦ Competitive analysis ♦ Data compression ♦ Databases ♦ Fault rate ♦ Hypertext ♦ Prediction ♦ Prefetching ♦ Secondary storage ♦ Universal prefetcher

Abstract: Caching and prefetching are important mechanisms for speeding up access time to data on secondary storage. Recent work in competitive online algorithms has uncovered several promising new algorithms for caching. In this paper, we apply a form of the competitive philosophy for the first time to the problem of prefetching to develop an optimal universal prefetcher in terms of fault rate, with particular applications to large-scale databases and hypertext systems. Our prediction algorithms for prefetching are novel in that they are based on data compression techniques that are both theoretically optimal and good in practice. Intuitively, in order to compress data effectively, one has to be able to predict future data well, and thus good data compressors should also be able to predict well for purposes of prefetching. We show for powerful models such as Markov sources and $m$th-order Markov sources that the page fault rate incurred by our prefetching algorithms is optimal in the limit for almost all sequences of page requests.

ISSN: 0004-5411
Age Range: 18 to 22 years ♦ above 22 years
Educational Use: Research
Education Level: UG and PG
Learning Resource Type: Article
Publisher Date: 1996-09-01
Publisher Place: New York
e-ISSN: 1557-735X
Journal: Journal of the ACM (JACM)
Volume Number: 43
Issue Number: 5
Page Count: 23
Starting Page: 771
Ending Page: 793
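The abstract's central intuition, that the prediction machinery inside a good data compressor can drive prefetching, can be sketched with a toy first-order Markov predictor. This is an illustration only, not the paper's algorithm; the class name `MarkovPrefetcher` and its interface are assumptions for the sketch.

```python
from collections import defaultdict, Counter

class MarkovPrefetcher:
    """Toy first-order Markov predictor (hypothetical sketch, not the
    paper's algorithm): count which page tends to follow each page,
    and prefetch the most frequent successor of the current page."""

    def __init__(self):
        # counts[p][q] = number of times page q was requested right after p
        self.counts = defaultdict(Counter)
        self.prev = None

    def access(self, page):
        """Record a page request; return the page to prefetch next,
        or None if this context has no history yet."""
        if self.prev is not None:
            self.counts[self.prev][page] += 1
        self.prev = page
        followers = self.counts[page]
        if not followers:
            return None
        # Predict the most frequently observed successor.
        return followers.most_common(1)[0][0]
```

On the alternating request sequence `a, b, a, b, a` such a predictor warms up after one full cycle and then guesses the next page correctly; the paper's contribution is showing that predictors derived from optimal compressors achieve an optimal fault rate in the limit for far more general Markov sources, not just this degenerate case.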

