Aggregating caches: A mechanism for implicit file prefetching

Authors: A. Amer; D. D. E. Long (Jack Baskin School of Engineering, University of California, Santa Cruz, CA, USA)

Abstract:

We introduce the aggregating cache and demonstrate how it can be used to reduce the number of file retrieval requests made by a caching client, improving storage system performance by reducing the impact of latency. The aggregating cache uses predetermined groupings of files to perform group retrievals. These groups are maintained by the server and built dynamically from observed inter-file relationships. Through a simple analytical model we demonstrate that this mechanism has the potential to reduce average latencies by 75% to 82%. Through trace-based simulation we demonstrate that a simple aggregating cache can reduce the number of demand fetches by almost 50%, while simultaneously improving cache hit ratios by up to 5%.
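The abstract does not include source code; the following is a minimal illustrative sketch in Python of the general idea it describes, not the authors' implementation. All class names, parameters (e.g. max_group_size, capacity), and the simple successor-counting heuristic used to build groups are assumptions made for this example: a server tracks which files tend to follow one another and maintains groups, and a caching client retrieves the whole group on a miss, so one demand fetch can satisfy several future accesses.

from collections import OrderedDict, defaultdict


class GroupingServer:
    """Illustrative server that builds file groups dynamically from
    observed inter-file relationships (here: successor counts, an
    assumption for this sketch, not the paper's grouping policy)."""

    def __init__(self, max_group_size=4):
        self.max_group_size = max_group_size
        self.successors = defaultdict(lambda: defaultdict(int))
        self.last_file = None

    def observe(self, file_id):
        # Record that file_id was accessed after the previous file.
        if self.last_file is not None and self.last_file != file_id:
            self.successors[self.last_file][file_id] += 1
        self.last_file = file_id

    def group_for(self, file_id):
        # Return the requested file plus its most frequent successors.
        followers = sorted(self.successors[file_id].items(),
                           key=lambda kv: kv[1], reverse=True)
        return [file_id] + [f for f, _ in followers[:self.max_group_size - 1]]


class AggregatingCache:
    """Illustrative LRU client cache that fetches whole groups on a miss,
    counting hits and demand fetches."""

    def __init__(self, server, capacity=64):
        self.server = server
        self.capacity = capacity
        self.cache = OrderedDict()
        self.hits = 0
        self.demand_fetches = 0

    def access(self, file_id):
        self.server.observe(file_id)
        if file_id in self.cache:
            self.cache.move_to_end(file_id)
            self.hits += 1
            return
        # Miss: a single demand fetch retrieves the whole group.
        self.demand_fetches += 1
        for f in self.server.group_for(file_id):
            self.cache[f] = True
            self.cache.move_to_end(f)
            while len(self.cache) > self.capacity:
                self.cache.popitem(last=False)


if __name__ == "__main__":
    # Hypothetical usage on a tiny synthetic trace with a recurring pattern.
    server = GroupingServer()
    cache = AggregatingCache(server, capacity=8)
    for f in ["a", "b", "c", "d", "a", "b", "c", "e", "a", "b", "c"]:
        cache.access(f)
    print("hits:", cache.hits, "demand fetches:", cache.demand_fetches)

With a real file-access trace and a tighter cache, the group retrievals turn many would-be misses into hits, which is the effect the abstract quantifies (fewer demand fetches, modestly higher hit ratio); this toy trace only illustrates the mechanics.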

Published in:

Proceedings of the Ninth International Symposium on Modeling, Analysis and Simulation of Computer and Telecommunication Systems (MASCOTS 2001)

Date of Conference:

2001