A new buffer cache design exploiting both temporal and content localities

Document Type

Conference Proceeding

Date of Original Version

8-27-2010

Abstract

This paper presents a Least Popularly Used (LPU) buffer cache algorithm that exploits both the temporal locality and the content locality of I/O requests. Popular data blocks, which are not only accessed frequently but also identical or similar in content to other blocks being accessed, are selected as reference blocks. Fast delta compression and decompression are used to satisfy as many I/O requests as possible from the popular reference blocks together with small deltas held inside the buffer cache. The popularity of a reference block is calculated from a statistical analysis of data contents and access frequency. A prototype of LPU has been implemented as a new cache layer for the Kernel-based Virtual Machine (KVM) on Linux. Experimental results show that LPU is effective for a variety of workloads, with a maximum speedup of over 300% compared with LRU. © 2010 IEEE.
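To make the idea in the abstract concrete, the following is a minimal, illustrative sketch of a cache that serves reads from popular reference blocks plus small deltas. It is not the authors' implementation: the popularity counter, the byte-wise similarity test, the XOR-plus-zlib delta coding, and helper names such as compute_delta and apply_delta are assumptions for illustration only.

```python
# Toy sketch of a "Least Popularly Used"-style buffer cache.
# The popularity metric, similarity test, and delta helpers are illustrative
# assumptions, not the paper's actual design.
import zlib
from collections import defaultdict

def compute_delta(reference: bytes, block: bytes) -> bytes:
    """Hypothetical delta: compressed byte-wise XOR of the two blocks."""
    xored = bytes(a ^ b for a, b in zip(reference, block))
    return zlib.compress(xored)

def apply_delta(reference: bytes, delta: bytes) -> bytes:
    """Reconstruct a block from a reference block and its stored delta."""
    xored = zlib.decompress(delta)
    return bytes(a ^ b for a, b in zip(reference, xored))

class ToyLPUCache:
    def __init__(self, max_refs=64, similarity_threshold=0.9):
        self.max_refs = max_refs
        self.similarity_threshold = similarity_threshold
        self.references = {}                  # ref_id -> reference block bytes
        self.deltas = {}                      # block_addr -> (ref_id, delta)
        self.popularity = defaultdict(int)    # ref_id -> popularity count

    def _similar(self, a: bytes, b: bytes) -> float:
        """Crude content-similarity score: fraction of matching bytes."""
        matches = sum(x == y for x, y in zip(a, b))
        return matches / max(len(a), len(b), 1)

    def read(self, addr, read_from_disk):
        """Serve a read from a reference block (plus delta) if possible."""
        if addr in self.references:
            self.popularity[addr] += 1
            return self.references[addr]
        if addr in self.deltas:
            ref_id, delta = self.deltas[addr]
            self.popularity[ref_id] += 1       # access frequency feeds popularity
            return apply_delta(self.references[ref_id], delta)
        block = read_from_disk(addr)
        self._insert(addr, block)
        return block

    def _insert(self, addr, block):
        """Attach the block to the most similar reference, or promote it."""
        best_ref, best_score = None, 0.0
        for ref_id, ref in self.references.items():
            score = self._similar(ref, block)
            if score > best_score:
                best_ref, best_score = ref_id, score
        if best_ref is not None and best_score >= self.similarity_threshold:
            # Content locality: store only a small delta against the reference.
            self.deltas[addr] = (best_ref, compute_delta(self.references[best_ref], block))
            self.popularity[best_ref] += 1
        else:
            if len(self.references) >= self.max_refs:
                # Evict the least popular reference (the "LPU" replacement idea).
                victim = min(self.references, key=lambda r: self.popularity[r])
                self.references.pop(victim)
                self.deltas = {a: v for a, v in self.deltas.items() if v[0] != victim}
                self.popularity.pop(victim, None)
            self.references[addr] = block
            self.popularity[addr] += 1
```

In the actual system described by the abstract, popularity is derived from a statistical analysis of block contents and access frequency, and fast delta compression is used in place of the toy XOR scheme above.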

Publication Title

Proceedings - International Conference on Distributed Computing Systems
