
Cluster based cooperative caching approach through mobility prediction in MANET

2 Author(s)
Kuppusamy, P. ; Comput. Sci. & Eng., King Coll. of Technol., Namakkal, India ; Kalaavathi, B.

Data caching is one of the most attractive approaches for improving data retrieval performance in wireless mobile networks. In mobile ad hoc networks (MANETs), cached data items need to be updated frequently due to the high mobility of the nodes, and this frequent communication leads to additional overhead and latency. In this paper, we propose a cluster based cooperative caching approach that uses mobility prediction to handle mobile disconnections and to decrease the overhead. The network is divided into non-overlapping clusters, and each cluster head is selected based on its energy level and connectivity. The cached data items in the cluster and in its adjacent clusters are recorded in the local cache table (LCT) and the global cache table (GCT) respectively. The mobility of the cluster members is predicted using observed sequence states, and the cluster head periodically updates the LCT and GCT to maintain consistency. Performing cache updates through cluster heads thus reduces latency and frequent overhead in the network. Our simulation results show that this caching approach reduces latency by 1.7% and 9% compared with the existing technique when increasing the cache size and the mean query generating time respectively. Similarly, it reduces overhead by 3.8% and 19.5% when increasing the cache size and the mean query generating time respectively.
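The scheme described above can be sketched in code. The following is a minimal, hypothetical illustration only (the paper itself does not give pseudocode): it assumes a cluster head is the node with the highest residual energy, with connectivity (neighbor count) as a tie-breaker, and that a query is resolved first against the LCT (own cluster), then the GCT (adjacent clusters), and otherwise forwarded to the data server. All names (`Node`, `select_cluster_head`, `ClusterHead.lookup`) are invented for illustration.

```python
# Illustrative sketch of cluster head selection and cooperative cache
# lookup, based on the abstract's description. Not the authors' code.
from dataclasses import dataclass, field


@dataclass
class Node:
    node_id: int
    energy: float                               # residual energy level
    neighbors: set = field(default_factory=set)  # connectivity (1-hop neighbors)


def select_cluster_head(nodes):
    """Pick the node with the highest energy, breaking ties by connectivity."""
    return max(nodes, key=lambda n: (n.energy, len(n.neighbors)))


class ClusterHead:
    """Maintains the two cache tables described in the abstract."""

    def __init__(self):
        self.lct = {}  # local cache table: data id -> caching node in this cluster
        self.gct = {}  # global cache table: data id -> adjacent cluster head

    def lookup(self, data_id):
        """Resolve a query: own cluster first, then adjacent clusters,
        otherwise fall back to the data server."""
        if data_id in self.lct:
            return ("local", self.lct[data_id])
        if data_id in self.gct:
            return ("adjacent", self.gct[data_id])
        return ("server", None)
```

For example, with three candidate nodes of energies 0.5, 0.9, and 0.9, the two high-energy nodes tie and the one with more neighbors becomes cluster head; a query hitting the LCT is served within the cluster without contacting the server, which is the source of the latency and overhead savings reported in the abstract.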

Published in:

2012 Third International Conference on Computing Communication & Networking Technologies (ICCCNT)

Date of Conference:

26-28 July 2012