Abstract:
We consider the problem of compressing graphs of the link structure of the World Wide Web. We provide efficient algorithms for such compression that are motivated by random graph models for describing the Web. The algorithms are based on reducing the compression problem to the problem of finding a minimum spanning tree in a directed graph related to the original link graph. The performance of the algorithms on graphs generated by the random graph models suggests that by taking advantage of the link structure of the Web, one may achieve significantly better compression than natural Huffman-based schemes. We also provide hardness results demonstrating limitations on natural extensions of our approach.
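To illustrate the kind of reduction the abstract describes, here is a minimal sketch (not the authors' actual algorithm or cost model): each page's adjacency list is encoded either from scratch or as a difference against some reference page, and the cheapest assignment of references corresponds to a minimum spanning arborescence in a directed cost graph with a virtual "encode from scratch" root. The cost function, node names, and use of networkx are illustrative assumptions.

```python
# Hypothetical sketch of reference-based link-list compression via a
# minimum spanning arborescence; costs and names are assumptions.
import math
import networkx as nx

def arborescence_references(adj, n):
    """adj: dict mapping page -> set of out-neighbours; n: number of pages.
    Returns, for each page, the reference page to diff against
    (None means 'encode from scratch')."""
    bits = math.log2(max(n, 2))        # rough cost, in bits, per link id
    ROOT = "__root__"                  # virtual node = encode with no reference
    G = nx.DiGraph()
    for v, links_v in adj.items():
        # Cost of encoding v's links directly, with no reference page.
        G.add_edge(ROOT, v, weight=len(links_v) * bits)
        # Cost of encoding v's links as a diff against u's links.
        for u, links_u in adj.items():
            if u == v:
                continue
            diff = len(links_u ^ links_v)   # symmetric difference of link sets
            G.add_edge(u, v, weight=diff * bits + 1)
    # ROOT has no incoming edges, so any spanning arborescence is rooted there.
    arb = nx.minimum_spanning_arborescence(G, attr="weight")
    return {v: (None if u == ROOT else u) for u, v in arb.edges()}

if __name__ == "__main__":
    adj = {"a": {"b", "c"}, "b": {"b", "c"}, "c": {"a"}}
    print(arborescence_references(adj, n=3))
```

In this toy setup, pages with similar out-link sets end up referencing one another, while dissimilar pages fall back to the virtual root, i.e. to direct encoding.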
Published in: Proceedings DCC 2001. Data Compression Conference
Date of Conference: 27-29 March 2001
Date Added to IEEE Xplore: 07 August 2002
Print ISBN: 0-7695-1031-0
Print ISSN: 1068-0314