Proceedings DCC 2001. Data Compression Conference

27-29 March 2001

  • Proceedings DCC 2001. Data Compression Conference

    Publication Year: 2001
  • Universal lossless compression of piecewise stationary slowly varying sources

    Publication Year: 2001, Page(s):371 - 380
    Cited by:  Papers (1)

    Universal lossless compression of parametric piecewise stationary sources is investigated, where the statistics change slowly between stationary segments over unknown time intervals. The minimum description length (MDL) principle is derived for two different settings of this problem under the assumption that the parameter changes are linear over the change interval. In the first sett...

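    As a generic illustration of the two-part MDL idea only (not the paper's derivation, which also models linear parameter drift over the change interval): the sketch below scores a piecewise Bernoulli source with assumed, known change points, charging each segment its data bits plus roughly 0.5*log2(segment length) bits per parameter.

        import math

        def bernoulli_bits(segment):
            # Ideal code length (bits) of a 0/1 segment under its ML Bernoulli parameter.
            n, k = len(segment), sum(segment)
            if k in (0, n):
                return 0.0
            p = k / n
            return -(k * math.log2(p) + (n - k) * math.log2(1 - p))

        def two_part_mdl(bits, change_points):
            # Two-part MDL cost: data bits per segment plus ~0.5*log2(len) bits per parameter.
            cost, start = 0.0, 0
            for end in list(change_points) + [len(bits)]:
                seg = bits[start:end]
                if seg:
                    cost += bernoulli_bits(seg) + 0.5 * math.log2(len(seg))
                start = end
            return cost

        # Toy source that switches from p ~ 0.1 to p ~ 0.9 halfway through.
        data = [0] * 45 + [1] * 5 + [1] * 45 + [0] * 5
        print(two_part_mdl(data, change_points=[50]))  # segmented model: cheaper
        print(two_part_mdl(data, change_points=[]))    # single stationary model: costlier
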
  • Author index

    Publication Year: 2001, Page(s):531 - 533
  • Rate-distortion optimization for the SPIHT encoder

    Publication Year: 2001, Page(s):123 - 132
    Cited by:  Papers (2)

    We study the rate-distortion performance of the set partitioning in hierarchical trees (SPIHT) image compression algorithm and present an optimization method that produces the bit stream with minimum Lagrangian cost J=D+λR for a given stopping criterion. Although there are only three applicable stop points at each bit plane, experiments show that substantial improvements can be obtained ove...

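    To make the selection rule concrete: with a Lagrange multiplier λ, the encoder keeps the candidate stop point whose measured distortion D and rate R minimize J = D + λR. Only that rule comes from the abstract; the labels and numbers in this sketch are made up.

        def best_stop_point(candidates, lam):
            # Pick the (label, distortion, rate) candidate minimizing J = D + lambda * R.
            return min(candidates, key=lambda c: c[1] + lam * c[2])

        # Hypothetical stop points within one bit plane: (label, MSE, bits spent so far).
        stops = [("sorting-pass end",    41.2, 11800),
                 ("refinement-pass end", 38.5, 12950),
                 ("bit-plane end",       37.5, 14100)]
        print(best_stop_point(stops, lam=0.002))   # -> ('refinement-pass end', 38.5, 12950)
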
  • Video residual coding using SPIHT and dependent optimization

    Publication Year: 2001, Page(s):113 - 122
    Cited by:  Papers (2)

    We introduce a video residual coding technique based on wavelet transforms and set partitioning in hierarchical trees (SPIHT). In this scheme, wavelet coefficients that correspond to the same spatial location in the image domain are grouped together to form wavelet blocks. Each wavelet block is then converted to a SPIHT-compatible bit stream by an optimized SPIHT encoder minimizing the Lagrangian ...

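    A minimal sketch of the grouping step only (the SPIHT coding and the Lagrangian optimization are not shown): given a square Mallat-layout transform, collect for each spatial block the coefficients covering it in every subband. The layout convention, block size and level count are assumptions of the sketch, not taken from the paper.

        import numpy as np

        def wavelet_blocks(coeffs, levels, block=16):
            # Group a square Mallat-layout 2D wavelet transform into "wavelet blocks":
            # for each block x block spatial region, gather the coefficients covering it
            # in every detail subband at every level, plus the coarsest LL band.
            # Assumes block >= 2**levels and a power-of-two image size.
            n = coeffs.shape[0]
            blocks = {}
            for by in range(0, n, block):
                for bx in range(0, n, block):
                    pieces = []
                    for lev in range(1, levels + 1):
                        s = block >> lev                      # footprint at this level
                        y, x, half = by >> lev, bx >> lev, n >> lev
                        pieces.append(coeffs[y:y + s, half + x:half + x + s])                # HL
                        pieces.append(coeffs[half + y:half + y + s, x:x + s])                # LH
                        pieces.append(coeffs[half + y:half + y + s, half + x:half + x + s])  # HH
                    s = block >> levels                       # LL band at the coarsest level
                    pieces.append(coeffs[by >> levels:(by >> levels) + s,
                                         bx >> levels:(bx >> levels) + s])
                    blocks[(by, bx)] = pieces
            return blocks

        blocks = wavelet_blocks(np.random.randn(256, 256), levels=3, block=16)
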
  • Low delay perceptually lossless coding of audio signals

    Publication Year: 2001, Page(s):312 - 320
    Cited by:  Papers (3)

    A novel predictive lossless coding scheme is proposed. The prediction is based on a new weighted cascaded least mean squared (WCLMS) method. To obtain both a high compression ratio and a very low encoding and decoding delay, the residuals from the prediction are encoded using either a variant of adaptive Huffman coding or a version of adaptive arithmetic coding. WCLMS is especially designed for mu...

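    As a much simplified stand-in for the WCLMS predictor (a single normalized LMS stage rather than a weighted cascade), the sketch below produces the prediction residuals that would then be handed to the adaptive Huffman or arithmetic coder; the order and step size are illustrative.

        import numpy as np

        def lms_residuals(x, order=8, mu=0.5):
            # One normalized-LMS predictor: predict x[n] from the previous `order`
            # samples and return the residuals (which would then be entropy coded).
            w = np.zeros(order)
            res = np.empty(len(x))
            for n in range(len(x)):
                ctx = x[max(0, n - order):n][::-1]              # most recent sample first
                ctx = np.pad(ctx, (0, order - len(ctx)))        # zero-fill at the start
                pred = float(w @ ctx)
                res[n] = x[n] - pred
                w += mu * res[n] * ctx / (ctx @ ctx + 1e-8)     # NLMS weight update
            return res

        t = np.arange(4000)
        tone = np.sin(2 * np.pi * 440 * t / 16000)              # toy 440 Hz tone at 16 kHz
        r = lms_residuals(tone)
        print(np.var(tone), np.var(r))                          # residual variance is far smaller
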
  • Software compression in the client/server environment

    Publication Year: 2001, Page(s):233 - 242
    Cited by:  Papers (5)  |  Patents (6)

    Lempel-Ziv (1977) based compression algorithms are universal, not assuming any prior knowledge of the file to be compressed or its statistics. Accordingly, the reference dictionary of these textual substitution compression algorithms includes only segments of the already-processed portion of the file. It is often the case, though, that both compressor and decompressor, even when they reside on di...

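    The underlying idea, seeding the textual-substitution dictionary with material both sides already hold, can be illustrated with zlib's preset-dictionary support; this is an analogy only, not the algorithm of the paper, and the shared text below is invented.

        import zlib

        shared = b"SELECT name, price FROM products WHERE category = " * 4   # text both ends already hold
        message = b"SELECT name, price FROM products WHERE category = 'books' ORDER BY price"

        plain = zlib.compress(message, 9)                       # no shared dictionary

        co = zlib.compressobj(level=9, zdict=shared)            # compressor primed with shared text
        primed = co.compress(message) + co.flush()

        de = zlib.decompressobj(zdict=shared)                   # decompressor primed the same way
        assert de.decompress(primed) == message

        print(len(plain), len(primed))                          # the primed stream is typically shorter
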
  • Compressed pattern matching for SEQUITUR

    Publication Year: 2001, Page(s):469 - 478
    Cited by:  Papers (5)

    SEQUITUR, due to Nevill-Manning and Witten (see Journal of Artificial Intelligence Research, vol.7, p.67-82, 1997), is a powerful program that infers a phrase hierarchy from the input text and also provides extremely effective compression of large quantities of semi-structured text. In this paper, we address the problem of searching SEQUITUR-compressed text directly. We show a compressed pattern m...

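    For orientation only, a phrase hierarchy of the kind SEQUITUR infers can be written down as a straight-line grammar; the toy grammar and expansion routine below are illustrative and do not implement SEQUITUR's inference or the compressed matching described in the paper.

        # A tiny straight-line grammar: rule 0 is the start symbol, integers refer to
        # rules, one-character strings are terminals.  It generates "abcabdabcabd".
        grammar = {
            0: [1, 1],
            1: [2, "c", 2, "d"],
            2: ["a", "b"],
        }

        def expand(rule_id, grammar):
            # Recursively expand a rule back into the text it represents.
            out = []
            for sym in grammar[rule_id]:
                out.append(expand(sym, grammar) if isinstance(sym, int) else sym)
            return "".join(out)

        assert expand(0, grammar) == "abcabdabcabd"
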
  • Managing drift in DCT-based scalable video coding

    Publication Year: 2001, Page(s):351 - 360
    Cited by:  Papers (7)

    When compressed video is transmitted over erasure-prone channels, errors will propagate whenever temporal or spatial prediction is used. Typical tools to combat this error propagation are packetization, re-synchronizing codewords, intra-coding, and scalability. In recent years, the concern over so-called “drift” has sent researchers toward structures for scalability that do not use enh...

  • An adaptable binary entropy coder

    Publication Year: 2001, Page(s):391 - 400
    Cited by:  Papers (1)

    We present a novel entropy coding technique based on recursive interleaving of variable-to-variable length binary source codes. The encoding is adaptable in that each bit to be encoded may have an associated probability estimate which depends on previously encoded bits. The technique may have advantages over arithmetic coding: it can achieve arbitrarily small redundancy, and ad...

  • Joint source-channel decoding of correlated sources over noisy channels

    Publication Year: 2001, Page(s):283 - 292
    Cited by:  Papers (49)  |  Patents (9)

    We consider the case of two correlated binary information sequences. Instead of compressing the information using source coding, both sequences are independently channel encoded and transmitted over an AWGN channel. The correlation between both sequences is exploited at the receiver, allowing reliable communications at signal-to-noise ratios very close to the theoretical limits established by the...

  • Feature-preserving image coding for very low bit rates

    Publication Year: 2001, Page(s):103 - 112
    Cited by:  Papers (7)

    Many progressive wavelet-based image coders are designed for good performance on natural images. They attempt to achieve the greatest reduction in mean squared error (MSE) with each bit sent, an approach that is most effective when the image is composed chiefly of low-frequency content. Many images, however, include sharp-edged objects, text characters or graphics that are not well handled by stan...

  • Asymptotically optimal scalable coding for minimum weighted mean square error

    Publication Year: 2001, Page(s):43 - 52
    Cited by:  Papers (4)  |  Patents (1)

    We derive an asymptotically optimal multi-layer coding scheme for entropy-coded scalar quantizers (SQ) that minimizes the weighted mean-squared error (WMSE). The optimal entropy-coded SQ is non-uniform in the case of WMSE. The conventional multi-layer coder quantizes the base-layer reconstruction error at the enhancement layer, and is sub-optimal for the WMSE criterion. We consider the compander r...

  • Optimal prefix-free codes that end in a specified pattern and similar problems: the uniform probability case

    Publication Year: 2001, Page(s):143 - 152
    Cited by:  Papers (2)

    In this paper we discuss the problem of constructing minimum-cost, prefix-free codes for equiprobable words under the assumption that all codewords are restricted to belong to an arbitrary language L. We examine how, given certain types of L, the structure of the minimum-cost code changes as n, the number of codewords, grows.

  • Joint source channel coding using arithmetic codes and trellis coded modulation

    Publication Year: 2001, Page(s):302 - 311
    Cited by:  Papers (10)

    Previous work has indicated that using an arithmetic encoder with reserved probability space can provide powerful error detection and error correction when used with a sequential decoding algorithm. However, performance improvements were limited at high error rates, principally because of the lack of an explicit decoding tree. In this work a trellis coded modulation scheme is used to provide a con...

  • Compressing XML with multiplexed hierarchical PPM models

    Publication Year: 2001, Page(s):163 - 172
    Cited by:  Papers (48)  |  Patents (9)

    We established a working Extensible Markup Language (XML) compression benchmark based on text compression, and found that bzip2 compresses XML best, albeit more slowly than gzip. Our experiments verified that XMill speeds up and improves compression using gzip and bounded-context PPM by up to 15%, but found that it worsens the compression for bzip2 and PPM. We describe alternative appr...

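    A back-of-the-envelope comparison in the same spirit can be run with Python's standard codecs; the sample document is invented, and the numbers say nothing about the paper's corpora or about XMill, which is not part of the standard library.

        import bz2, zlib

        xml = (b"<catalog>"
               + b"".join(b"<book id='%d'><title>T%d</title><price>9.99</price></book>" % (i, i)
                          for i in range(500))
               + b"</catalog>")

        sizes = {
            "zlib -9 (gzip-like)": len(zlib.compress(xml, 9)),
            "bzip2 -9":            len(bz2.compress(xml, 9)),
        }
        for name, size in sizes.items():
            print(f"{name:20s} {size:6d} bytes of {len(xml)}")
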
  • Semidefinite programs for the design of codes for delay-constrained communication in networks

    Publication Year: 2001, Page(s):183 - 192
    Cited by:  Papers (2)

    We consider the problem of designing codes for delay-constrained communication in packet networks. We show how good codes for this problem can be characterized as the solution of a pair of semidefinite programs plus a rank constraint. Using this characterization, we formulate the design problem as one of finding suitable graph embeddings into Euclidean space, for a certain family of...

  • Towards compressing Web graphs

    Publication Year: 2001, Page(s):203 - 212
    Cited by:  Papers (29)  |  Patents (2)

    We consider the problem of compressing graphs of the link structure of the World Wide Web. We provide efficient algorithms for such compression that are motivated by random graph models for describing the Web. The algorithms are based on reducing the compression problem to the problem of finding a minimum spanning tree in a directed graph related to the original link graph. The performance of the ...

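    The flavour of reference-based link compression can be conveyed by a greedy toy version (the paper's actual reduction is to a minimum spanning tree problem, which this sketch does not solve): each page's out-link list is stored either verbatim or as a diff against the cheapest previously coded page, under a cost model that simply counts list entries.

        def encode_links(adj):
            # Greedy reference coding of adjacency lists: store each node either as
            # ("raw", links) or ("ref", reference_node, additions, deletions),
            # whichever needs fewer entries under a naive counting cost model.
            encoded, done = {}, []
            for v, links in adj.items():
                best, best_cost = ("raw", sorted(links)), len(links)
                for u in done:
                    add, drop = sorted(links - adj[u]), sorted(adj[u] - links)
                    cost = 1 + len(add) + len(drop)        # one entry to name the reference
                    if cost < best_cost:
                        best, best_cost = ("ref", u, add, drop), cost
                encoded[v] = best
                done.append(v)
            return encoded

        pages = {
            "a": {"x", "y", "z"},
            "b": {"x", "y", "z", "w"},   # nearly identical to "a": cheap by reference
            "c": {"p", "q"},             # unrelated: cheaper to store verbatim
        }
        print(encode_links(pages))
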
  • Streaming thin client compression

    Publication Year: 2001, Page(s):223 - 232
    Cited by:  Papers (1)  |  Patents (10)

    Thin client compression (TCC) achieves the best compression efficiency for sequences of synthetic images. This paper presents a streaming version of TCC (STCC) that almost fully retains the excellent compression efficiency of the original algorithm. When sending images over low-bandwidth networks, STCC dramatically reduces end-to-end latency by pipelining rows and overlapping the compression, tran...

  • Faster approximate string matching over compressed text

    Publication Year: 2001, Page(s):459 - 468
    Cited by:  Papers (7)  |  Patents (7)

    Approximate string matching on compressed text was an open problem for almost a decade. The two existing solutions are very new. Although they represent important complexity breakthroughs, in most practical cases they are not useful, in the sense that they are slower than uncompressing the text and then searching the uncompressed text. We present a different approach, which reduces the problem...

  • Space-time tradeoffs in the inverse B-W transform

    Publication Year: 2001, Page(s):439 - 448
    Cited by:  Papers (5)  |  Patents (1)

    Lossless text compression based on the Burrows-Wheeler transform (BWT) has become popular. Compression-time issues, such as MTF coding or the avoidance thereof (Wirth 2000), encoding the MTF values, and fast sorting (Seward 2000), have seen considerable investigation. Decompression-time issues remain underinvestigated, yet for some applications decompression-time behaviour is critical. For example, boot-ti...

  • Multihypothesis motion estimation for video coding

    Publication Year: 2001, Page(s):341 - 350
    Cited by:  Papers (8)  |  Patents (45)

    Multihypothesis motion-compensating predictors combine several motion-compensated signals to predict the current frame of a video signal. This paper applies the wide-sense stationary theory of multihypothesis motion compensation for hybrid video codecs to multihypothesis motion estimation. This allows us to study the influence of the displacement error correlation on the efficiency of multihypothe...

  • The coding-optimal transform

    Publication Year: 2001, Page(s):381 - 390
    Cited by:  Papers (3)

    We propose a new transform coding algorithm that integrates all optimization steps into a coherent and consistent framework. Each iteration of the algorithm is designed to minimize coding distortion as a function of both the transform and quantizer designs. Our algorithm is a constrained version of the Linde-Buzo-Gray (LBG) algorithm for vector quantizer design. The reproduction vectors are constr...

  • Can we do without ranks in Burrows Wheeler transform compression?

    Publication Year: 2001, Page(s):419 - 428
    Cited by:  Papers (3)  |  Patents (1)

    Compressors based on the Burrows-Wheeler transform (1994) convert the transformed text into a string of (move-to-front) ranks. These ranks are then encoded with an order-0 model, or a hierarchy of such models. Although these rank-based methods perform very well, we believe the transformation to MTF numbers blurs the distinction between individual symbols and is a possible cause of inefficiency. In...

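    The rank stream these compressors work on is the move-to-front transform of the BWT output; a minimal sketch over a byte alphabet:

        def mtf_ranks(data, alphabet=range(256)):
            # Move-to-front: emit each symbol's current rank, then move it to the front.
            # This is the rank stream the paper argues blurs symbol identity.
            symbols = list(alphabet)
            ranks = []
            for s in data:
                r = symbols.index(s)
                ranks.append(r)
                symbols.insert(0, symbols.pop(r))
            return ranks

        # BWT output of repetitive text gives many small ranks and long zero runs.
        print(mtf_ranks(b"nnbaaaaaaa"))   # [110, 0, 99, 99, 0, 0, 0, 0, 0, 0]
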
  • On variable length codes for iterative source/channel decoding

    Publication Year: 2001, Page(s):273 - 282
    Cited by:  Papers (69)  |  Patents (3)

    We focus on a trellis-based decoding technique for variable length codes (VLCs) which does not require any additional side information besides the number of bits in the coded sequence. A bit-level soft-in/soft-out decoder based on this trellis is used as an outer component decoder in an iterative decoding scheme for a serially concatenated source/channel coding system. In contrast to previous appr...
