Proceedings DCC 2001. Data Compression Conference

27-29 March 2001

Displaying Results 1 - 25 of 50
  • Proceedings DCC 2001. Data Compression Conference

    Publication Year: 2001
    Freely Available from IEEE
  • Universal lossless compression of piecewise stationary slowly varying sources

    Publication Year: 2001, Page(s):371 - 380
    Cited by:  Papers (1)

    Universal lossless compression of parametric piecewise stationary sources with slow changes in the statistics between stationary segments that take place in unknown time intervals is investigated. The minimum description length (MDL) principle is derived for two different settings of this problem under the assumption that the parameter changes are linear over the change interval. In the first sett...

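    For orientation, the familiar two-part MDL code length for an i.i.d. model with k free parameters fitted to n samples is sketched below; this is only the classical baseline form, not the time-varying-parameter criteria derived in the paper.

    ```latex
    L(x^n) \;=\; \underbrace{-\log_2 \hat{P}_{\hat{\theta}(x^n)}(x^n)}_{\text{data under the fitted model}}
    \;+\; \underbrace{\tfrac{k}{2}\log_2 n}_{\text{parameter cost}} \;+\; O(1)
    ```

    The first term is the code length of the data under the best-fitting parameters; the second is the cost of describing those parameters to the appropriate precision.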
  • Author index

    Publication Year: 2001, Page(s):531 - 533
    Freely Available from IEEE
  • Asymptotically optimal scalable coding for minimum weighted mean square error

    Publication Year: 2001, Page(s):43 - 52
    Cited by:  Papers (4)  |  Patents (1)

    We derive an asymptotically optimal multi-layer coding scheme for entropy-coded scalar quantizers (SQ) that minimizes the weighted mean-squared error (WMSE). The optimal entropy-coded SQ is non-uniform in the case of WMSE. The conventional multi-layer coder quantizes the base-layer reconstruction error at the enhancement layer, and is sub-optimal for the WMSE criterion. We consider the compander r...

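    A minimal sketch of the conventional two-layer structure mentioned in the abstract, in which the enhancement layer re-quantizes the base-layer reconstruction error; the uniform quantizers, step sizes, and all-ones weighting are illustrative assumptions, not the paper's optimized design.

    ```python
    import numpy as np

    def uniform_sq(x, step):
        """Uniform scalar quantizer: indices and reconstruction."""
        idx = np.round(x / step).astype(int)
        return idx, idx * step

    def two_layer_encode(x, base_step=1.0, enh_step=0.25):
        """Conventional multi-layer coder: the enhancement layer
        quantizes the base-layer reconstruction error."""
        _, base_rec = uniform_sq(x, base_step)
        _, enh_rec = uniform_sq(x - base_rec, enh_step)
        return base_rec, base_rec + enh_rec

    def wmse(x, x_hat, w):
        """Weighted mean-squared error with per-sample weights w."""
        return float(np.mean(w * (x - x_hat) ** 2))

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    w = np.ones_like(x)                    # illustrative weighting
    base_rec, full_rec = two_layer_encode(x)
    print("base-layer WMSE:", wmse(x, base_rec, w))
    print("two-layer WMSE:", wmse(x, full_rec, w))
    ```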
  • Compressed pattern matching for SEQUITUR

    Publication Year: 2001, Page(s):469 - 478
    Cited by:  Papers (5)

    SEQUITUR, due to Nevill-Manning and Witten (see Journal of Artificial Intelligence Research, vol.7, p.67-82, 1997), is a powerful program that infers a phrase hierarchy from the input text and also provides extremely effective compression of large quantities of semi-structured text. In this paper, we address the problem of searching SEQUITUR-compressed text directly. We show a compressed pattern m...

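    For orientation only, a toy straight-line grammar of the kind SEQUITUR infers, together with the naive decompress-then-search baseline that direct compressed matching aims to avoid; the rules and pattern are hypothetical, and this is not the matching algorithm proposed in the paper.

    ```python
    # A tiny straight-line grammar; uppercase symbols are nonterminals.
    # (Hypothetical example; SEQUITUR would infer such rules from the text.)
    RULES = {
        "S": ["A", "A", "b", "A"],   # start rule
        "A": ["a", "b", "c"],
    }

    def expand(symbol, rules):
        """Recursively expand a symbol of a straight-line grammar to text."""
        if symbol not in rules:      # terminal symbol
            return symbol
        return "".join(expand(s, rules) for s in rules[symbol])

    def decompress_then_search(pattern, rules, start="S"):
        """Baseline: expand the whole grammar, then scan the plain text."""
        text = expand(start, rules)
        hits, i = [], text.find(pattern)
        while i != -1:
            hits.append(i)
            i = text.find(pattern, i + 1)
        return text, hits

    print(decompress_then_search("cab", RULES))   # ('abcabcbabc', [2])
    ```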
  • Robust predictive vector quantizer design

    Publication Year: 2001, Page(s):33 - 42
    Cited by:  Papers (6)

    The design of predictive quantizers generally suffers from difficulties due to the prediction loop, which have an impact on the convergence and the stability of the design procedure. We previously proposed an asymptotically closed-loop approach to quantizer design for predictive coding applications, which benefits from the stability of open-loop design while asymptotically optimizing the actual cl...

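    A bare-bones closed-loop predictive (DPCM-style) scalar quantizer, shown only to illustrate the prediction loop that makes direct closed-loop design difficult; the first-order predictor coefficient and uniform quantizer are assumptions for illustration, not the asymptotically closed-loop design procedure of the paper.

    ```python
    import numpy as np

    def dpcm_closed_loop(x, a=0.9, step=0.1):
        """Closed-loop predictive scalar quantization: the predictor uses
        the *reconstructed* previous sample, so the quantizer sits inside
        a feedback loop."""
        rec = np.zeros_like(x)
        prev = 0.0
        for n, xn in enumerate(x):
            pred = a * prev                               # prediction from reconstruction
            q_err = step * np.round((xn - pred) / step)   # quantized prediction error
            rec[n] = pred + q_err                         # decoder-matched reconstruction
            prev = rec[n]
        return rec

    rng = np.random.default_rng(1)
    x = np.cumsum(rng.normal(scale=0.1, size=1000))   # a correlated toy source
    rec = dpcm_closed_loop(x)
    print("MSE:", float(np.mean((x - rec) ** 2)))
    ```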
  • On Zador's entropy-constrained quantization theorem

    Publication Year: 2001, Page(s):3 - 12
    Cited by:  Papers (4)

    Zador's classic result for the asymptotic high-rate behavior of entropy-constrained vector quantization is recast in a Lagrangian form which better matches the Lloyd algorithm used to optimize such quantizers. A proof that the result holds for a general class of distributions is sketched.

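    As a reminder of the objects involved, the Lagrangian formulation trades distortion against codeword entropy; the scalar high-rate special case below (the Gish-Pierce approximation) is shown only as an illustration of the kind of asymptotics the theorem generalizes.

    ```latex
    J(q) \;=\; D(q) + \lambda H(q),
    \qquad
    D \;\approx\; \tfrac{1}{12}\, 2^{\,2h(X)}\, 2^{-2H}
    \quad\text{(scalar source, high rate)}
    ```

    Here h(X) is the differential entropy of the source and H the output entropy of the quantizer.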
  • Faster approximate string matching over compressed text

    Publication Year: 2001, Page(s):459 - 468
    Cited by:  Papers (7)  |  Patents (6)

    Approximate string matching on compressed text was an open problem for almost a decade. The two existing solutions are very new. Although they represent important complexity breakthroughs, in most practical cases they are not useful, in the sense that they are slower than uncompressing the text and then searching the uncompressed text. We present a different approach, which reduces the problem...

  • Semidefinite programs for the design of codes for delay-constrained communication in networks

    Publication Year: 2001, Page(s):183 - 192
    Cited by:  Papers (2)

    We consider the problem of designing codes for delay-constrained communication in packet networks. We show how good codes for this problem can be characterized as the solution of a pair of semidefinite programs plus a rank constraint. Using this characterization, we formulate the design problem as one of finding suitable graph embeddings into Euclidean space, for a certain family of...

  • Joint source channel coding using arithmetic codes and trellis coded modulation

    Publication Year: 2001, Page(s):302 - 311
    Cited by:  Papers (10)

    Previous work has indicated that using an arithmetic encoder with reserved probability space can provide powerful error detection and error correction when used with a sequential decoding algorithm. However, performance improvements were limited at high error rates, principally because of the lack of an explicit decoding tree. In this work a trellis coded modulation scheme is used to provide a con...

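    A small sketch of the "reserved probability space" idea mentioned above: a fraction of the coding space is set aside for a forbidden symbol that the encoder never emits, so decoding it signals a channel error. The frequency-table scaling below is an illustrative simplification and does not include the arithmetic coder, the sequential decoder, or the trellis coded modulation stage.

    ```python
    def reserve_probability(counts, epsilon=0.05, forbidden="<ERR>"):
        """Rescale a symbol-count model so a forbidden symbol keeps
        probability epsilon.  A coder driven by this model never encodes
        <ERR>; if the decoder ever produces it, an error is detected."""
        total = sum(counts.values())
        probs = {s: (1.0 - epsilon) * c / total for s, c in counts.items()}
        probs[forbidden] = epsilon
        return probs

    model = reserve_probability({"a": 50, "b": 30, "c": 20})
    print(model, sum(model.values()))   # probabilities sum to 1.0
    ```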
  • Rate-distortion optimization for the SPIHT encoder

    Publication Year: 2001, Page(s):123 - 132
    Cited by:  Papers (2)

    We study the rate-distortion performance of the set partitioning in hierarchical trees (SPIHT) image compression algorithm and present an optimization method that produces the bit stream with minimum Lagrangian cost J = D + λR for a given stopping criterion. Although there are only three applicable stop points at each bit plane, experiments show that substantial improvements can be obtained ove...

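    A minimal sketch of picking a stop point that minimizes the Lagrangian cost J = D + λR over a handful of candidates; the (rate, distortion) pairs below are made-up placeholders, not SPIHT measurements.

    ```python
    def best_stop_point(candidates, lam):
        """candidates: iterable of (rate_bits, distortion) pairs.
        Return the pair minimizing J = D + lambda * R."""
        return min(candidates, key=lambda rd: rd[1] + lam * rd[0])

    # Hypothetical candidate stop points (rate in bits, distortion as MSE).
    points = [(1000, 42.0), (2000, 21.0), (4000, 9.5), (8000, 4.1)]
    for lam in (0.01, 0.001):
        r, d = best_stop_point(points, lam)
        print(f"lambda={lam}: stop at R={r} bits, D={d}")
    ```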
  • Can we do without ranks in Burrows Wheeler transform compression?

    Publication Year: 2001, Page(s):419 - 428
    Cited by:  Papers (3)  |  Patents (1)

    Compressors based on the Burrows Wheeler transform (1994) convert the transformed text into a string of (move-to-front) ranks. These ranks are then encoded with an order-0 model, or a hierarchy of such models. Although these rank-based methods perform very well, we believe the transformation to MTF numbers blurs the distinction between individual symbols and is a possible cause of inefficiency. In...

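    For reference, the move-to-front (MTF) ranking step that the paper proposes to drop: each symbol of the BWT output is replaced by its current position in a recency list, so runs of identical symbols become runs of zeros. A minimal sketch:

    ```python
    def mtf_encode(text, alphabet=None):
        """Replace each symbol by its rank in a move-to-front list."""
        table = list(alphabet if alphabet is not None else sorted(set(text)))
        ranks = []
        for ch in text:
            r = table.index(ch)
            ranks.append(r)
            table.insert(0, table.pop(r))   # move the symbol to the front
        return ranks

    # BWT output tends to contain long runs, which MTF maps to zeros.
    print(mtf_encode("aaabbbabbb"))   # [0, 0, 0, 1, 0, 0, 1, 1, 0, 0]
    ```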
  • Design of tree-structured multiple description vector quantizers

    Publication Year: 2001, Page(s):23 - 32
    Cited by:  Papers (1)

    We present a new multiple description source coding scheme based on tree-structured vector quantization (TSVQ). In this scheme, the codebook of each decoder is organized in a binary tree. The encoding is greedy and based on a sequence of binary decisions as in traditional TSVQ. Each binary decision of the encoder corresponds to adding information on one of the available channels and the encoding c...

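    A sketch of plain greedy TSVQ encoding (a sequence of binary nearest-centroid decisions), which is the traversal the multiple description scheme builds on; the toy codebook tree is a hypothetical stand-in, and the channel-assignment logic of the paper is not shown.

    ```python
    import numpy as np

    class Node:
        def __init__(self, centroid, left=None, right=None):
            self.centroid = np.asarray(centroid, dtype=float)
            self.left, self.right = left, right

    def tsvq_encode(x, root):
        """Greedy TSVQ: at each internal node descend toward the child whose
        centroid is closer to x; return the bit path and the leaf centroid."""
        bits, node = [], root
        while node.left is not None:
            bit = int(np.linalg.norm(x - node.right.centroid)
                      < np.linalg.norm(x - node.left.centroid))
            bits.append(bit)
            node = node.right if bit else node.left
        return bits, node.centroid

    # Hypothetical depth-2 codebook tree for 2-D vectors.
    root = Node([0, 0],
                Node([-1, 0], Node([-1, 1]), Node([-1, -1])),
                Node([1, 0],  Node([1, 1]),  Node([1, -1])))
    print(tsvq_encode(np.array([0.8, -0.6]), root))   # ([1, 1], array([ 1., -1.]))
    ```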
  • Towards compressing Web graphs

    Publication Year: 2001, Page(s):203 - 212
    Cited by:  Papers (27)  |  Patents (2)

    We consider the problem of compressing graphs of the link structure of the World Wide Web. We provide efficient algorithms for such compression that are motivated by random graph models for describing the Web. The algorithms are based on reducing the compression problem to that of finding a minimum spanning tree in a directed graph related to the original link graph. The performance of the ...

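    A much simplified illustration of the intuition: a page's out-link list can often be coded cheaply by naming a similar page and coding only the differences. The greedy reference choice and unit-cost model below are hypothetical simplifications; the paper instead reduces reference selection to a minimum spanning tree computation on a derived directed graph.

    ```python
    def copy_cost(links, ref_links):
        """Rough cost of coding `links` by referencing `ref_links`:
        one unit to name the reference plus the symmetric difference
        (links to add and links to delete)."""
        return 1 + len(set(links) ^ set(ref_links))

    def pick_reference(node, adjacency):
        """Greedily pick the cheapest reference page for `node`
        (None means 'code the list from scratch')."""
        best, best_cost = None, len(adjacency[node])
        for other, other_links in adjacency.items():
            if other == node:
                continue
            c = copy_cost(adjacency[node], other_links)
            if c < best_cost:
                best, best_cost = other, c
        return best, best_cost

    # Hypothetical link structure.
    adj = {"u": ["a", "b", "c", "d"],
           "v": ["a", "b", "c", "e"],
           "w": ["x"]}
    print(pick_reference("u", adj))   # ('v', 3): reference v, code the 2-link difference
    ```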
  • Enhancing image coders by using spatial noise shaping (SNS)

    Publication Year: 2001, Page(s):321 - 330

    We have developed spatial noise shaping (SNS) and demonstrated that it can enhance the performance of an image coder. SNS runs open-loop 2-D linear prediction (LP) in the frequency domain rather than in the time domain, in contrast to the standard LP used in speech and image coding. This predictive analysis/synthesis process over frequency possesses two important properties which result in a decoded im...

  • Optimal prefix-free codes that end in a specified pattern and similar problems: the uniform probability case

    Publication Year: 2001, Page(s):143 - 152
    Cited by:  Papers (2)

    In this paper we discuss the problem of constructing minimum-cost, prefix-free codes for equiprobable words under the assumption that all codewords are restricted to belong to an arbitrary language L. We examine how, given certain types of L, the structure of the minimum-cost code changes as n, the number of codewords, grows.

  • Fast adaptive encoder for bi-level images

    Publication Year: 2001, Page(s):253 - 262
    Cited by:  Papers (6)  |  Patents (5)

    We present a new bi-level image compression coder that does not use arithmetic encoding, but whose performance is close to that of state-of-the-art coders such as JBIG, JBIG-2, and JB2. The proposed bi-level coder (BLC) uses two simple adaptation rules: the first to compute context-dependent probability estimates that control a pixel prediction module, and the second to adjust a run-length paramet...

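    A generic sketch of the first ingredient above, a context-dependent probability estimate maintained by counting, here over a 3-pixel causal context with a Laplace-style estimator; the context shape and update rule are assumptions for illustration, not BLC's actual adaptation rules.

    ```python
    from collections import defaultdict

    class ContextModel:
        """Per-context black/white counts -> probability estimate."""
        def __init__(self, delta=1.0):
            self.counts = defaultdict(lambda: [delta, delta])  # [count0, count1]

        def p_one(self, ctx):
            c0, c1 = self.counts[ctx]
            return c1 / (c0 + c1)

        def update(self, ctx, pixel):
            self.counts[ctx][pixel] += 1

    def causal_context(img, r, c):
        """3-pixel causal context: west, north, north-west (0 off the image)."""
        def get(rr, cc):
            return img[rr][cc] if rr >= 0 and cc >= 0 else 0
        return (get(r, c - 1), get(r - 1, c), get(r - 1, c - 1))

    # Toy bi-level image: estimate each pixel's probability, then update.
    img = [[0, 0, 1, 1],
           [0, 0, 1, 1],
           [0, 0, 1, 1]]
    model = ContextModel()
    for r, row in enumerate(img):
        for c, pixel in enumerate(row):
            ctx = causal_context(img, r, c)
            p = model.p_one(ctx)        # would drive the pixel predictor/coder
            model.update(ctx, pixel)
    print(dict(model.counts))
    ```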
  • Group testing for wavelet packet image compression

    Publication Year: 2001, Page(s):73 - 82

    This paper introduces group testing for wavelet packets (GTWP), a novel embedded image compression algorithm based on wavelet packets and group testing. This algorithm extends the group testing for wavelets (GTW) algorithm to handle wavelet packets. Like its predecessor, GTWP obtains good compression performance without the use of arithmetic coding. It also shows that the group testing methodology...

  • Pattern matching in Huffman encoded texts

    Publication Year: 2001, Page(s):449 - 458
    Cited by:  Papers (4)

    The possibility of locating a pattern directly in a text which has been encoded by a static Huffman code is investigated. The main problem is one of synchronization, as an occurrence of the encoded pattern in the encoded text does not necessarily correspond to an occurrence of the pattern in the text. A simple algorithm is suggested which reduces the number of false hits. The probability of false ...

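    A tiny demonstration of the synchronization problem: searching for the encoded pattern as a raw bit string can "hit" at positions that do not fall on codeword boundaries. The fixed prefix-free code is a hypothetical stand-in for a static Huffman code, and this is only the naive bit-string search, not the algorithm suggested in the paper.

    ```python
    # A hypothetical static prefix-free (Huffman-like) code.
    CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}

    def encode(s):
        return "".join(CODE[ch] for ch in s)

    def codeword_boundaries(bits):
        """Bit offsets at which codewords start or end in an encoded text."""
        inv = {v: k for k, v in CODE.items()}
        bounds, cur = {0}, ""
        for j, bit in enumerate(bits):
            cur += bit
            if cur in inv:
                bounds.add(j + 1)
                cur = ""
        return bounds

    text, pattern = "acbab", "ab"
    tbits, pbits = encode(text), encode(pattern)
    bounds = codeword_boundaries(tbits)
    hits = [i for i in range(len(tbits)) if tbits.startswith(pbits, i)]
    true_hits = [i for i in hits if i in bounds and i + len(pbits) in bounds]
    print("bit-level hits:", hits)        # [3, 6] -- two raw bit-string matches
    print("aligned hits:  ", true_hits)   # [6]    -- only one is a real occurrence
    ```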
  • Software compression in the client/server environment

    Publication Year: 2001, Page(s):233 - 242
    Cited by:  Papers (5)  |  Patents (6)

    Lempel-Ziv (1977) based compression algorithms are universal, not assuming any prior knowledge of the file to be compressed or its statistics. Accordingly, the reference dictionary of these textual substitution compression algorithms includes only segments of the already-processed portion of the file. It is often the case, though, that both compressor and decompressor, even when they reside on di...

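    A present-day, concrete illustration of the shared-dictionary idea using zlib's preset-dictionary support (not the system described in the paper): when both client and server hold the same dictionary, even a short message can be coded against it. The dictionary and message contents are made up for the example.

    ```python
    import zlib

    # Strings the client and server are assumed to share in advance.
    PRESET = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n"
    message = b"GET /images/logo.png HTTP/1.1\r\nHost: example.com\r\nAccept: text/html\r\n"

    plain = zlib.compress(message, 9)                 # no shared dictionary

    co = zlib.compressobj(level=9, zdict=PRESET)      # server side
    with_dict = co.compress(message) + co.flush()

    do = zlib.decompressobj(zdict=PRESET)             # client side
    assert do.decompress(with_dict) + do.flush() == message

    # The dictionary-aided stream is typically much shorter for short messages.
    print(len(message), len(plain), len(with_dict))
    ```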
  • Quantized oversampled filter banks with erasures

    Publication Year: 2001, Page(s):173 - 182
    Cited by:  Papers (19)

    Oversampled filter banks can be used to enhance resilience to erasures in communication systems in much the same way that finite-dimensional frames have previously been applied. This paper extends previous finite-dimensional treatments to frames and signals in ℓ²(Z) with frame expansions that can be implemented efficiently with filter banks. It is shown that tight frames attain best per...

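    A finite-dimensional numerical sketch of the erasure-resilience idea that the paper extends to filter banks: expand a signal with an oversampled (here random) frame, quantize coarsely, erase a few coefficients, and reconstruct by least squares from the survivors. The random frame, step size, and number of erasures are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 8, 16                        # signal dimension, number of frame vectors
    F = rng.normal(size=(m, n))         # rows are frame vectors (oversampled, m > n)
    x = rng.normal(size=n)

    y = F @ x                           # frame expansion
    yq = 0.05 * np.round(y / 0.05)      # coarse uniform quantization

    erased = rng.choice(m, size=4, replace=False)   # four coefficients lost
    keep = np.setdiff1d(np.arange(m), erased)

    # Reconstruct from the surviving quantized coefficients (least squares).
    x_hat, *_ = np.linalg.lstsq(F[keep], yq[keep], rcond=None)
    print("erased:", erased, "reconstruction MSE:", float(np.mean((x - x_hat) ** 2)))
    ```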
  • Design of trellis codes for source coding with side information at the decoder

    Publication Year: 2001, Page(s):361 - 370
    Cited by:  Papers (14)  |  Patents (4)

    The problem of source coding with side information at the decoder arises in many practical scenarios. Although this problem has been well characterized in information theory, particularly by the work of Wyner and Ziv (1976), there is still a lack of successful algorithms for it. In this paper, we use trellis codes to approach the theoretical limit. An embedded trellis code structure is proposed, and...

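    The textbook binning intuition behind source coding with decoder side information, shown with block-code cosets rather than the trellis codes designed in the paper: the encoder sends only the syndrome (bin index) of X, and the decoder recovers X from that bin using the correlated side information Y. The 3-bit repetition-code example below is the standard illustration, not the paper's construction.

    ```python
    from itertools import product

    # Toy setup: X is 3 bits, the side information Y differs from X in at
    # most one bit.  Bin X by its syndrome w.r.t. the length-3 repetition
    # code {000, 111} (minimum distance 3).
    H = [(1, 1, 0), (0, 1, 1)]                  # parity-check matrix

    def syndrome(x):
        return tuple(sum(h * xi for h, xi in zip(row, x)) % 2 for row in H)

    def decode(s, y):
        """Pick the word in bin s closest (in Hamming distance) to y."""
        bin_s = [x for x in product((0, 1), repeat=3) if syndrome(x) == s]
        return min(bin_s, key=lambda x: sum(a != b for a, b in zip(x, y)))

    x = (1, 0, 1)                 # source word (3 bits)
    y = (1, 1, 1)                 # side information within Hamming distance 1
    s = syndrome(x)               # encoder sends only these 2 bits
    print("syndrome:", s, "decoded:", decode(s, y), "correct:", decode(s, y) == x)
    ```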
  • Successive refinement on trees: a special case of a new MD coding region

    Publication Year: 2001, Page(s):293 - 301
    Cited by:  Papers (7)

    New achievability results for the L-stage successive refinement problem with L>2 are presented. These are derived from a recent achievability result for the more general problem of multiple description (MD) coding with L>2 channels. It is shown that successive refinability on chains implies successive refinability on trees and that memoryless Gaussian sources are successively refinable on ch...

  • Video residual coding using SPIHT and dependent optimization

    Publication Year: 2001, Page(s):113 - 122
    Cited by:  Papers (2)

    We introduce a video residual coding technique based on wavelet transforms and set partitioning in hierarchical trees (SPIHT). In this scheme, wavelet coefficients that correspond to the same spatial location in the image domain are grouped together to form wavelet blocks. Each wavelet block is then converted to a SPIHT-compatible bit stream by an optimized SPIHT encoder minimizing the Lagrangian ...

  • On the hardness of finding optimal multiple preset dictionaries

    Publication Year: 2001, Page(s):411 - 418
    Cited by:  Patents (3)

    Preset dictionaries for Huffman codes are used effectively in fax transmission and JPEG encoding. A natural extension is to allow multiple preset dictionaries instead of just one. We show, however, that finding optimal multiple preset dictionaries for Huffman and LZ77-based compression schemes is NP-hard.
