Proceedings DCC 2001. Data Compression Conference

27-29 March 2001

  • Proceedings DCC 2001. Data Compression Conference

    Publication Year: 2001
  • Universal lossless compression of piecewise stationary slowly varying sources

    Publication Year: 2001, Page(s):371 - 380
    Cited by:  Papers (1)

    Universal lossless compression of parametric piecewise stationary sources is investigated, where the statistics change slowly between stationary segments over unknown time intervals. The minimum description length (MDL) principle is derived for two different settings of this problem under the assumption that the parameter changes are linear over the change interval. In the first sett...
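
    As a rough illustration of the two-part MDL principle invoked here (not the paper's derivation, which models linear parameter drift), the sketch below picks a single abrupt changepoint for a binary source by minimizing data code length plus a (1/2) log2 n penalty per parameter:

    ```python
    import math

    def bernoulli_code_length(ones, n):
        """Ideal code length (bits) of n Bernoulli symbols under the ML parameter."""
        if n == 0 or ones == 0 or ones == n:
            return 0.0
        p = ones / n
        return -(ones * math.log2(p) + (n - ones) * math.log2(1 - p))

    def mdl_changepoint(bits):
        """Two-part MDL: data cost of each segment plus (1/2) log2 n bits
        for each of the two segment parameters."""
        n = len(bits)
        prefix = [0]
        for b in bits:
            prefix.append(prefix[-1] + b)
        cost, t = min(
            (bernoulli_code_length(prefix[t], t)
             + bernoulli_code_length(prefix[n] - prefix[t], n - t)
             + math.log2(n),                    # 2 parameters * (1/2) log2 n
             t)
            for t in range(1, n))
        return t, cost

    bits = [0] * 60 + [1] * 40    # hypothetical: statistics jump at t = 60
    print(mdl_changepoint(bits))  # -> (60, 6.64...)
    ```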

  • Author index

    Publication Year: 2001, Page(s):531 - 533
  • Quantized oversampled filter banks with erasures

    Publication Year: 2001, Page(s):173 - 182
    Cited by:  Papers (20)

    Oversampled filter banks can be used to enhance resilience to erasures in communication systems in much the same way that finite-dimensional frames have previously been applied. This paper extends previous finite-dimensional treatments to frames and signals in l^2(Z) with frame expansions that can be implemented efficiently with filter banks. It is shown that tight frames attain best per...
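
    The finite-dimensional mechanism being generalized can be sketched directly (illustrative numbers, not the paper's filter-bank construction): a tight frame for R^2 with redundancy 3/2 survives the erasure of any one coefficient.

    ```python
    import numpy as np

    # "Mercedes-Benz" tight frame for R^2: three unit vectors 120 degrees apart.
    angles = np.pi / 2 + np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
    F = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # 3 x 2 analysis map

    x = np.array([0.7, -1.2])
    y = F @ x                      # 3 frame coefficients for a 2-D signal

    y[1] = np.nan                  # one coefficient erased in transmission
    keep = ~np.isnan(y)
    x_hat = np.linalg.pinv(F[keep]) @ y[keep]  # any 2 of 3 rows still span R^2

    print(np.allclose(x, x_hat))   # True: exact recovery despite the erasure
    ```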

  • Compressing XML with multiplexed hierarchical PPM models

    Publication Year: 2001, Page(s):163 - 172
    Cited by:  Papers (52)  |  Patents (9)

    We established a working Extensible Markup Language (XML) compression benchmark based on text compression, and found that bzip2 compresses XML best, albeit more slowly than gzip. Our experiments verified that XMILL speeds up and improves compression using gzip and bounded-context PPM by up to 15%, but found that it worsens the compression for bzip2 and PPM. We describe alternative appr...
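
    The text-compression baseline described here is easy to reproduce in miniature with Python's standard library (the XML payload below is made up; XMILL and the paper's multiplexed PPM models are not involved):

    ```python
    import bz2, gzip, time

    def benchmark(name, compress, data):
        t0 = time.perf_counter()
        out = compress(data)
        dt = time.perf_counter() - t0
        print(f"{name}: {len(data)} -> {len(out)} bytes in {dt * 1000:.1f} ms")

    xml = (b"<log>" + b"".join(
        b'<entry id="%d"><user>alice</user><action>login</action></entry>' % i
        for i in range(2000)) + b"</log>")

    benchmark("gzip ", gzip.compress, xml)  # faster, weaker
    benchmark("bzip2", bz2.compress, xml)   # stronger, slower
    ```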

  • Combining PPM models using a text mining approach

    Publication Year: 2001, Page(s):153 - 162
    Cited by:  Papers (2)

    This paper introduces a novel switching method which can be used to combine two or more PPM models. The method derives from our earlier work on modelling English and on text mining, and takes advantage of both to improve compression performance significantly. The performance of the combination of models is at least as good as (and in many cases significantly better than) the best ...
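
    A minimal stand-in for combining models, assuming order-0 byte models in place of PPM and a Bayesian mixture in place of the paper's switching method:

    ```python
    import math
    from collections import Counter

    class Order0Model:
        """Laplace-smoothed order-0 byte model trained on a sample text."""
        def __init__(self, sample):
            self.counts = Counter(sample)
            self.total = len(sample)
        def prob(self, byte):
            return (self.counts[byte] + 1) / (self.total + 256)

    def mixture_code_length(models, text):
        """Bits needed to code `text` under a mixture of the models; the
        weights drift toward whichever model has predicted the symbols
        seen so far best."""
        weights = [1.0 / len(models)] * len(models)
        bits = 0.0
        for byte in text:
            probs = [m.prob(byte) for m in models]
            p = sum(w * q for w, q in zip(weights, probs))
            bits -= math.log2(p)
            weights = [w * q / p for w, q in zip(weights, probs)]  # posterior
        return bits

    english = Order0Model(b"the quick brown fox jumps over the lazy dog " * 20)
    digits = Order0Model(b"3141592653589793238462643383279502884197 " * 20)
    print(mixture_code_length([english, digits], b"some mixed text 123 456"))
    ```

    By construction, the mixture's code length is within log2(k) bits of the best of the k component models, which matches the "at least as good as the best" flavor of the claim above.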

  • Optimal prefix-free codes that end in a specified pattern and similar problems: the uniform probability case

    Publication Year: 2001, Page(s):143 - 152
    Cited by:  Papers (2)

    In this paper we discuss the problem of constructing minimum-cost, prefix-free codes for equiprobable words under the assumption that all codewords are restricted to belong to an arbitrary language L. We examine how, given certain types of L, the structure of the minimum-cost code changes as n, the number of codewords, grows.
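
    One simple construction for such codes (illustrative, and not necessarily the paper's optimal one): binary strings whose only occurrence of a given pattern is as a suffix form a prefix-free set, and the n shortest can be enumerated breadth-first.

    ```python
    from collections import deque

    def count_overlapping(s, pat):
        """Occurrences of pat in s, overlaps allowed."""
        count, i = 0, s.find(pat)
        while i >= 0:
            count += 1
            i = s.find(pat, i + 1)
        return count

    def pattern_suffix_codewords(pattern, n):
        """The n shortest binary strings whose only occurrence of `pattern`
        is as a suffix. Prefix-free: if x were a proper prefix of y, x's
        suffix occurrence would be a second occurrence inside y."""
        out, queue = [], deque([""])
        while queue and len(out) < n:
            s = queue.popleft()
            occ = count_overlapping(s, pattern)
            if occ == 1 and s.endswith(pattern):
                out.append(s)          # codeword: stop extending this branch
            elif occ == 0:
                queue.append(s + "0")  # pattern not seen yet: keep growing
                queue.append(s + "1")
            # otherwise the pattern appeared too early: prune the branch
        return out

    print(pattern_suffix_codewords("01", 5))  # ['01', '001', '101', '0001', '1001']
    ```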

  • Embedded image coding using zeroblocks of subband/wavelet coefficients and context modeling

    Publication Year: 2001, Page(s):83 - 92
    Cited by:  Papers (15)

    In this paper, we present a new embedded wavelet image coding system using quadtree splitting and context modeling. It features low computational complexity and high compression efficiency, thanks to joint utilization of two powerful embedded coding techniques: set partitioning and context modeling. With effective exploitation of strong statistical dependencies among the quadtree nodes built up fro...
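
    The quadtree-splitting step can be sketched on its own (a simplified illustration; the context modeling and entropy coding described above are omitted):

    ```python
    import numpy as np

    def significance_quadtree(block, threshold, origin=(0, 0)):
        """Quadtree splitting of a power-of-two square coefficient block:
        a block with no coefficient >= threshold is coded with a single
        'zeroblock' symbol; otherwise it splits into four quadrants."""
        n = block.shape[0]
        r, c = origin
        if np.max(np.abs(block)) < threshold:
            return [("zeroblock", r, c, n)]
        if n == 1:
            return [("significant", r, c, 1)]
        h = n // 2
        events = []
        for dr in (0, h):
            for dc in (0, h):
                events += significance_quadtree(
                    block[dr:dr + h, dc:dc + h], threshold, (r + dr, c + dc))
        return events

    coeffs = np.random.default_rng(0).normal(scale=4, size=(8, 8)).round()
    print(significance_quadtree(coeffs, threshold=8))
    ```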

  • Length-restricted coding in static and dynamic frameworks

    Publication Year: 2001, Page(s):133 - 142
    Cited by:  Papers (2)

    This paper describes variants of a recent length-restricted coding technique for use in static and dynamic frameworks. The resulting compression systems are shown to have asymptotic time complexity identical to, and compression performance competitive with, the corresponding unrestricted systems.

  • Compressed pattern matching for SEQUITUR

    Publication Year: 2001, Page(s):469 - 478
    Cited by:  Papers (5)

    SEQUITUR, due to Nevill-Manning and Witten (see Journal of Artificial Intelligence Research, vol.7, p.67-82, 1997), is a powerful program that infers a phrase hierarchy from the input text and also provides extremely effective compression of large quantities of semi-structured text. In this paper, we address the problem of searching SEQUITUR-compressed text directly. We show a compressed pattern m...
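
    For orientation, a SEQUITUR grammar is a set of straight-line rules, and the baseline this paper improves on is to expand the grammar and then search the plain text (the toy grammar below is illustrative):

    ```python
    def expand(rules, sym, memo=None):
        """Expand a nonterminal of a straight-line grammar into its string."""
        if memo is None:
            memo = {}
        if sym not in rules:
            return sym                 # terminal character
        if sym not in memo:
            memo[sym] = "".join(expand(rules, s, memo) for s in rules[sym])
        return memo[sym]

    # a tiny SEQUITUR-style grammar: S -> A A b, A -> a b
    rules = {"S": ["A", "A", "b"], "A": ["a", "b"]}

    def decompress_then_search(rules, start, pattern):
        """The naive baseline; the paper matches against the rule
        hierarchy directly instead of expanding the whole text."""
        return expand(rules, start).find(pattern)

    print(decompress_then_search(rules, "S", "ba"))  # "ababb".find("ba") == 1
    ```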

  • Design of trellis codes for source coding with side information at the decoder

    Publication Year: 2001, Page(s):361 - 370
    Cited by:  Papers (14)  |  Patents (4)

    The problem of source coding with side information at the decoder arises in many practical scenarios. Although this problem has been well characterized in information theory, particularly by the work of Wyner and Ziv (1976), there is still a lack of successful algorithms for it. In this paper, we use trellis codes to approach the theoretical limit. An embedded trellis code structure is proposed, and...

  • Group testing for wavelet packet image compression

    Publication Year: 2001, Page(s):73 - 82

    This paper introduces group testing for wavelet packets (GTWP), a novel embedded image compression algorithm based on wavelet packets and group testing. This algorithm extends the group testing for wavelets (GTW) algorithm to handle wavelet packets. Like its predecessor, GTWP obtains good compression performance without the use of arithmetic coding. It also shows that the group testing methodology...
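
    The group-testing idea can be illustrated independently of wavelets (a sketch, not the GTWP algorithm): test a whole group of coefficients for significance at once, and split only the groups that test positive.

    ```python
    import numpy as np

    def group_test(coeffs, threshold, lo=0, hi=None, out=None):
        """Binary group testing: one negative test clears a whole group,
        so long insignificant runs are cheap; positive groups are halved
        and tested recursively until single coefficients are isolated."""
        if hi is None:
            hi, out = len(coeffs), []
        if np.max(np.abs(coeffs[lo:hi])) < threshold:
            return out                 # negative test clears coeffs[lo:hi]
        if hi - lo == 1:
            out.append(lo)             # isolated significant coefficient
            return out
        mid = (lo + hi) // 2
        group_test(coeffs, threshold, lo, mid, out)
        group_test(coeffs, threshold, mid, hi, out)
        return out

    c = np.array([0.1, -0.2, 9.0, 0.3, 0.0, 0.1, -7.5, 0.2])
    print(group_test(c, threshold=1.0))  # -> [2, 6]
    ```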

  • Compression of the layered depth image

    Publication Year: 2001, Page(s):331 - 340
    Cited by:  Papers (2)  |  Patents (8)

    A layered depth image (LDI) is a popular new representation and rendering method for objects with complex geometries. Similar to a 2D image, the LDI consists of an array of pixels. However, unlike the 2D image, an LDI pixel has depth information, and there are multiple layers at a pixel location. In this paper, we develop a novel LDI compression algorithm that handles the multiple layers and the d...

  • Asymptotically optimal scalable coding for minimum weighted mean square error

    Publication Year: 2001, Page(s):43 - 52
    Cited by:  Papers (4)  |  Patents (1)

    We derive an asymptotically optimal multi-layer coding scheme for entropy-coded scalar quantizers (SQ) that minimizes the weighted mean-squared error (WMSE). The optimal entropy-coded SQ is non-uniform in the case of WMSE. The conventional multi-layer coder quantizes the base-layer reconstruction error at the enhancement layer, and is sub-optimal for the WMSE criterion. We consider the compander r...

  • Rate-distortion optimization for the SPIHT encoder

    Publication Year: 2001, Page(s):123 - 132
    Cited by:  Papers (2)

    We study the rate-distortion performance of the set partitioning in hierarchical trees (SPIHT) image compression algorithm and present an optimization method that produces the bit stream with minimum Lagrangian cost J=D+λR for a given stopping criterion. Although there are only three applicable stop points at each bit plane, experiments show that substantial improvements can be obtained ove...
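
    The stated selection rule is direct to express (the rate-distortion pairs below are hypothetical):

    ```python
    def best_stop_point(candidates, lam):
        """Among candidate (rate, distortion) stop points, return the one
        minimizing the Lagrangian cost J = D + lambda * R."""
        return min(candidates, key=lambda rd: rd[1] + lam * rd[0])

    # (rate in bits, distortion as MSE) at a few candidate stop points
    candidates = [(1000, 90.0), (1400, 61.0), (1900, 47.5), (2600, 41.0)]
    print(best_stop_point(candidates, lam=0.02))  # -> (1900, 47.5)
    ```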

  • On the hardness of finding optimal multiple preset dictionaries

    Publication Year: 2001, Page(s):411 - 418
    Cited by:  Patents (3)

    Preset dictionaries for Huffman codes are used effectively in fax transmission and JPEG encoding. A natural extension is to allow multiple preset dictionaries instead of just one. We show, however, that finding optimal multiple preset dictionaries for Huffman and LZ77-based compression schemes is NP-hard.
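
    As background on the single-dictionary case, whose multiple-dictionary extension this paper proves NP-hard, Python's zlib exposes preset dictionaries for its LZ77-based DEFLATE coder (the dictionary and record contents below are made up):

    ```python
    import zlib

    # the record shares structure with the dictionary, so the LZ77 stage
    # can emit back-references into it from the very first byte
    dictionary = b'{"user": "", "action": "", "timestamp": ""}'
    record = b'{"user": "alice", "action": "login", "timestamp": "2001-03-27"}'

    plain = zlib.compress(record)

    comp = zlib.compressobj(zdict=dictionary)
    primed = comp.compress(record) + comp.flush()

    decomp = zlib.decompressobj(zdict=dictionary)
    assert decomp.decompress(primed) == record

    print(len(record), len(plain), len(primed))  # primed is typically smallest
    ```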

  • Faster approximate string matching over compressed text

    Publication Year: 2001, Page(s):459 - 468
    Cited by:  Papers (7)  |  Patents (7)

    Approximate string matching on compressed text was an open problem for almost a decade. The two existing solutions are very recent. Although they represent important complexity breakthroughs, in most practical cases they are not useful, in the sense that they are slower than decompressing the text and then searching the uncompressed text. We present a different approach, which reduces the problem...

  • Joint source channel coding using arithmetic codes and trellis coded modulation

    Publication Year: 2001, Page(s):302 - 311
    Cited by:  Papers (10)

    Previous work has indicated that using an arithmetic encoder with reserved probability space can provide powerful error detection and error correction when used with a sequential decoding algorithm. However, performance improvements were limited at high error rates, principally because of the lack of an explicit decoding tree. In this work, a trellis coded modulation scheme is used to provide a con...

  • Managing drift in DCT-based scalable video coding

    Publication Year: 2001, Page(s):351 - 360
    Cited by:  Papers (7)

    When compressed video is transmitted over erasure-prone channels, errors will propagate whenever temporal or spatial prediction is used. Typical tools to combat this error propagation are packetization, re-synchronizing codewords, intra-coding, and scalability. In recent years, the concern over so-called “drift” has sent researchers toward structures for scalability that do not use enh...

  • Enhancing analog image transmission systems using digital side information: a new wavelet-based image coding paradigm

    Publication Year: 2001, Page(s):63 - 72
    Cited by:  Papers (29)

    We address digital transmission for enhancing, in a backward compatible way, the quality of analog image transmission systems. We propose a practical algorithm that treats the problem as one of wavelet image compression with side information (available in the form of a noisy analog version of the image) present at the decoder. We propose a rate allocation technique to efficiently allocate the rate...

  • Enhancing image coders by using spatial noise shaping (SNS)

    Publication Year: 2001, Page(s):321 - 330

    We have developed spatial noise shaping (SNS) and demonstrated that it can enhance the performance of an image coder. SNS runs open-loop 2-D linear prediction (LP) in the frequency domain instead of the time domain, in contrast to the standard LP used in speech and image coding. This predictive analysis/synthesis process over frequency possesses two important properties which result in a decoded im...

  • Robust predictive vector quantizer design

    Publication Year: 2001, Page(s):33 - 42
    Cited by:  Papers (6)

    The design of predictive quantizers generally suffers from difficulties due to the prediction loop, which have an impact on the convergence and the stability of the design procedure. We previously proposed an asymptotically closed-loop approach to quantizer design for predictive coding applications, which benefits from the stability of open-loop design while asymptotically optimizing the actual cl...

  • Video residual coding using SPIHT and dependent optimization

    Publication Year: 2001, Page(s):113 - 122
    Cited by:  Papers (2)

    We introduce a video residual coding technique based on wavelet transforms and set partitioning in hierarchical trees (SPIHT). In this scheme, wavelet coefficients that correspond to the same spatial location in the image domain are grouped together to form wavelet blocks. Each wavelet block is then converted to a SPIHT-compatible bit stream by an optimized SPIHT encoder minimizing the Lagrangian ...

  • Network vector quantization

    Publication Year: 2001, Page(s):13 - 22
    Cited by:  Papers (13)

    A network source code is an optimal source code for a network. To design network source codes, we require each node to have a single encoder, which jointly encodes all messages transmitted by that node, and a single decoder, which jointly decodes all messages arriving at that node. Given a distribution over the sources, the design of the network source code jointly optimizes all encoders and decod...

  • Overlap in adaptive vector quantization

    Publication Year: 2001, Page(s):401 - 410

    Constantinescu and Storer (1994) introduced an adaptive single-pass vector quantization algorithm (AVQ) that employs codebook entries of variable size and shape, which are “learned” as an image is processed (no specific training or prior knowledge of the data is used). The approach allows the tradeoff between compression and fidelity to be continuously adjusted, from lossless (with less com...
