Proceedings DCC 2001. Data Compression Conference

27-29 March 2001

Displaying results 1–25 of 50
  • Proceedings DCC 2001. Data Compression Conference

    Publication Year: 2001
    PDF (431 KB)
    Freely Available from IEEE
  • Universal lossless compression of piecewise stationary slowly varying sources

    Publication Year: 2001, Page(s):371 - 380
    Cited by:  Papers (1)
    PDF (524 KB)

    Universal lossless compression of parametric piecewise stationary sources with slow changes in the statistics between stationary segments that take place in unknown time intervals is investigated. The minimum description length (MDL) principle is derived for two different settings of this problem under the assumption that the parameter changes are linear over the change interval. In the first sett...
  • Author index

    Publication Year: 2001, Page(s):531 - 533
    PDF (118 KB)
    Freely Available from IEEE
  • Semidefinite programs for the design of codes for delay-constrained communication in networks

    Publication Year: 2001, Page(s):183 - 192
    Cited by:  Papers (2)
    PDF (468 KB)

    We consider the problem of designing codes for delay-constrained communication in packet networks. We show how good codes for this problem can be characterized as the solution of a pair of semidefinite programs plus a rank constraint. Using this characterization, we formulate the design problem as one of finding suitable graph embeddings into Euclidean space, for a certain family of...
  • Quantized oversampled filter banks with erasures

    Publication Year: 2001, Page(s):173 - 182
    Cited by:  Papers (20)
    PDF (424 KB)

    Oversampled filter banks can be used to enhance resilience to erasures in communication systems in much the same way that finite-dimensional frames have previously been applied. This paper extends previous finite-dimensional treatments to frames and signals in ℓ²(Z) with frame expansions that can be implemented efficiently with filter banks. It is shown that tight frames attain best per...
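
    A finite-dimensional toy version of the erasure-recovery idea (an illustration, not the paper's ℓ²(Z) filter-bank construction): expand a vector in a redundant tight frame, drop a coefficient in transit, and reconstruct from the survivors.

        import numpy as np

        # Four-vector tight frame for R^2 (rows satisfy F.T @ F = 3I);
        # the redundancy is what buys erasure resilience.
        F = np.array([[1, 0], [0, 1], [1, 1], [1, -1]], dtype=float)
        x = np.array([0.7, -1.3])
        y = F @ x                        # 4 frame coefficients for 2 dimensions

        kept = [0, 2, 3]                 # coefficient 1 erased in transit
        x_hat = np.linalg.pinv(F[kept]) @ y[kept]
        print(np.allclose(x, x_hat))     # surviving rows still span R^2 -> True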
  • Compressing XML with multiplexed hierarchical PPM models

    Publication Year: 2001, Page(s):163 - 172
    Cited by:  Papers (52)  |  Patents (9)
    PDF (600 KB)

    We established a working Extensible Markup Language (XML) compression benchmark based on text compression, and found that bzip2 compresses XML best, albeit more slowly than gzip. Our experiments verified that XMILL speeds up and improves compression using gzip and bounded-context PPM by up to 15%, but found that it worsens the compression for bzip2 and PPM. We describe alternative appr...
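
    The multiplexing idea is easy to try with off-the-shelf coders standing in for the paper's hierarchical PPM models: route markup and character data to separate compressors and compare against compressing the document as one stream. A rough sketch on an invented toy document:

        import bz2
        import re

        # Invented toy XML document with regular structure.
        xml = b"<log>" + b"".join(
            b"<entry id='%d'><msg>event number %d</msg></entry>" % (i, i)
            for i in range(200)) + b"</log>"

        tags = b"".join(re.findall(rb"<[^>]*>", xml))   # markup stream
        text = re.sub(rb"<[^>]*>", b"", xml)            # character-data stream

        whole = len(bz2.compress(xml))
        split = len(bz2.compress(tags)) + len(bz2.compress(text))
        print("single stream:", whole, "bytes; two streams:", split, "bytes")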
  • Software compression in the client/server environment

    Publication Year: 2001, Page(s):233 - 242
    Cited by:  Papers (6)  |  Patents (6)
    PDF (504 KB)

    Lempel-Ziv (1977) based compression algorithms are universal, not assuming any prior knowledge of the file to be compressed or its statistics. Accordingly, the reference dictionary of these textual substitution compression algorithms includes only segments of the already-processed portion of the file. It is often the case, though, that both compressor and decompressor, even when they reside on di...
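
    The shared-context idea maps directly onto zlib's preset-dictionary feature; a minimal sketch (the dictionary contents here are made up) in which both sides seed the LZ77 window with text they already hold:

        import zlib

        # Text assumed to be present on both client and server beforehand.
        shared = b"GET POST HTTP/1.1 Content-Type: text/html Content-Length:"
        payload = b"GET /index.html HTTP/1.1\r\nContent-Type: text/html\r\n"

        comp = zlib.compressobj(level=9, zdict=shared)
        blob = comp.compress(payload) + comp.flush()

        # The decompressor must be seeded with the exact same dictionary.
        decomp = zlib.decompressobj(zdict=shared)
        assert decomp.decompress(blob) == payload
        print(len(payload), "->", len(blob), "bytes")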
  • Pattern matching in Huffman encoded texts

    Publication Year: 2001, Page(s):449 - 458
    Cited by:  Papers (5)
    PDF (444 KB)

    The possibility of locating a pattern directly in a text which has been encoded by a static Huffman code is investigated. The main problem is one of synchronization, as an occurrence of the encoded pattern in the encoded text does not necessarily correspond to an occurrence of the pattern in the text. A simple algorithm is suggested which reduces the number of false hits. The probability of false ...
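
    The synchronization problem is easy to reproduce: bit-level matches of the encoded pattern need not start on a codeword boundary. A small self-contained sketch (the toy text and pattern are invented, not the paper's):

        import heapq
        from collections import Counter

        def huffman_code(text):
            # Static Huffman code from symbol frequencies; each heap entry
            # carries a partial codebook for the subtree it represents.
            heap = [(n, i, {c: ""}) for i, (c, n) in enumerate(Counter(text).items())]
            heapq.heapify(heap)
            tiebreak = len(heap)
            while len(heap) > 1:
                na, _, a = heapq.heappop(heap)
                nb, _, b = heapq.heappop(heap)
                merged = {c: "0" + w for c, w in a.items()}
                merged.update({c: "1" + w for c, w in b.items()})
                heapq.heappush(heap, (na + nb, tiebreak, merged))
                tiebreak += 1
            return heap[0][2]

        text = "abracadabra" * 50                       # toy corpus
        code = huffman_code(text)
        encode = lambda s: "".join(code[c] for c in s)
        bits, pat = encode(text), encode("abra")

        # All bit positions where the encoded pattern occurs...
        hits = [i for i in range(len(bits) - len(pat) + 1)
                if bits.startswith(pat, i)]
        # ...versus those aligned to a codeword boundary (the true ones).
        bounds, j = {0}, 0
        for c in text:
            j += len(code[c])
            bounds.add(j)
        print(len(hits), "bit matches,", sum(i in bounds for i in hits), "aligned")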
  • Joint source channel coding using arithmetic codes and trellis coded modulation

    Publication Year: 2001, Page(s):302 - 311
    Cited by:  Papers (10)
    PDF (496 KB)

    Previous work has indicated that using an arithmetic encoder with reserved probability space can provide powerful error detection and error correction when used with a sequential decoding algorithm. However, performance improvements were limited at high error rates, principally because of the lack of an explicit decoding tree. In this work, a trellis coded modulation scheme is used to provide a con...
  • Combining PPM models using a text mining approach

    Publication Year: 2001, Page(s):153 - 162
    Cited by:  Papers (2)
    PDF (604 KB)

    This paper introduces a novel switching method which can be used to combine two or more PPM models. The work derives from our earlier work on modelling English and text mining, and the approach takes advantage of both to help improve the compression performance significantly. The performance of the combination of models is at least as good as (and in many cases significantly better than) the best ...
  • Streaming thin client compression

    Publication Year: 2001, Page(s):223 - 232
    Cited by:  Papers (1)  |  Patents (10)
    PDF (540 KB)

    Thin client compression (TCC) achieves the best compression efficiency for sequences of synthetic images. This paper presents a streaming version of TCC (STCC) that almost fully retains the excellent compression efficiency of the original algorithm. When sending images over low-bandwidth networks, STCC dramatically reduces end-to-end latency by pipelining rows and overlapping the compression, tran...
  • Space-time tradeoffs in the inverse B-W transform

    Publication Year: 2001, Page(s):439 - 448
    Cited by:  Papers (5)  |  Patents (1)
    PDF (500 KB)

    Lossless text compression based on the Burrows-Wheeler transform (BWT) has become popular. Compression-time issues, such as MTF coding or the avoidance thereof (Wirth 2000), encoding the MTF values, and fast sorting (Seward 2000), have seen considerable investigation. Decompression-time issues remain underinvestigated. For some applications, decompression-time behaviour is a critical issue. For example, boot-ti...
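
    For reference, the transform pair fits in a few lines; a toy sentinel-terminated Python version (nothing like the paper's engineered variants) that inverts with the next-row mapping rather than by materializing the sorted matrix:

        def bwt(s):
            # Last column of the sorted rotations of s plus a unique sentinel.
            s += "\0"
            return "".join(r[-1] for r in sorted(s[i:] + s[:i] for i in range(len(s))))

        def ibwt(last):
            # T[k] is the row whose rotation starts one text position after
            # row k's; following T from the sentinel row replays the string.
            n = len(last)
            T = sorted(range(n), key=lambda i: (last[i], i))
            first = sorted(last)
            out, k = [], T[0]
            for _ in range(n):
                out.append(first[k])
                k = T[k]
            return "".join(out).rstrip("\0")

        assert ibwt(bwt("banana")) == "banana"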
  • Successive refinement on trees: a special case of a new MD coding region

    Publication Year: 2001, Page(s):293 - 301
    Cited by:  Papers (7)
    PDF (312 KB)

    New achievability results for the L-stage successive refinement problem with L > 2 are presented. These are derived from a recent achievability result for the more general problem of multiple description (MD) coding with L > 2 channels. It is shown that successive refinability on chains implies successive refinability on trees and that memoryless Gaussian sources are successively refinable on ch...
  • Optimal prefix-free codes that end in a specified pattern and similar problems: the uniform probability case

    Publication Year: 2001, Page(s):143 - 152
    Cited by:  Papers (2)
    PDF (396 KB)

    In this paper we discuss the problem of constructing minimum-cost, prefix-free codes for equiprobable words under the assumption that all codewords are restricted to belonging to an arbitrary language L. We examine how, given certain types of L, the structure of the minimum-cost code changes as n, the number of codewords, grows.
  • Compressing the graph structure of the Web

    Publication Year: 2001, Page(s):213 - 222
    Cited by:  Papers (21)  |  Patents (6)
    PDF (596 KB)

    A large amount of research has recently focused on the graph structure (or link structure) of the World Wide Web. This structure has proven to be extremely useful for improving the performance of search engines and other tools for navigating the Web. However, since the graphs in these scenarios involve hundreds of millions of nodes and even more edges, highly space-efficient data structures are ne...
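
    The standard space-saving trick in this setting (a generic sketch, not necessarily this paper's scheme) is to sort each adjacency list, store gaps between neighbor ids, and pack the gaps as variable-length bytes:

        def encode_adj(neighbors):
            # Gap-encode a sorted adjacency list, then varint-pack the gaps;
            # locality of links keeps the gaps, and hence the bytes, small.
            out, prev = bytearray(), -1
            for v in sorted(neighbors):
                gap = v - prev
                prev = v
                while gap >= 0x80:              # 7 bits per byte, high bit = more
                    out.append((gap & 0x7F) | 0x80)
                    gap >>= 7
                out.append(gap)
            return bytes(out)

        links = [100000, 100003, 100009, 100050]
        print(len(encode_adj(links)), "bytes vs", 4 * len(links), "as 4-byte ints")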
  • Network vector quantization

    Publication Year: 2001, Page(s):13 - 22
    Cited by:  Papers (13)
    PDF (528 KB)

    A network source code is an optimal source code for a network. To design network source codes, we require each node to have a single encoder, which jointly encodes all messages transmitted by that node, and a single decoder, which jointly decodes all messages arriving at that node. Given a distribution over the sources, the design of the network source code jointly optimizes all encoders and decod...
  • Compression of the layered depth image

    Publication Year: 2001, Page(s):331 - 340
    Cited by:  Papers (2)  |  Patents (8)
    PDF (592 KB)

    A layered depth image (LDI) is a popular new representation and rendering method for objects with complex geometries. Similar to a 2D image, the LDI consists of an array of pixels. However, unlike the 2D image, an LDI pixel has depth information, and there are multiple layers at a pixel location. In this paper, we develop a novel LDI compression algorithm that handles the multiple layers and the d...
  • Design of trellis codes for source coding with side information at the decoder

    Publication Year: 2001, Page(s):361 - 370
    Cited by:  Papers (14)  |  Patents (4)
    PDF (420 KB)

    The problem of source coding with side information at the decoder arises in many practical scenarios. Although this problem has been well characterized in information theory, particularly by the work of Wyner and Ziv (1976), there is still a lack of successful algorithms for it. In this paper, we use trellis codes to approach the theoretical limit. An embedded trellis code structure is proposed, and...
  • Parsing strategies for BWT compression

    Publication Year: 2001, Page(s):429 - 438
    Cited by:  Papers (6)
    PDF (560 KB)

    Block-sorting is an innovative compression mechanism introduced by Burrows and Wheeler (1994), and has been the subject of considerable scrutiny in the years since it first became public. Block-sorting compression is usually described as involving three steps: permuting the input one block at a time through the use of the Burrows-Wheeler transform (BWT); applying a move-to-front (MTF) transform to...
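
    Of the three steps, the MTF transform is the simplest to state; a minimal sketch showing how the symbol runs produced by the BWT become runs of small ranks for the final entropy coder:

        def mtf(seq, alphabet):
            # Emit each symbol's current position in the table, then move
            # the symbol to the front; repeated symbols emit zeros.
            table = list(alphabet)
            ranks = []
            for c in seq:
                i = table.index(c)
                ranks.append(i)
                table.insert(0, table.pop(i))
            return ranks

        bwt_out = "annb\0aa"                        # BWT of "banana" + sentinel
        print(mtf(bwt_out, sorted(set(bwt_out))))   # [1, 3, 0, 3, 3, 3, 0]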
  • Overlap in adaptive vector quantization

    Publication Year: 2001, Page(s):401 - 410
    PDF (600 KB)

    Constantinescu and Storer (1994) introduced an adaptive single-pass vector quantization algorithm (AVQ) that employs variable-size, variable-shape codebook entries that are "learned" as an image is processed (no specific training or prior knowledge of the data is used). The approach allows the tradeoff between compression and fidelity to be continuously adjusted from lossless (with less com...
  • Joint source-channel decoding of correlated sources over noisy channels

    Publication Year: 2001, Page(s):283 - 292
    Cited by:  Papers (50)  |  Patents (9)
    PDF (436 KB)

    We consider the case of two correlated binary information sequences. Instead of compressing the information using source coding, both sequences are independently channel encoded and transmitted over an AWGN channel. The correlation between both sequences is exploited at the receiver, allowing reliable communications at signal-to-noise ratios very close to the theoretical limits established by the...
  • Group testing for wavelet packet image compression

    Publication Year: 2001, Page(s):73 - 82
    PDF (444 KB)

    This paper introduces group testing for wavelet packets (GTWP), a novel embedded image compression algorithm based on wavelet packets and group testing. This algorithm extends the group testing for wavelets (GTW) algorithm to handle wavelet packets. Like its predecessor, GTWP obtains good compression performance without the use of arithmetic coding. It also shows that the group testing methodology...
  • Length-restricted coding in static and dynamic frameworks

    Publication Year: 2001, Page(s):133 - 142
    Cited by:  Papers (2)
    PDF (504 KB)

    This paper describes variants of a recent length-restricted coding technique for use in static and dynamic frameworks. The resulting compression systems are shown to have the same asymptotic time complexity as, and compression performance competitive with, the corresponding unrestricted systems.
  • Construction of low complexity regular quantizers for overcomplete expansions in R^N

    Publication Year: 2001, Page(s):193 - 202
    Cited by:  Papers (1)
    PDF (532 KB)

    We study the construction of structured regular quantizers for overcomplete expansions in R^N. Our goal is to design structured quantizers allowing simple reconstruction algorithms with low (memory and computational) complexity and having good performance in terms of accuracy. Most related work to date in quantized redundant expansions has assumed that uniform scalar quantization with th...
  • Tag insertion complexity

    Publication Year: 2001, Page(s):243 - 252
    Cited by:  Patents (38)
    PDF (492 KB)

    This paper is about inferring markup information, a generalization of part-of-speech tagging. We use compression models based on a marked-up training corpus and apply them to fresh, unmarked text. In effect, this technique builds filters that extract information from text in a way that is generalized because it depends on training text rather than preprogrammed heuristics. As illustrated, we use ...
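
    A crude stand-in for the compression-model idea (bz2 here rather than the paper's PPM models; the corpora are invented): label a snippet with whichever class of training text helps compress it most.

        import bz2

        def best_label(snippet, corpora):
            # Cost of a snippet under a class = extra bytes it adds when
            # compressed together with that class's training text.
            def cost(corpus):
                return len(bz2.compress(corpus + snippet)) - len(bz2.compress(corpus))
            return min(corpora, key=lambda label: cost(corpora[label]))

        corpora = {
            "date": b"2001-03-27 1999-12-31 2000-01-01 1987-06-05 " * 40,
            "name": b"Alice Bob Carol Dave Erin Frank Grace " * 40,
        }
        print(best_label(b"1998-07-14", corpora))   # expected: 'date'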