
Proceedings DCC 2002. Data Compression Conference

4-4 April 2002


Displaying Results 1 - 25 of 86
  • Proceedings DCC 2002. Data Compression Conference

    Publication Year: 2002
    PDF (394 KB)

    The following topics are dealt with: image compression; image coding; audio coding; codes; decoding; video compression; vector quantization; scalar quantization; rate-distortion theory; source coding.
  • Pattern matching in BWT-transformed text

    Publication Year: 2002
    Cited by:  Papers (3)
    PDF (196 KB)

    Summary form only given. The compressed pattern matching problem is to locate the occurrence(s) of a pattern P in a text string T using a compressed representation of T, with minimal (or no) decompression. The BWT performs a permutation of the characters in the text, such that characters in lexically similar contexts will be near to each other. The motivation for our approach is the observation th...
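    The context-clustering property described in this abstract can be seen in a minimal Burrows-Wheeler transform sketch. This is the textbook sorted-rotations construction with an explicit '$' sentinel, for illustration only; it is not the paper's matching algorithm.

    ```python
    def bwt(text):
        """Burrows-Wheeler transform via sorted rotations.

        A '$' sentinel (assumed to sort before every other character)
        marks the end of the text so the transform is invertible.
        """
        text += "$"
        rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
        # Last column: characters preceding lexically similar contexts end
        # up adjacent, which BWT-domain pattern matching exploits.
        return "".join(rot[-1] for rot in rotations)

    print(bwt("banana"))  # annb$aa
    ```

    The run of adjacent 'a's and 'n's in the output is exactly the clustering the abstract refers to.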
  • Geometry compression of 3-D mesh models using a joint prediction

    Publication Year: 2002
    PDF (220 KB)

    Summary form only given. We address geometry coding of 3D models. Conventional geometry coding schemes quantize vertex positions using a bounding box before coding them with an entropy coder. However, in the proposed scheme, we first predict vertex positions and then quantize them differentially. Our geometry encoder consists of four stages: preprocessing, prediction, quantization, and entropy coding....
  • Generalization of the BWT transformation and inversion ranks

    Publication Year: 2002
    Cited by:  Papers (1)
    PDF (617 KB)

    Summary form only given. We expand the theoretical foundations of LPSA (lexical permutation sorting algorithm) (see Arnavut, Z., and Magliveras, S.S., The Computer Journal, vol.40, no.5, pp.292-5, 1997) from permutations to multiset permutations (data strings) and give the general theory behind the BWT combinatorially. We then show the information theoretic relationship between interval ranks and ...
  • LZAC lossless data compression

    Publication Year: 2002
    PDF (182 KB)

    Summary form only given. This paper presents LZAC, a new universal lossless data compression algorithm derived from the popular and widely used LZ77 family. The objective of LZAC is to improve the compression ratios of the LZ77 family while still retaining the family's key characteristics: simple, universal and fast in decoding and economical in memory consumption. LZAC presents two new ideas: com...
  • A parallel algorithm for lossless image compression by block matching

    Publication Year: 2002
    PDF (227 KB)

    Summary form only given. We show a parallel algorithm using a rectangle greedy matching technique which requires a linear number of processors and O(log(M)log(n)) time on the PRAM EREW model. The algorithm is suitable for practical parallel architectures such as a mesh of trees, a pyramid or a multigrid. We implement a sequential procedure which simulates the compression performed by the parallel algor...
  • Fast peak autocorrelation finding for periodicity-exploiting compression methods

    Publication Year: 2002
    PDF (197 KB)

    Summary form only given. Bilevel image compression algorithms like JBIG, JBIG2-Generic, and PRECIS can exploit 1D or 2D peak autocorrelation in binary images like 'digital halftones', in order to achieve breakthrough boosts in additional compression. For hard-to-compress but periodic halftones, boosts of factors of three or more times the compression ratios and similar increases in decompression ...
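    As a rough illustration of the periodicity these methods exploit (a brute-force sketch, not the paper's fast peak-finding algorithm), the dominant period of a bilevel image row can be located at the peak of its autocorrelation:

    ```python
    def autocorr_peak(bits, max_lag):
        """Return the lag in [1, max_lag] with the highest autocorrelation,
        i.e. the most likely period of the binary sequence."""
        def corr(lag):
            # Count positions where the sequence agrees with its shift.
            return sum(a == b for a, b in zip(bits, bits[lag:]))
        return max(range(1, max_lag + 1), key=corr)

    row = [1, 0, 0, 1] * 8          # a halftone-like row with period 4
    print(autocorr_peak(row, 8))    # 4
    ```

    A periodicity-exploiting coder would then predict each pixel from the pixel one period away instead of its immediate neighbor.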
  • A wavelet based low complexity embedded block coding algorithm

    Publication Year: 2002
    PDF (276 KB)

    Summary form only given. Along with compression efficiency, other factors, like complexity, are significant issues for image coding. The measure of complexity varies from application to application. To overcome the problems of large database maintenance and the high computational burden of EZT and SPIHT, a new algorithm, WEBLOC (Wavelet-based Embedded BLOck Coding), is proposed for low complexity,...
  • Image coding with the MAP criterion

    Publication Year: 2002
    PDF (236 KB)

    Summary form only given. BCJR based source coding of image residuals is investigated. From a trellis representation of the residual, a joint source-channel coding system is formed. Then the BCJR algorithm is applied to find the MAP encoding. MAP and minimized squared error encoding are compared. The novelty of this work is the use of the BCJR algorithm and the MAP criterion in the source coding pr...
  • Diagnostically lossless 3D wavelet compression for digital angiogram video

    Publication Year: 2002
    Cited by:  Papers (6)
    PDF (191 KB)

    Summary form only given. A novel method for the compression of angiogram video sequences is presented. The approach consists of a 3D wavelet encoding algorithm, incorporating a region of interest (ROI) estimation model to provide higher-quality image reconstruction in areas considered to be diagnostically significant. This is coupled with a texture modelling procedure, which provides a model for s...
  • Overhead-constrained rate-allocation for scalable video transmission over networks

    Publication Year: 2002
    Cited by:  Papers (2)
    PDF (214 KB)

    Summary form only given. Forward error correction (FEC) based schemes are widely used to address the packet loss problem for Internet video. Given total available bandwidth, finding optimal bit allocation is very important in FEC-based video, because the FEC bit rate limits the rate available to compress video. We want to give proper protection to the source, but also prevent unwanted FEC rate expa...
  • Extended Golomb codes for binary Markov sources

    Publication Year: 2002
    Cited by:  Patents (1)
    PDF (214 KB)

    Summary form only given. Elementary Golomb codes have been widely used for compressing correlated binary sources. We study the theoretical bit-rate performance of two different Golomb coding methods on binary Markov sources: the sequential coding method, and the interleaved coding method. Although the theoretical bit-rate performance for these codes on i.i.d. sources is known, to the best of ou...
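    For reference, an elementary Golomb code of order m encodes a non-negative run length as a unary quotient followed by a truncated-binary remainder. The sketch below is the standard construction only; it does not model the sequential vs. interleaved scheduling of runs that the paper analyzes.

    ```python
    def golomb(n, m):
        """Golomb code of order m for non-negative n: unary quotient
        (q ones + terminating 0) then a truncated-binary remainder."""
        q, r = divmod(n, m)
        code = "1" * q + "0"
        if m == 1:
            return code                  # order-1 Golomb is pure unary
        b = (m - 1).bit_length()         # ceil(log2(m))
        cutoff = (1 << b) - m            # remainders below this take b-1 bits
        if r < cutoff:
            return code + format(r, f"0{b - 1}b")
        return code + format(r + cutoff, f"0{b}b")

    print(golomb(9, 4))  # "11001": quotient 2 in unary, remainder 1 in 2 bits
    print(golomb(4, 3))  # "1010":  quotient 1, remainder 1 -> adjusted "10"
    ```

    When m is a power of two the remainder field has fixed width (the Rice-code special case); otherwise the truncated-binary step gives the smaller remainders one bit less.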
  • A character elimination algorithm for lossless data compression

    Publication Year: 2002
    Cited by:  Patents (19)
    PDF (191 KB)

    Summary form only given. We present a detailed description of a lossless compression algorithm intended for use on files with non-uniform character distributions. This algorithm takes advantage of the relatively small distances between character occurrences once we remove the less frequent characters. This allows it to create a compressed version of the file that, when decompressed, is an exact co...
  • Rate control using conditional mean estimator

    Publication Year: 2002
    PDF (193 KB)

    Summary form only given. This paper presents a simple, fast and accurate rate control algorithm using a conditional mean estimator (nonlinear regression) that plays a central role in estimation theory. Central to nonlinear estimation and stochastic control problems is the determination of the probability density function of the state conditioned on the available data. If this a posteriori density fu...
  • On coding of sources with two-sided geometric distribution using binary decomposition

    Publication Year: 2002
    Cited by:  Papers (2)
    PDF (227 KB)

    Summary form only given. We address the problem of entropy coding of integers i ∈ Z with a probability distribution defined as the two-sided geometric distribution (TSGD) which arises mainly in tasks of image and video compression. An efficient method based on binary tree decomposition of the source alphabet, combined with binary arithmetic coding, was proposed for coding of DC and AC coeff...
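    A common way to feed such two-sided integers to a one-sided entropy code is to fold Z onto the non-negative integers, interleaving signs. This folding is shown here only to make the TSGD concrete; it is not the binary tree decomposition the paper proposes.

    ```python
    def fold(i):
        """Map i in Z to a non-negative integer, interleaving signs:
        0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...
        A two-sided geometric distribution on i becomes (roughly) a
        one-sided geometric distribution on fold(i), which a Golomb or
        unary-style code can then handle."""
        return 2 * i if i >= 0 else -2 * i - 1

    print([fold(i) for i in (0, -1, 1, -2, 2)])  # [0, 1, 2, 3, 4]
    ```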
  • Using multiple Huffman code tables for optimal coding of DCT blocks

    Publication Year: 2002
    Cited by:  Papers (1)
    PDF (189 KB)

    Summary form only given. It is a well-known observation that when a DCT block is traversed in a zig-zag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase in number. Therefore, use of a single AC Huffman code table in the JPEG baseline algorithm leads to sub-optimal coding, and it is desirable to use multiple code tables, one for each DCT coeffic...
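    The zig-zag order underlying this observation can be generated directly. A minimal sketch of the scan itself (indices only, not the paper's multi-table coder):

    ```python
    def zigzag(n=8):
        """Return the zig-zag scan order of an n x n DCT block as
        (row, col) pairs, walking anti-diagonal by anti-diagonal and
        alternating direction, as in JPEG."""
        order = []
        for s in range(2 * n - 1):                      # s = row + col
            diag = [(r, s - r) for r in range(n) if 0 <= s - r < n]
            order.extend(diag if s % 2 else diag[::-1])
        return order

    print(zigzag()[:6])  # [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2)]
    ```

    Coefficients early in this order carry most of the energy, which is why per-position code tables can beat a single shared table.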
  • Low-complexity interpolation coding for server-based computing

    Publication Year: 2002
    Cited by:  Patents (1)
    PDF (185 KB)

    Summary form only given. The growing total cost of ownership has resulted in a shift away from the distributed model of desktop computing toward a more centralized server-based computing (SBC) model. In SBC, all application logic is executed on the server while clients simply process the resulting screen updates sent from the server. To provide good performance, SBC systems employ various techniqu...
  • Embedded coding of palette images in the topological space

    Publication Year: 2002
    PDF (188 KB)

    Summary form only given. Most existing image coding techniques resolve the uncertainty of an image source on a pixel-by-pixel basis. We demonstrate the effectiveness of region-based image models for the class of palette images. We propose to represent the index map of a palette image by a collection of successively refined color regions, from which the original index map can be reconstructed witho...
  • Perceptual preprocessing techniques applied to video compression: some result elements and analysis

    Publication Year: 2002
    Cited by:  Papers (1)
    PDF (222 KB)

    Summary form only given. The developments in video coding research deal with solutions to improve the picture quality while decreasing the bit rates. However, no major breakthrough in compression has emerged, and low bit rate, high quality video compression is still an open issue. The compression scheme is generally decomposed into two stages: coding and decoding. In order to improve the compression eff...
  • Minimizing distortion via multiuser resource allocation

    Publication Year: 2002
    PDF (186 KB)

    Summary form only given. Source-controlled resource allocation is investigated for orthogonal transmission of heterogeneous sources over a multiuser system. Different scenarios are considered where the system resources (rate, bandwidth, and power) are shared among all users in such a way that the average distortion over all users is minimized. This may prove beneficial to understand the trade-offs in...
  • Turbo source coding: a noise-robust approach to data compression

    Publication Year: 2002
    Cited by:  Papers (27)  |  Patents (2)
    PDF (256 KB)

    Summary form only given. All traditional data compression techniques, such as Huffman coding, the Lempel-Ziv algorithm, run-length limited coding, Tunstall coding and arithmetic coding are highly susceptible to residual channel errors and noise. We have previously proposed the use of parallel concatenated codes and iterative decoding for fixed-length to fixed-length source coding, i.e., turbo codi...
  • MPEG-7 binary format for XML data

    Publication Year: 2002
    Cited by:  Patents (5)
    PDF (188 KB)

    Summary form only given. For the MPEG-7 standard, a binary format for the encoding of XML data has been developed that meets a set of requirements that was derived from a wide range of targeted applications. The resulting key features of the binary format are: high data compression (up to 98% for the document structure), provision of streaming, dynamic update of the document structure, random orde...
  • Compression of image block means for non-equal block size partition schemes using Delaunay triangulation and prediction

    Publication Year: 2002
    PDF (189 KB)

    Summary form only given. An approach based on applying Delaunay triangulation to compression of mean values of image blocks that have non-identical shape and size is proposed. It can be useful for image compression methods that require the use of image partition schemes with non-equal block size like fractal and DCT-based image coding. Several methods of block mean value coding are considered. In ...
  • String matching with stopper compression

    Publication Year: 2002
    PDF (240 KB)

    Summary form only given. We consider string searching in compressed texts. We utilize a compression method related to static Huffman compression. Characters are encoded as variable length sequences of base symbols, which consist of a fixed number of bits. Because the length of a code as base symbols varies, we divide base symbols into stoppers and continuers in order to be able to recognize where ...
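    The stopper/continuer idea is the same mechanism as byte-oriented variable-byte codes. A sketch with 7-bit base symbols, where a set top bit marks a continuer and a clear top bit the stopper (an illustration of the self-delimiting property, not the paper's code assignment):

    ```python
    def encode_vbyte(n):
        """Encode non-negative n as base symbols with 7 payload bits each:
        continuers carry the high bit set, the final stopper has it clear,
        so codeword boundaries are recognizable anywhere in the stream."""
        groups = []
        while True:
            groups.append(n & 0x7F)      # low 7 bits
            n >>= 7
            if n == 0:
                break
        groups.reverse()                  # most-significant group first
        # All but the last symbol are continuers (high bit set).
        return bytes(g | 0x80 for g in groups[:-1]) + bytes([groups[-1]])

    print(encode_vbyte(300).hex())  # 822c: continuer 0x82, stopper 0x2c
    ```

    Because a stopper can be spotted without decoding from the start, a matcher can resynchronize at any codeword boundary, which is what makes searching directly in the compressed text practical.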
  • A method for compressing lexicons

    Publication Year: 2002
    Cited by:  Papers (1)
    PDF (197 KB)

    Summary form only given. Lexicon lookup is an essential part of almost every natural language processing system. A natural language lexicon is a set of strings where each string consists of a word and the associated linguistic data. Its computer representation is a structure that returns appropriate linguistic data on a given input word. It should be small and fast. We propose a method for lexicon...