
Data Compression Conference, 2000. Proceedings. DCC 2000

Date: 28-30 March 2000


Displaying Results 1 - 25 of 92
  • Proceedings DCC 2000. Data Compression Conference

    Publication Year: 2000
    PDF (348 KB)
    Freely Available from IEEE
  • Trees, windows and tiles for wavelet image compression

    Publication Year: 2000, Page(s):283 - 292
    Cited by:  Papers (2)
    PDF (134 KB)

    We investigate the task of compressing an image by using different probability models for compressing different regions of the image. In an earlier paper, we introduced a class of probability models for images, the k-rectangular tiling of an image, which is formed by partitioning the image into k rectangular regions and generating the coefficients within each region by using a probability model se...

  • Seismic data compression using GenLOT: towards "optimality"?

    Publication Year: 2000
    PDF (54 KB)

    Summary form only given. Seismic data compression is desirable in geophysics for both storage and transmission stages. Wavelet coding methods have generated interesting developments, including a real-time field test trial in the North Sea in 1995. Previous work showed that GenLOT with basic optimization also outperforms state-of-the-art biorthogonal wavelet coders for seismic data. In this paper, ...

  • QccPack: an open-source software library for quantization, compression, and coding

    Publication Year: 2000
    Cited by:  Papers (29)
    PDF (8 KB)

    Summary form only given. We describe the QccPack software package, an open-source collection of library routines and utility programs for quantization, compression, and coding of data. QccPack is written to expedite data compression research and development by providing general and reliable implementations of common compression techniques. Functionality of the current release includes entropy codi...

  • Multi-resolution adaptation of the SPIHT algorithm for multiple description

    Publication Year: 2000, Page(s):303 - 312
    Cited by:  Papers (4)
    PDF (196 KB)

    Multiple description codes are data compression algorithms designed with the goal of minimizing the distortion caused by data loss in packet-based or diversity communications systems. Recently, techniques that achieve multiple description coding by combining embedded source codes with unequal error protection channel codes have become popular in the literature. These codes allow for data reconstru...

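The core multiple-description idea can be illustrated without any wavelet machinery. The sketch below is illustrative only (the paper's actual scheme combines embedded source codes with unequal error protection): it splits a signal into two descriptions by even/odd subsampling, so that losing either packet still permits a crude reconstruction.

```python
def make_descriptions(samples):
    # Simplest multiple-description idea: send even-indexed samples in
    # one packet and odd-indexed samples in another; losing either
    # packet still allows a usable reconstruction.
    return samples[0::2], samples[1::2]

def reconstruct_from_even(even, length):
    # Fill in the lost odd-indexed samples by repeating the previous
    # even-indexed one (a crude hold-last-value interpolator).
    return [even[min(i // 2, len(even) - 1)] for i in range(length)]

samples = [10, 12, 14, 16, 18, 20]
even, odd = make_descriptions(samples)
print(reconstruct_from_even(even, len(samples)))  # prints [10, 10, 14, 14, 18, 18]
```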
  • Some notes on the context mapping function in lossless data compression

    Publication Year: 2000
    PDF (84 KB)

    One of the major challenges when applying (serial) universal source coding to 2-dimensional data, e.g., images, is to determine suitable context data. For this reason the concept of a context mapping function (CMF) has been introduced. In this paper we discuss the foundation for the CMF as well as an off-line construction method via combinatorial optimization.

  • Modifications of uniform quantization applied in wavelet coder

    Publication Year: 2000, Page(s):293 - 302
    Cited by:  Papers (3)
    PDF (188 KB)

    An algorithm for wavelet-domain data quantization aimed at improving compression efficiency is presented. Threshold data selection is proposed as a more effective modification of uniform quantization than increasing the zero zone. To fit the threshold value adaptively to local image features, an estimate of the significance expectation for each wavelet coefficient was included in the thresholding proce...

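As a point of reference for the modifications discussed above, a plain uniform quantizer and a zero-zone (thresholding) variant can be sketched as follows. This is a toy illustration, not the paper's adaptive threshold estimator: coefficients below the threshold are simply zeroed.

```python
def uniform_quantize(x, step):
    # Plain uniform quantization: round to the nearest multiple of step.
    return round(x / step)

def deadzone_quantize(x, step, threshold):
    # Zero-zone variant: coefficients whose magnitude falls below the
    # threshold are zeroed, trading a little distortion for many more
    # zero symbols (which entropy-code very cheaply).
    if abs(x) < threshold:
        return 0
    return uniform_quantize(x, step)

coeffs = [0.3, -1.7, 4.2, 0.9, -0.2]
print([deadzone_quantize(c, 1.0, 1.0) for c in coeffs])  # prints [0, -2, 4, 0, 0]
```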
  • On-line decision making for a class of loss functions via Lempel-Ziv parsing

    Publication Year: 2000, Page(s):163 - 172
    Cited by:  Papers (5)
    PDF (248 KB)

    Prefetching in computer memory architectures is formalized as a sequential decision problem in which the instantaneous losses depend not only on the current action-observation pair, as in the traditional formulation, but also on past pairs. Motivated by the prefetching application, we study a class of loss functions that admit an efficient on-line decision algorithm. The algorithm uses the LZ78 pa...

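The LZ78 incremental parsing that underlies the algorithm splits the input into phrases, each being the shortest prefix of the remaining input not yet in the phrase dictionary. A minimal sketch of the parse alone (the decision-making machinery built on top of it is not shown):

```python
def lz78_parse(seq):
    # Incremental LZ78 parse: each phrase is the shortest prefix of the
    # remaining input that has not been seen as a phrase before.
    dictionary = {""}
    phrases = []
    current = ""
    for symbol in seq:
        current += symbol
        if current not in dictionary:
            dictionary.add(current)
            phrases.append(current)
            current = ""
    if current:
        phrases.append(current)  # possibly incomplete final phrase
    return phrases

print(lz78_parse("aababcabcd"))  # prints ['a', 'ab', 'abc', 'abcd']
```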
  • Variable-to-fixed length codes: a geometrical approach to low-complexity source codes

    Publication Year: 2000
    Cited by:  Patents (1)
    PDF (56 KB)

    Summary form only given. We consider the coding of a binary IID source using variable-to-fixed length (VF) source codes. The goal is to design "good" codes of low complexity. A VF code maps variable-length source sequences (segments) into fixed-length code sequences (codewords). We conclude that Petry codes are an efficient implementation of Tunstall codes and moreover that by approxim...

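A Tunstall code, the baseline that the Petry construction implements efficiently, grows a complete parsing tree by repeatedly splitting the most probable segment into its two one-symbol extensions. A small sketch for a binary IID source (illustrative only; the Petry implementation itself is not shown):

```python
import heapq

def tunstall(p1, num_codewords):
    # Tunstall construction for a binary IID source with P(1) = p1:
    # repeatedly split the most probable leaf segment by appending '0'
    # and '1', until num_codewords leaves exist.  Each leaf then gets a
    # fixed-length codeword index.
    heap = [(-1.0, "")]  # max-heap on segment probability via negation
    leaves = 1
    while leaves < num_codewords:
        neg_p, seg = heapq.heappop(heap)
        p = -neg_p
        heapq.heappush(heap, (-(p * (1 - p1)), seg + "0"))
        heapq.heappush(heap, (-(p * p1), seg + "1"))
        leaves += 1  # one leaf removed, two added
    return sorted(seg for _, seg in heap)

print(tunstall(0.2, 4))  # prints ['000', '001', '01', '1']
```

Note how the resulting segment set is complete and prefix-free: every infinite source sequence can be parsed unambiguously into these segments.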
  • Effect of image activity on lossy and lossless coding performance

    Publication Year: 2000
    Cited by:  Papers (2)
    PDF (16 KB)

    Summary form only given. When we compress a variety of multimedia images using a fixed wavelet filter, the PSNR values vary widely. Similarly, in lossless image compression using a fixed integer wavelet transform, the bit rates can vary sharply. These large variations can be attributed to image activity. We define and use a number of image activity measures (IAMs) based on image variance, edges, wavelet coeffic...

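One of the simplest activity measures of the kind described above is pixel variance; a sketch follows (the paper also uses edge- and wavelet-based measures, which are not shown here):

```python
def variance_iam(pixels):
    # A simple image-activity measure (IAM): the pixel variance.
    # Higher-activity images are generally harder to compress at a
    # given quality.
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

flat = [100] * 8                            # constant patch: no activity
busy = [0, 255, 0, 255, 0, 255, 0, 255]     # checkerboard: high activity
print(variance_iam(flat), variance_iam(busy))  # prints 0.0 16256.25
```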
  • Prediction by grammatical match

    Publication Year: 2000, Page(s):153 - 162
    Cited by:  Papers (4)
    PDF (204 KB)

    We present prediction by grammatical match (PGM), a new general-purpose adaptive text compression framework that successfully blends finite-context and general context-free models. A PGM compressor operates incrementally by parsing a prefix of the input text, generating a set of analyses; these analyses are scored according to encoding cost, and the cheapest is selected and sent through an order-k PPM e...

  • Low-complexity scalable image compression

    Publication Year: 2000, Page(s):23 - 32
    Cited by:  Papers (6)  |  Patents (6)
    PDF (44 KB)

    We have developed a scalable image compression scheme with a good performance-complexity trade-off. Like JPEG, it is based on the 8×8 block discrete cosine transform (DCT), but it uses no additional quantization or entropy coding (such as Huffman or arithmetic coding). Bit-rate or quality scalability is enabled by encoding the DCT coefficients bit plane by bit plane, starting at the most sig...

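Bit-plane scanning, the mechanism that gives such a coder its rate scalability, can be sketched in a few lines. This is a simplified illustration on coefficient magnitudes only; the actual coder also handles signs and coefficient ordering.

```python
def bitplanes(coeffs, num_planes):
    # Emit coefficient magnitudes bit plane by bit plane, most
    # significant plane first; signs would be sent separately.
    return [[(abs(c) >> b) & 1 for c in coeffs]
            for b in range(num_planes - 1, -1, -1)]

def reconstruct(planes):
    # Decoding only a prefix of the planes yields coarse (scaled-down)
    # values; decoding all planes recovers the magnitudes exactly.
    values = [0] * len(planes[0])
    for plane in planes:
        values = [(v << 1) | bit for v, bit in zip(values, plane)]
    return values

planes = bitplanes([13, 6, 1, 0], 4)
print(reconstruct(planes[:2]))  # coarse values after 2 planes: [3, 1, 0, 0]
print(reconstruct(planes))      # all 4 planes: [13, 6, 1, 0]
```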
  • PPM performance with BWT complexity: a new method for lossless data compression

    Publication Year: 2000, Page(s):203 - 212
    Cited by:  Papers (8)
    PDF (236 KB)

    This work combines a new fast context-search algorithm with the lossless source coding models of PPM to achieve a lossless data compression algorithm with the linear context-search complexity and memory of BWT and Ziv-Lempel codes and the compression performance of PPM-based algorithms. Both sequential and nonsequential encoding are considered. The proposed algorithm yields an average rate of 2.27...

  • Wavelet coding of 3-D shape data using space-frequency quantization

    Publication Year: 2000
    PDF (36 KB)

    Summary form only given. Efficient representations of three-dimensional object shape have attracted wide attention for the transmission of computer graphics data and for interactive design in manufacturing. Polygonal meshes, which consist of connectivity data for the vertices together with their geometry data, are often used to represent an object shape. The connectivity of the vertices is efficiently coded wi...

  • Piecewise linear image coding using surface triangulation and geometric compression

    Publication Year: 2000, Page(s):410 - 419
    PDF (620 KB)

    The provably NP-hard problem of finding an optimal piecewise linear approximation for images is extended from 1D curve fitting to 2D surface fitting by a dual-agent algorithm. The results not only yield a storage-efficient codec for range, or intensity, images but also a surface triangulation technique to generate a succinct, accurate, and visually pleasing 3D visualization model. Compared with the tr...

  • Wavelet-based lossy compression of turbulence data

    Publication Year: 2000
    Cited by:  Papers (2)
    PDF (80 KB)

    Summary form only given. Modelling of turbulence is among a class of Grand Challenge applications that severely strain data storage capabilities. The rate at which data is generated can actually limit the performance of the application. One tool that can assist is data compression. Unfortunately, lossless compression performs poorly on such data. Lossy compression is the only choice for turbulence...

  • Data compression with truncated suffix trees

    Publication Year: 2000
    PDF (60 KB)

    Summary form only given. The suffix tree is an efficient data structure used for Ziv-Lempel coding schemes. We propose a new data structure called the k-truncated suffix tree (k-TST), which is a truncated version of the suffix tree. While the suffix tree maintains all substrings of a given string, the k-TST stores the substrings of length at most k, where k is a constant. Hence the truncated suffi...

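The set a k-truncated suffix tree indexes is easy to state: all substrings of length at most k. A naive enumeration of that set follows (the tree structure itself, which stores the set compactly and supports fast matching, is not shown):

```python
def truncated_substrings(text, k):
    # The k-TST indexes exactly the substrings of length at most k;
    # here we enumerate that set directly as a plain Python set.
    subs = set()
    for i in range(len(text)):
        for j in range(i + 1, min(i + k, len(text)) + 1):
            subs.add(text[i:j])
    return subs

print(sorted(truncated_substrings("banana", 2)))
# prints ['a', 'an', 'b', 'ba', 'n', 'na']
```

For a fixed alphabet the number of such substrings is bounded by a constant times the text length, which is what makes the truncated structure smaller than the full suffix tree.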
  • What's your sign?: efficient sign coding for embedded wavelet image coding

    Publication Year: 2000, Page(s):273 - 282
    Cited by:  Papers (11)  |  Patents (1)
    PDF (176 KB)

    Wavelet transform coefficients are defined by both a magnitude and a sign. While promising algorithms exist for efficiently coding the transform coefficient magnitudes, current wavelet image coding algorithms are not efficient at coding the sign of the transform coefficients. It is generally assumed that there is no compression gain to be obtained from entropy coding of the sign. Only recently hav...

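The premise that signs are compressible can be checked empirically: if neighboring coefficients' signs agree (or disagree) noticeably more often than half the time, a conditional entropy coder can exploit that bias. A toy measurement on hypothetical data (not the paper's context model):

```python
def sign_agreement(coeffs):
    # Fraction of adjacent nonzero-coefficient pairs whose signs agree.
    # A value far from 0.5 indicates sign predictability that an
    # entropy coder could exploit.
    pairs = [(a, b) for a, b in zip(coeffs, coeffs[1:]) if a != 0 and b != 0]
    agree = sum(1 for a, b in pairs if (a > 0) == (b > 0))
    return agree / len(pairs)

coeffs = [3, 2, -1, -4, -2, 5, 6, -1]  # hypothetical subband row
print(sign_agreement(coeffs))
```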
  • An efficient low-bit rate motion compensation technique based on quadtree

    Publication Year: 2000
    Cited by:  Papers (2)
    PDF (24 KB)

    Summary form only given. A quadtree-structured motion compensation technique effectively utilizes the motion content of a frame, as opposed to a fixed-size block motion compensation technique. In this paper, we propose a novel quadtree-structured region-wise motion compensation technique that divides a frame into equilateral triangle blocks using the quadtree structure. Arbitrary partition shapes ar...

  • Channel decoding using inter- and intra-correlation of source encoded frames

    Publication Year: 2000, Page(s):103 - 112
    Cited by:  Papers (2)  |  Patents (1)
    PDF (80 KB)

    The goal of source-controlled channel decoding is to improve the performance of the channel decoder by using the residual redundancy of the source-encoded data. The original approach exploits only the interframe correlation between bits, and a few simple schemes have been devised to use the intraframe correlation. In this paper, we present an efficient method to exploit the intraframe corr...

  • Product code and recurrent alternative decoding for wireless image transmission

    Publication Year: 2000
    PDF (48 KB)

    Summary form only given. We have developed a novel channel coding scheme for image transmission over wireless channels. A key component of this scheme is the construction of the product code using a convolutional code and a recursive systematic code (RSC) in the horizontal and vertical directions, respectively. High performance transmission has been achieved through an innovative recurrent alterna...

  • Analysis of optimal filter banks for multiple description coding

    Publication Year: 2000, Page(s):323 - 332
    Cited by:  Papers (4)
    PDF (236 KB)

    We study the problem of multiple description (MD) coding of stationary Gaussian sources with memory. First, we compute an approximate rate distortion region for these sources, which we prove to be asymptotically tight at high rates: this region generalizes the standard MD rate distortion region for memoryless sources. Then we develop an algorithm for the design of optimal biorthogonal filter banks...

  • Compression of SMIL documents

    Publication Year: 2000
    Cited by:  Patents (3)
    PDF (8 KB)

    Summary form only given. The World Wide Web Consortium (W3C) standard synchronized multimedia integration language (SMIL) is an HTML-like mark-up language that describes temporal behavior and presentation layout for multimedia objects. SMIL is widely used on today's Internet: many video clips are accompanied by SMIL documents sent to clients before the video and audio streams are transmitted. Data compression is a process to ...

  • The skip-innovation model for sparse images

    Publication Year: 2000, Page(s):43 - 52
    Cited by:  Papers (3)
    PDF (2228 KB)

    On sparse images, contiguous runs of identical symbols often occur in the same coding context. This paper proposes a model for efficiently encoding such runs in a two-dimensional setting. Because it is model based, the method can be used with any coding scheme. An experimental coder using the model compresses CCITT facsimile documents 2% better than JBIG and is more than three times as fast. A low...

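The runs that the skip-innovation model targets are easy to extract; the sketch below only finds them (the model's contribution is how cheaply a run's length is then coded in its context, which is not shown):

```python
def run_lengths(row):
    # A long run of the same symbol in the same context can be coded as
    # one (symbol, length) "skip" event instead of symbol by symbol.
    # This sketch just extracts the runs from one scan row.
    runs = []
    for sym in row:
        if runs and runs[-1][0] == sym:
            runs[-1][1] += 1
        else:
            runs.append([sym, 1])
    return [(s, n) for s, n in runs]

print(run_lengths("0000011000000"))  # prints [('0', 5), ('1', 2), ('0', 6)]
```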
  • Optimal subtractive dither for near-lossless compression

    Publication Year: 2000, Page(s):223 - 232
    Cited by:  Papers (1)  |  Patents (1)
    PDF (384 KB)

    Subtractive dither is a technique which may be used to reduce the occurrence of compression artifacts from near-lossless compression. Standard subtractive dither incurs a cost, however, in the form of an increase in rate and distortion, and by giving the reconstructed signal an overall grainy appearance. It is possible to compromise between the costs and benefits of dithering by using dither signa...

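Standard subtractive dither, the starting point this paper optimizes, adds a dither value known to both encoder and decoder before quantization and subtracts it again after reconstruction. A minimal sketch with uniform dither (the paper's contribution, designing better dither distributions, is not shown):

```python
import random

def subtractive_dither_quantize(x, step, dither):
    # Subtractive dither: quantize x + d and subtract the same d after
    # reconstruction.  The reconstruction error is then independent of
    # the signal, which removes signal-dependent artifacts.
    q = round((x + dither) / step)  # transmitted quantizer index
    return q * step - dither        # reconstructed sample

random.seed(0)
step = 1.0
signal = [0.2, 0.7, 1.4]
recon = []
for x in signal:
    # Dither drawn uniformly over one quantizer cell, known to both ends.
    d = random.uniform(-step / 2, step / 2)
    recon.append(subtractive_dither_quantize(x, step, d))
print(recon)  # each sample is within step/2 of the original
```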