Data Compression Conference, 1999. Proceedings. DCC '99

Date: 29-31 March 1999

Displaying Results 1 - 25 of 105
  • Proceedings DCC'99 Data Compression Conference (Cat. No. PR00096)

    Publication Year: 1999
    PDF (390 KB)
    Freely Available from IEEE
  • Single-resolution compression of arbitrary triangular meshes with properties

    Publication Year: 1999, Page(s):247 - 256
    Cited by:  Papers (22)  |  Patents (2)
    PDF (309 KB)

    We propose a new layering structure that partitions an arbitrary triangular mesh (non-manifold, arbitrary genus) into generalized triangle strips. An efficient and flexible encoding of the connectivity, vertex coordinates and attribute data yields excellent single-resolution compression. The scheme gracefully solves the "crack" problem and also prevents error propagation while providing efficient ...
  • Author index

    Publication Year: 1999, Page(s):564 - 566
    PDF (380 KB)
    Freely Available from IEEE
  • Compressing ATM streams on-line

    Publication Year: 1999
    PDF (48 KB)

    Summary form only given. Asynchronous transfer mode (ATM) is one of today's state-of-the-art network protocols. A main characteristic of an ATM network is that its switches should be fairly simple and inexpensive; as a result, a significant part of the network cost lies in the links. So by increasing the network traffic we can send over a given link, we will certainly increase the e...
  • “Bit rate on demand” using pruned tree-structured hierarchical lookup vector quantization

    Publication Year: 1999, Page(s):42 - 51
    PDF (92 KB)

    We propose a real-time image coding system that can adapt instantaneously to the available channel bandwidth. Its range of operational bandwidths is more finely calibrated than that of the ordinary hierarchical vector quantization (HVQ) or wavelet-based hierarchical vector quantization (WHVQ) methods suggested in the literature. These properties make it very attractive for network...
  • Reduced comparison search for the exact GLA

    Publication Year: 1999, Page(s):33 - 41
    Cited by:  Papers (2)
    PDF (80 KB)

    This paper introduces a new method for reducing the number of distance calculations in the generalized Lloyd algorithm (GLA), a widely used method for constructing a codebook in vector quantization. The reduced comparison search detects the activity of the code vectors and exploits it in the classification of the training vectors. For training vectors whose current code vector has not been mo...
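
The GLA iteration this abstract builds on can be sketched as follows. This is a hypothetical Python illustration of one Lloyd step with a reduced-comparison shortcut (all names are ours, not the paper's): when a training vector's current code vector did not move in the last update, only the moved ("active") code vectors can beat it, because distances to the unmoved ones are unchanged.

```python
def sq_dist(a, b):
    # Squared Euclidean distance between two vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def gla_step(training, codebook, assign, moved):
    """One GLA iteration: encoder update, then decoder (centroid) update.
    moved[i] says whether code vector i changed in the previous iteration;
    pass moved=None to force a full search."""
    for t, vec in enumerate(training):
        old = assign[t]
        if moved is None or moved[old]:
            candidates = range(len(codebook))
        else:
            # Reduced comparison: unmoved code vectors kept their distances,
            # so only the old winner and the moved ones can win now.
            candidates = [old] + [i for i in range(len(codebook)) if moved[i]]
        assign[t] = min(candidates, key=lambda i: sq_dist(vec, codebook[i]))
    new_codebook, new_moved = [], []
    for i, c in enumerate(codebook):
        cell = [training[t] for t in range(len(training)) if assign[t] == i]
        if cell:
            dim = len(cell[0])
            c_new = tuple(sum(v[d] for v in cell) / len(cell) for d in range(dim))
        else:
            c_new = tuple(c)  # empty cell: keep the old code vector
        new_codebook.append(c_new)
        new_moved.append(c_new != tuple(c))
    return new_codebook, assign, new_moved
```

Iterating `gla_step` until `new_moved` is all-False reproduces the usual GLA convergence test while skipping most distance calculations once the codebook settles.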
  • Joint image compression and classification with vector quantization and a two dimensional hidden Markov model

    Publication Year: 1999, Page(s):23 - 32
    Cited by:  Papers (11)
    PDF (188 KB)

    We present an algorithm that achieves good compression and classification for images using vector quantization and a two-dimensional hidden Markov model. The feature vectors of image blocks are assumed to be generated by a two-dimensional hidden Markov model. We first estimate the parameters of the model, then design a vector quantizer to minimize a weighted sum of compression distortion and classifi...
  • Multiple description lattice vector quantization

    Publication Year: 1999, Page(s):13 - 22
    Cited by:  Papers (55)  |  Patents (3)
    PDF (652 KB)

    We consider the problem of designing a lattice-based multiple description vector quantizer for a two-channel diversity system. The design of such a quantizer can be reduced to the problem of assigning pair labels to points of a vector quantizer codebook. A general labeling procedure based on the structure of the lattice is presented, along with detailed results for the hexagonal lattice: algorithm...
  • Eigen wavelet: hyperspectral image compression algorithm

    Publication Year: 1999
    Cited by:  Papers (3)
    PDF (64 KB)

    Summary form only given. The increased information content of hyperspectral imagery over multispectral data has attracted significant interest from the defense and remote sensing communities. We develop a mechanism for compressing hyperspectral imagery with no loss of information. The challenge of hyperspectral image compression lies in the non-isotropy and non-stationarity that are displayed acros...
  • Sorted sliding window compression

    Publication Year: 1999
    PDF (136 KB)

    Summary form only given. Sorted sliding window compression (SSWC) uses a new model, the sorted sliding window model (SSWM), to efficiently encode strings that appear again while encoding a symbol sequence. The SSWM holds statistics of all strings up to a certain length k in a sliding window of size n. The compression program can use the SSWM to determine whether the string of next symbols is already ...
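
The core window query can be illustrated with a small, hypothetical Python sketch (the window size n and maximum string length k match the abstract's parameters; the sorting and statistics machinery of the real SSWM is omitted):

```python
def longest_window_match(data, pos, window_n=16, max_k=4):
    """Return (offset, length) of the longest string of upcoming symbols,
    at most max_k long, that already appears in the last window_n symbols;
    (0, 0) if none does."""
    start = max(0, pos - window_n)
    window = data[start:pos]
    # Try the longest candidate first and shrink until something matches.
    for length in range(min(max_k, len(data) - pos), 0, -1):
        s = data[pos:pos + length]
        idx = window.find(s)
        if idx != -1:
            return (pos - (start + idx), length)  # back-offset into the window
    return (0, 0)
```

A found match can then be emitted as an (offset, length) reference instead of literal symbols, as in LZ77-style coders.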
  • Image compression based on low-pass wavelet transform and multi-scale edge compensation. Part II: MSEC model

    Publication Year: 1999
    PDF (8 KB)

    Summary form only given. This paper presents the idea of multi-scale edge compensation and puts forward an image compression method (MSEC) based on the low-pass wavelet transform and multi-scale edge compensation. The encoder performs edge detection and edge compensation at every scale from fine to coarse, then outputs the model information and the final background. The decoder synthesizes the image acco...
  • Multiple description decoding of overcomplete expansions using projections onto convex sets

    Publication Year: 1999, Page(s):72 - 81
    Cited by:  Papers (30)
    PDF (272 KB)

    This paper presents a POCS-based algorithm for consistent reconstruction of a signal x ∈ R^K from any subset of quantized coefficients y ∈ R^N in an N×K overcomplete frame expansion y = Fx, N = 2K. By choosing the frame operator F to be the concatenation of two K×K invertible transforms, the projections may be computed in R^K using only the transforms an...
  • Generalized multiple description vector quantization

    Publication Year: 1999, Page(s):3 - 12
    Cited by:  Papers (39)  |  Patents (2)
    PDF (200 KB)

    Packet-based data communication systems suffer from packet loss under high network traffic conditions. As a result, the receiver is often left with an incomplete description of the requested data. Multiple description source coding addresses the problem of minimizing the expected distortion caused by packet loss. An equivalent problem is that of source coding for data transmission over multiple ch...
  • Quadtree classification and TCQ image coding

    Publication Year: 1999, Page(s):149 - 157
    Cited by:  Papers (2)  |  Patents (1)
    PDF (104 KB)

    The SPIHT algorithm is shown to implicitly use quadtree-based classification. The rate-distortion encoding performance of the classes is described, and quantization improvements are presented. A new encoding algorithm combines a general SPIHT data structure with the granular gain of multi-dimensional quantization to achieve improved PSNR-versus-rate performance.
  • Encoding time reduction in fractal image compression

    Publication Year: 1999
    PDF (92 KB)

    Summary form only given. The mathematical interpretation of fractal image compression is strongly related to Banach's fixed-point theorem. More precisely, if (X,d) is a metric space of digital images, where d is a suitable metric, we want to think of an element of X that we wish to encode as the fixed point of some operator. Since we are dealing with coding images, the choice of the met...
  • Protein is incompressible

    Publication Year: 1999, Page(s):257 - 266
    Cited by:  Papers (13)  |  Patents (1)
    PDF (88 KB)

    Life is based on two polymers, DNA and protein, whose properties can be described in a simple text file. It is natural to expect that standard text compression techniques would work on biological sequences as they do on English text. But biological sequences have a fundamentally different structure from linguistic ones, and standard compression schemes exhibit disappointing performance on them. We...
  • Codes for data synchronization with timing

    Publication Year: 1999, Page(s):443 - 452
    Cited by:  Papers (2)  |  Patents (2)
    PDF (320 KB)

    This paper investigates the design and analysis of data synchronization codes whose decoders have the property that, in addition to reestablishing correct decoding after encoded data is lost or afflicted with errors, they produce the original time index of each decoded data symbol modulo some integer T. The motivation for such data synchronization with timing is that in many situations where data ...
  • Modified SPIHT encoding for SAR image data

    Publication Year: 1999
    PDF (48 KB)

    Summary form only given. We developed a wavelet-based SAR image compression algorithm which combines tree-structured texture analysis, soft-thresholding speckle reduction, quadtree homogeneous decomposition, and a modified zero-tree coding scheme. First, the tree-structured wavelet transform is applied to the SAR image. The decomposition is no longer simply applied to the low-scale subsignals recu...
  • On entropy-constrained residual vector quantization design

    Publication Year: 1999
    PDF (104 KB)

    Summary form only given. Entropy-constrained residual vector quantization (EC-RVQ) has been shown to be a competitive compression technique. Its design procedure is an iterative process which typically consists of three steps: encoder update, decoder update, and entropy coder update. We propose a new algorithm for the EC-RVQ design. The main features of our algorithm are: (i) in the encoder update...
  • Design consideration for multi-lingual cascading text compressors

    Publication Year: 1999
    PDF (144 KB)

    Summary form only given. We study the cascading of LZ variants to Huffman coding for multilingual documents. Two models are proposed: the static model and the adaptive (dynamic) model. The static model makes use of the dictionary generated by the LZW algorithm in Chinese dictionary-based Huffman compression to achieve better performance. The dynamic model is an extension of the static cascading mo...
  • Software synthesis of variable-length code decoder using a mixture of programmed logic and table lookups

    Publication Year: 1999, Page(s):121 - 130
    Cited by:  Papers (3)
    PDF (176 KB)

    Implementation of variable-length code (VLC) decoders can involve a tradeoff between the number of decoding steps and memory usage. In this paper, we propose a novel scheme for optimizing this tradeoff using a machine model abstracted from general-purpose processors with hierarchical memories. We formulate the VLC decoding problem as an optimization problem whose objective is to minimize the av...
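
The table-lookup end of that tradeoff can be sketched in Python (a hypothetical illustration, not the paper's synthesis scheme): pad every codeword of a prefix code to a fixed width w, fill a table of 2^w entries, and decode each symbol with a single indexed read. Programmed logic would instead walk the code tree bit by bit, trading steps for memory.

```python
def build_lut(code, width):
    """code: symbol -> bitstring, e.g. {'a': '0', 'b': '10', 'c': '11'}.
    Every index whose top bits match a codeword maps to (symbol, length)."""
    lut = [None] * (1 << width)
    for sym, bits in code.items():
        pad = width - len(bits)
        base = int(bits, 2) << pad
        for tail in range(1 << pad):  # all completions share the prefix
            lut[base + tail] = (sym, len(bits))
    return lut

def decode(bitstring, lut, width):
    """Decode a string of '0'/'1' characters with one table read per symbol."""
    out, pos = [], 0
    while pos < len(bitstring):
        chunk = bitstring[pos:pos + width].ljust(width, '0')  # pad at stream end
        sym, length = lut[int(chunk, 2)]
        out.append(sym)
        pos += length  # advance by the true codeword length, not the table width
    return ''.join(out)
```

The table costs 2^w entries where w is the longest codeword length, which is exactly the memory pressure the paper's mixed lookup/programmed-logic synthesis is meant to manage.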
  • A fractional chip wavelet zero tree codec (WZT) for video compression

    Publication Year: 1999
    PDF (8 KB)

    Summary form only given. We introduce a motion wavelet transform zero tree (WZT) codec which achieves good compression ratios and can be implemented in a single ASIC of modest size. The codec employs a group of pictures (GOP) of two interlaced video frames, edge filters for the boundaries, intermediate field image compression and a block compression structure. Specific features of the implementati...
  • A novel dual-path architecture for HDTV video decoding

    Publication Year: 1999
    PDF (16 KB)

    Summary form only given. We present an architecture for digital HDTV video decoding (MPEG-2 MP@HL) based on dual decoding data paths controlled in a block-layer synchronization manner and an efficient write-back scheme. Our fixed-schedule controller synchronizes the baseline units on a block basis in both data paths. This scheme reduces embedded buffer sizes within the decoder and eliminates a lo...
  • Embedded post-processing for enhancement of compressed images

    Publication Year: 1999, Page(s):62 - 71
    Cited by:  Papers (8)
    PDF (384 KB)

    This paper presents a simple and effective post-processing method for compressed images. The work focuses on the cyclic time-variance introduced by block-based and subband transform coders. We propose an algorithm to (almost) restore stationarity to the cyclo-stationary output of conventional transform coders. Despite its simple, non-iterative structure, the method outperforms other methods of...
  • Resynchronizing variable-length codes for robust image transmission

    Publication Year: 1999
    Cited by:  Patents (2)
    PDF (8 KB)

    Summary form only given. This paper considers instantaneous prefix codes with possibly unequal codeword lengths but specific codewords, such as a Huffman code. Resynchronizing VLCs (RVLCs) contain one or more synchronizing codewords that resynchronize the decoder regardless of any previous data. Previous applications of optimal resynchronizing VLCs have been limited to sources with alphabets of sizes...