
Data Compression Conference, 1998. DCC '98. Proceedings

March 30 - April 1, 1998


Displaying Results 1 - 25 of 118
  • Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225)

    Publication Year: 1998
    PDF (440 KB)
    Freely Available from IEEE
  • Author index

    Publication Year: 1998, Page(s):587 - 589
    PDF (107 KB)
    Freely Available from IEEE
  • Analysis and comparison of various image downsampling and upsampling methods

    Publication Year: 1998
    Cited by:  Papers (3)
    PDF (52 KB)

    Summary form only given. The goal is to gain a better understanding of the behavior of image down/upsampling combinations, and find better down/upsampling methods. We examined existing down/upsampling methods and proposed new ones. We formulated a frequency response approach for understanding and evaluating down/upsampling combinations. The approach was validated experimentally by running the ...
  • Switching between two universal source coding algorithms

    Publication Year: 1998, Page(s):491 - 500
    Cited by:  Papers (15)
    PDF (196 KB)

    This paper discusses a switching method which can be used to combine two sequential universal source coding algorithms. The switching method treats these two algorithms as black boxes and can only use their estimates of the probability distributions for the consecutive symbols of the source sequence. Three weighting algorithms based on this switching method are presented. Empirical results show th...
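    The weighting idea behind such a switching method can be sketched generically: combine the two models' sequential probability estimates and charge the ideal arithmetic-coding cost. The fixed-weight mixture below is an illustrative assumption (the function and model names are hypothetical), not the paper's actual adaptive switching scheme.

    ```python
    import math

    def mixture_code_length(symbols, model_a, model_b, w=0.5):
        """Ideal code length (in bits) of a sequence under a fixed-weight
        mixture of two sequential probability models.  model_a and model_b
        are hypothetical callables mapping (history, symbol) -> probability;
        the real switching method adapts the weighting, which this omits."""
        total, history = 0.0, []
        for s in symbols:
            p = w * model_a(history, s) + (1 - w) * model_b(history, s)
            total += -math.log2(p)   # arithmetic-coding cost of symbol s
            history.append(s)
        return total
    ```

    With two uniform binary models, for example, every symbol costs exactly one bit regardless of the weight.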
  • Universal data compression and linear prediction

    Publication Year: 1998, Page(s):511 - 520
    Cited by:  Papers (6)
    PDF (164 KB)

    The relationship between prediction and data compression can be extended to universal prediction schemes and universal data compression. Previous work shows that minimizing the sequential squared prediction error for individual sequences can be achieved using the same strategies which minimize the sequential code length for data compression of individual sequences. Defining a “probability”...
  • Post-processing enhancement of decompressed images using variable order Bezier polynomials and distance transform

    Publication Year: 1998
    Cited by:  Papers (6)
    PDF (80 KB)

    Summary form only given. We post-process images compressed by lossy JPEG algorithms (DCT and LOCO), following decompression, using Bezier polynomials for enhancement. Both JPEG algorithms, namely the DCT-based JPEG and the near-lossless baseline JPEG-LS (aka LOCO), employ spatial quantization. The DCT-based version introduces an artifact known as mosaic or “blockiness” by coarsely quan...
  • Efficient lossless coding of medical image volumes using reversible integer wavelet transforms

    Publication Year: 1998, Page(s):428 - 437
    Cited by:  Papers (11)  |  Patents (3)
    PDF (124 KB)

    A novel lossless medical image compression algorithm based on three-dimensional integer wavelet transforms and zerotree coding is presented. The EZW algorithm is extended to three dimensions and context-based adaptive arithmetic coding is used to improve its performance. The algorithm (3-D CB-EZW) efficiently encodes image volumes by exploiting the dependencies in all three dimensions, while enabl...
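    Reversibility in integer wavelet transforms comes from lifting: each step uses only integer shifts and adds, so it can be undone exactly. The one-level integer Haar (S-transform) step below is the simplest such transform, shown for a single sample pair as a sketch; the paper's 3-D transform applies such filters along all three axes.

    ```python
    def s_forward(a, b):
        """Integer Haar / S-transform on a sample pair: difference then
        floor-average, built from lifting steps (shifts and adds only)."""
        h = a - b            # high-pass: difference
        l = b + (h >> 1)     # low-pass: floor((a + b) / 2)
        return l, h

    def s_inverse(l, h):
        """Exactly undoes s_forward for all integers, including negatives."""
        b = l - (h >> 1)
        return h + b, b
    ```

    Because every lifting step is subtracted back out with the same integer arithmetic, the round trip is lossless even though the forward transform discards the fractional half-bit of the average.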
  • Control structure efficiency enhancement for predictive video coding

    Publication Year: 1998
    Cited by:  Papers (1)
    PDF (8 KB)

    Summary form only given. To achieve superior video compression performance it is generally necessary to base the encoding decisions on a local scale, instead of the traditional frame-by-frame approach. Partitioning schemes for optimal frame subdivision have been proposed. Unfortunately, a full local approach that leads to inhomogeneous partitioning of the frame encounters the serious problem of de...
  • Intensity controlled motion compensation

    Publication Year: 1998, Page(s):249 - 258
    Cited by:  Patents (5)
    PDF (188 KB)

    A new motion compensation technique that allows more than one motion vector inside each block is introduced. The technique uses the intensity information to determine which motion vector to apply at any given pixel. An efficient motion estimation algorithm is described that finds near-optimal selections of motion vectors. The simulation results show a significant improvement in the prediction accu...
  • Conditional source coding with competitive lists

    Publication Year: 1998
    PDF (72 KB)

    Summary form only given. A new lossless source coding algorithm was developed that achieves a compression ratio slightly better than the Lempel-Ziv-Welch algorithm, but requires as little as 250 kBytes of storage. The algorithm is based on the context-tree approach, encoding one input symbol at a time. Thus, its throughput lies in a range comparable to the PPM algorithm. The very low memory requir...
  • A new compression scheme for syntactically structured messages (programs) and its application to Java and the Internet

    Publication Year: 1998
    Cited by:  Papers (4)  |  Patents (4)
    PDF (64 KB)

    Summary form only given. The immense demand for network bandwidth to load computer programs like Java applets or applications within an acceptable time makes efficient compression schemes highly desirable. Syntax-oriented coding (SOC) is a new compression scheme that is able to eliminate redundancy caused by syntactical restrictions. Current compression schemes with a lexical view of the s...
  • Embedded trellis coded quantization

    Publication Year: 1998, Page(s):93 - 102
    Cited by:  Papers (12)
    PDF (208 KB)

    Embedded trellis coded quantization (E-TCQ) is introduced as an embedded quantization technique which achieves good rate-distortion performance for reasonable computational complexity. The performance of E-TCQ is investigated for memoryless Gaussian, Laplacian, and uniform sources. E-TCQ is shown to outperform multi-stage TCQ. For Gaussian and Laplacian sources the performance of E-TCQ shows large ...
  • Simple pre-processors significantly improve LZ 1 compression

    Publication Year: 1998
    PDF (16 KB)

    Summary form only given. The effectiveness of the LZ 1 class of lossless adaptive data compression algorithms can, for many different types of data, be significantly improved by employing a dual-stage compression/decompression process. A pre-processing stage first re-codes the input data stream in such a way as to make it more amenable to subsequent LZ 1 compression. To decode the data, the invers...
  • Lossless compression of video using motion compensation

    Publication Year: 1998
    Cited by:  Papers (10)
    PDF (76 KB)

    Summary form only given. We investigate lossless coding of video using predictive coding and motion compensation. The new coding methods combine state-of-the-art lossless techniques such as JPEG-LS (context-based prediction and bias cancellation, Golomb coding) with high-resolution motion field estimation, 3D predictors, prediction using one or multiple (k) previous images, predictor-dependent error mode...
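    Golomb coding of prediction residuals, one of the JPEG-LS-style ingredients mentioned, can be sketched as follows: the signed residual is first mapped to a nonnegative integer, then coded with a Rice code (a Golomb code with power-of-two parameter). This is a generic illustration with hypothetical function names, not the paper's coder.

    ```python
    def zigzag(e):
        """Map a signed prediction residual to a nonnegative integer:
        0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
        return 2 * e if e >= 0 else -2 * e - 1

    def rice_encode(n, k):
        """Golomb-Rice code of nonnegative n with parameter k:
        unary-coded quotient (terminated by a 0), then k-bit remainder."""
        q, r = n >> k, n & ((1 << k) - 1)
        bits = "1" * q + "0"          # unary quotient
        if k:
            bits += format(r, f"0{k}b")  # fixed-width binary remainder
        return bits
    ```

    For example, a residual of -3 maps to 5, which with k = 2 encodes as quotient 1 and remainder 1, i.e. the bit string "1001".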
  • Non-uniform PPM and context tree models

    Publication Year: 1998, Page(s):279 - 288
    PDF (204 KB)

    The problem of optimizing PPM with the help of different choices of estimators and their parameters for different subsets of nodes in the context tree is considered. Methods of such optimization for Markov chain and context tree models for individual files and over given sets of files are presented, and it is demonstrated that the extension from Markov chain models to context tree models is necess...
  • A scalable entropy code

    Publication Year: 1998
    Cited by:  Papers (1)
    PDF (36 KB)

    Summary form only given. We present an algorithm for constructing entropy codes that allow progressive transmission. The algorithm constructs codes by forming an unbalanced tree in a fashion similar to Huffman coding. It differs, however, in that nodes are combined in a rate-distortion sense. Because nodes are formed with both rate and distortion in mind, each internal tree node, in addition to...
  • A locally optimal design algorithm for block-based multi-hypothesis motion-compensated prediction

    Publication Year: 1998, Page(s):239 - 248
    Cited by:  Papers (25)  |  Patents (59)
    PDF (228 KB)

    Multi-hypothesis motion-compensated prediction extends traditional motion-compensated prediction used in video coding schemes. Known algorithms for block-based multi-hypothesis motion-compensated prediction are, for example, overlapped block motion compensation (OBMC) and bidirectionally predicted frames (B-frames). This paper presents a generalization of these algorithms in a rate-distortion fram...
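    The common core of OBMC and B-frames is superposing several motion-compensated hypotheses. A toy two-hypothesis sketch (hypothetical names, list-of-lists frames, fixed weights) is shown below; the paper's contribution is the rate-distortion design of such schemes, which this deliberately omits.

    ```python
    def two_hypothesis_block(ref, y, x, h, w, mv1, mv2, wt=(0.5, 0.5)):
        """Predict an h-by-w block at (y, x) as a weighted sum of two
        motion-compensated hypotheses from reference frame ref
        (B-frame/OBMC-style superposition; an illustrative sketch only)."""
        def fetch(dy, dx):
            # copy the block displaced by motion vector (dy, dx)
            return [[ref[y + dy + i][x + dx + j] for j in range(w)]
                    for i in range(h)]
        a, b = fetch(*mv1), fetch(*mv2)
        return [[wt[0] * a[i][j] + wt[1] * b[i][j] for j in range(w)]
                for i in range(h)]
    ```

    Averaging two displaced blocks tends to cancel independent prediction noise in each hypothesis, which is the intuition behind both OBMC windows and bidirectional prediction.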
  • Lossless compression of pre-press images using linear color decorrelation

    Publication Year: 1998
    Cited by:  Papers (1)
    PDF (52 KB)

    Summary form only given. In the pre-press industry color images have both a high spatial and a high color resolution. Such images require a considerable amount of storage space and impose long transmission times. Data compression is desired to reduce these storage and transmission problems. Because of the high quality requirements in the pre-press industry only lossless compression is acceptable. ...
  • Reversible variable length codes for efficient and robust image and video coding

    Publication Year: 1998, Page(s):471 - 480
    Cited by:  Papers (42)  |  Patents (1)
    PDF (168 KB)

    The International Telecommunication Union (ITU) has adopted reversible variable length codes (RVLCs) for use in the emerging H.263+ video compression standard. As the name suggests, these codes can be decoded in two directions and can therefore be used by a decoder to enhance robustness in the presence of transmission bit errors. In addition, these RVLCs involve little or no efficiency loss relat...
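    The defining property of an RVLC is that the codeword set is both prefix-free and suffix-free, so a decoder can parse the bitstream from either end. A minimal checker for that property (illustrative only; not a code table from the standard):

    ```python
    from itertools import permutations

    def is_reversible(codewords):
        """True iff no codeword is a prefix or a suffix of another,
        i.e. the table is instantaneously decodable in both directions."""
        return not any(b.startswith(a) or b.endswith(a)
                       for a, b in permutations(codewords, 2))
    ```

    For instance, the symmetric set {"0", "11", "101"} passes, while an ordinary prefix code such as {"0", "01"} fails because it cannot be parsed backwards.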
  • On optimality of variants of the block sorting compression

    Publication Year: 1998
    Cited by:  Papers (2)
    PDF (76 KB)

    Summary form only given. Block sorting uses the Burrows-Wheeler transformation (BWT) which permutes an input string. The permutation is defined by the lexicographic order of contexts of symbols. If we assume that symbol probability is defined by preceding k symbols called context, symbols whose contexts are the same are collected in consecutive regions after the BWT. Sadakane (1997) proposed a var...
  • Improved lossless halftone image coding using a fast adaptive context template selection scheme

    Publication Year: 1998
    Cited by:  Papers (2)
    PDF (76 KB)

    Applications such as printing on demand and personalized printing have arisen where lossless halftone image compression can be useful for increasing transmission speed and lowering storage costs. We present an improvement on the context modeling scheme by adapting the context template to the periodic structure of the halftone image. This is a non-trivial problem for which we propose a fast close-t...
  • Successively refinable trellis coded quantization

    Publication Year: 1998, Page(s):83 - 92
    Cited by:  Papers (6)
    PDF (156 KB)

    We propose successively refinable trellis coded quantizers which are suitable for progressive transmission. A new trellis structure which is scalable is used in the design of our trellis coded quantizers. A hierarchical set partitioning is used to preserve successive refinability. Two algorithms for designing trellis coded quantizers which provide embedded bit streams are provided. The computatio...
  • Adaptive wavelet transforms for image coding using lifting

    Publication Year: 1998
    Cited by:  Papers (3)
    PDF (84 KB)

    Summary form only given. Image compression relies on efficient representations of images, and within smooth image regions, the wavelet transform provides such a representation. However, near edges, wavelet coefficients decay slowly and are expensive to code. We focus on improving the transform by incorporating adaptivity. Construction of nonlinear filter banks has been discussed, but the question ...
  • A multimode context-based lossless wavelet image coder

    Publication Year: 1998
    PDF (52 KB)

    Summary form only given. Currently, the most difficult part of a context-based lossless image compression system is determining the contexts. Techniques for choosing contexts tend to be based on practical experience. For example, Wu et al. proposed the gradient of the current pixel as the basis for choosing the contexts in CALIC. Weinberger et al. proposed a similar strate...
  • The context trees of block sorting compression

    Publication Year: 1998, Page(s):189 - 198
    Cited by:  Papers (20)  |  Patents (22)
    PDF (176 KB)

    The Burrows-Wheeler (1994) transform (BWT) and block sorting compression are closely related to the context trees of PPM. The usual approach of treating BWT as merely a permutation is not able to fully exploit this relation. We show that an explicit context tree for BWT can be efficiently generated by taking a subset of the corresponding suffix tree, identifying the central problems in exploiting ...
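    For reference, the Burrows-Wheeler transform itself can be written naively by sorting all rotations; practical block-sorting compressors instead build a suffix array or suffix tree, which is exactly the structure this paper relates to PPM context trees.

    ```python
    def bwt(s, sentinel="$"):
        """Naive Burrows-Wheeler transform: append a unique sentinel
        (assumed absent from s and lexicographically smallest), sort every
        rotation, and output the last column.  O(n^2 log n) -- a sketch
        only; real implementations sort suffixes instead of rotations."""
        t = s + sentinel
        rotations = sorted(t[i:] + t[:i] for i in range(len(t)))
        return "".join(r[-1] for r in rotations)
    ```

    The sorted rotations group symbols with equal following context into consecutive runs, which is why the output column compresses well and why the transform can be read as a traversal of context-tree regions.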