Data Compression Conference, 1994. DCC '94. Proceedings

Date: 29-31 March 1994

Displaying Results 1 - 25 of 46
  • Proceedings of IEEE Data Compression Conference (DCC'94)

    Publication Year: 1994
    Freely Available from IEEE
  • Customized JPEG compression for grayscale printing

    Publication Year: 1994, Page(s):156 - 165
    Cited by:  Papers (6)

    Describes a procedure by which JPEG compression may be customized for grayscale images that are to be compressed, halftoned, and printed. The technique maintains 100% compatibility with the JPEG standard, and is applicable with any halftoning algorithm. The JPEG quantization table is designed using frequency-domain characteristics of the halftoning patterns and the human visual system, and the Huf...

  • Variable dimension weighted universal vector quantization and noiseless coding

    Publication Year: 1994, Page(s):2 - 11
    Cited by:  Papers (9)  |  Patents (26)

    A new algorithm for variable dimension weighted universal coding is introduced. Combining the multi-codebook system of weighted universal vector quantization (WUVQ), the partitioning technique of variable dimension vector quantization, and the optimal design strategy common to both, variable dimension WUVQ allows mixture sources to be effectively carved into their component subsources, each of whi...

  • Bayes risk weighted VQ and learning VQ

    Publication Year: 1994, Page(s):400 - 409
    Cited by:  Papers (4)

    This paper examines two vector quantization algorithms which can combine the tasks of compression and classification: Bayes risk weighted vector quantization (BRVQ) proposed by Oehler et al. (1991), and optimized learning vector quantization 1 (OLVQ1) proposed by Kohonen et al. (1988). BRVQ uses a parameter λ to control the tradeoff between compression and classification. BRVQ performance i...

  • Syntax-constrained encoder optimization using adaptive quantization thresholding for JPEG/MPEG coders

    Publication Year: 1994, Page(s):146 - 155
    Cited by:  Papers (4)

    The authors show a rate-distortion optimal quantization technique to threshold the DCT coefficients in the industry image and video coding standards JPEG and MPEG respectively. Their scheme achieves a decent thresholding gain in terms of both objective SNR (about 1 dB) as well as perceived quality and uses a fast dynamic programming recursive structure which exploits certain monotonicity character...

  • A nonlinear VQ-based predictive lossless image coder

    Publication Year: 1994, Page(s):304 - 310
    Cited by:  Papers (9)

    A new lossless predictive image coder is introduced and tested. The predictions are made with a nonlinear, vector quantizer based, adaptive predictor. The prediction errors are losslessly compressed with an arithmetic coder that presumes they are Laplacian distributed with variances that are estimated during the prediction process, as in the approach of Howard and Vitter (1992).

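    The Laplacian residual model mentioned in the abstract can be made concrete: under a two-sided geometric (discrete Laplacian) distribution, the ideal code length of each residual follows directly from its probability. A minimal sketch of that residual model (not the paper's predictor), with the scale parameter b assumed given:

```python
import math

def laplacian_code_length(residuals, b):
    """Ideal code length in bits of integer prediction residuals under a
    two-sided geometric (discrete Laplacian) model with scale b.
    P(r) = q**abs(r) / z, where q = exp(-1/b) and z = (1 + q) / (1 - q)
    normalizes the distribution over all integers."""
    q = math.exp(-1.0 / b)
    z = (1.0 + q) / (1.0 - q)
    bits = 0.0
    for r in residuals:
        p = (q ** abs(r)) / z
        bits += -math.log2(p)  # ideal (arithmetic-coding) cost of this residual
    return bits
```

    Small residuals cost fewer bits than large ones, which is why a good predictor directly improves the compressed size.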
  • Online compression of video sequences using adaptive VQ codebooks

    Publication Year: 1994, Page(s):185 - 194
    Cited by:  Papers (9)

    Proposes a novel approach that combines the space covering property of high rate lattice VQ with the pattern matching ability of clustering VQ. The proposed scheme encompasses a broad range of online algorithms that use suitable VQ encodings and fixed-size, adaptive codebooks. The generic baseline algorithm for the scheme has the following desirable characteristics: the distortion per individual v...

  • Self-similarity of the multiresolutional image/video decomposition: smart expansion as compression of still and moving pictures

    Publication Year: 1994, Page(s):331 - 340
    Cited by:  Patents (1)

    The paper introduces a new combined fractal/multiresolutional image compression based on the observed property of self-similarity of the pyramidal image transform. The gist of the method is zooming out from a (possibly shrunken) low-resolution image producing a sharp and crisp “natural looking” high-resolution view, without blockiness and jaggedness. It is demonstrated that the techniq...

  • Huffman-type codes for infinite source distributions

    Publication Year: 1994, Page(s):83 - 89
    Request permission for commercial reuse

    A new sufficient condition is given for an infinite source distribution to share a minimum average codeword length code with a geometric distribution. Thus some new examples of parametric families of infinite source distributions can be optimally encoded by Huffman-type codes.

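    The classical result this paper builds on is that geometric distributions admit optimal prefix codes of Golomb type (Gallager and van Voorhis, 1975). A minimal Golomb encoder, as an illustration of such a Huffman-type code:

```python
import math

def golomb_encode(n, m):
    """Encode the non-negative integer n with Golomb parameter m, as a bit
    string: a unary quotient followed by a truncated-binary remainder.
    For a geometric source, a suitably chosen m makes this code
    minimum-redundancy."""
    q, r = divmod(n, m)
    bits = "1" * q + "0"          # unary part: q ones, then a terminating zero
    if m == 1:
        return bits               # remainder is always 0; nothing more to send
    b = math.ceil(math.log2(m))   # remainder takes b-1 or b bits
    k = (1 << b) - m              # number of short (b-1 bit) codewords
    if r < k:
        bits += format(r, "0%db" % (b - 1))
    else:
        bits += format(r + k, "0%db" % b)
    return bits
```

    Rice codes are the power-of-two special case (m = 2^k), which is why they turn up throughout lossless audio and image coders.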
  • Vector quantization of contextual information for lossless image compression

    Publication Year: 1994, Page(s):390 - 399
    Cited by:  Papers (8)  |  Patents (1)

    The authors present a new pruned tree structured vector quantization (TSVQ) algorithm, called incremental tree growing (ITG), which incrementally grows and quantizes the context tree locally on a level by level basis, thus drastically reducing both the memory requirements and the computational complexity. After the incremental tree growing is completed, terminal branches are globally vector quanti...

  • The minimax redundancy is a lower bound for most sources

    Publication Year: 1994, Page(s):52 - 61
    Cited by:  Papers (3)

    The capacity of the channel induced by a given class of sources is well known to be an attainable lower bound on the redundancy of universal codes w.r.t. this class, both in the minimax sense and in the Bayesian (maximin) sense. The authors' main contribution is a relatively simple proof that the capacity is essentially a lower bound also in a stronger sense, that is, for “most” sources ...

  • Parsing algorithms for dictionary compression on the PRAM

    Publication Year: 1994, Page(s):136 - 145
    Cited by:  Papers (8)

    Parallel algorithms for lossless data compression via dictionary compression using optimal and greedy parsing strategies are described. Dictionary compression removes redundancy by replacing substrings of the input by references to strings stored in a dictionary. Given a static dictionary stored as a suffix tree, the authors present a parallel random access concurrent read, concurrent write (PRAM ...

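    The greedy parsing strategy mentioned above is easy to state sequentially (the paper's contribution is performing such parsing in parallel on a PRAM). A toy sequential sketch, with a plain phrase list standing in for the suffix-tree dictionary:

```python
def greedy_parse(text, dictionary):
    """Greedy longest-match parsing: at each position, emit the longest
    dictionary phrase that matches there, falling back to a single
    literal character when no phrase matches."""
    out, i = [], 0
    while i < len(text):
        best = ""
        for phrase in dictionary:
            # str.startswith(prefix, i) tests a match at position i
            if text.startswith(phrase, i) and len(phrase) > len(best):
                best = phrase
        if best:
            out.append(best)
            i += len(best)
        else:
            out.append(text[i])  # literal fallback
            i += 1
    return out
```

    Greedy parsing is fast but not always bit-optimal, which is why the optimal parsing strategy is treated separately in the paper.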
  • A high performance fixed rate compression scheme for still image transmission

    Publication Year: 1994, Page(s):294 - 303
    Cited by:  Papers (13)

    Variable length source coding schemes have a very painful drawback when it comes to transmission of compressed images. Thus the authors restrict themselves to fixed rate source coding schemes, which do not have the problems of re-synchronization or complete loss of information in the case of uncorrectable transmission errors. In the paper, a high performance source coding scheme based on the discre...

  • Explicit bit minimization for motion-compensated video coding

    Publication Year: 1994, Page(s):175 - 184
    Cited by:  Papers (6)

    Compares methods for choosing motion vectors for motion-compensated video compression. The primary focus is on videophone and videoconferencing applications, where very low bit rates are necessary, where the motion is usually limited, and where the frames must be coded in the order they are generated. The authors provide evidence, using established benchmark videos of this type, that choosing moti...

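    The trade-off behind this paper — choosing motion vectors by coded-bit cost rather than residual energy alone — can be sketched as a cost-based block search. Everything here is illustrative: the sum of absolute differences stands in for the residual's coding cost, and mv_bits is an assumed callable giving the bit cost of a candidate vector:

```python
import numpy as np

def best_motion_vector(ref, block, top_left, search_range, mv_bits):
    """Exhaustive block search minimizing residual SAD plus the vector's
    own bit cost. top_left is the block's (row, col) in the current frame;
    returns ((dy, dx), cost) for the best candidate."""
    y, x = top_left
    h, w = block.shape
    best_cost, best_mv = float("inf"), (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            ry, rx = y + dy, x + dx
            # skip candidates whose reference window falls outside the frame
            if ry < 0 or rx < 0 or ry + h > ref.shape[0] or rx + w > ref.shape[1]:
                continue
            residual = block - ref[ry:ry + h, rx:rx + w]
            cost = np.abs(residual).sum() + mv_bits(dx, dy)
            if cost < best_cost:
                best_cost, best_mv = cost, (dy, dx)
    return best_mv, best_cost
```

    With a zero-residual match available, the search still weighs its vector cost against cheaper-to-code vectors with small residuals, which is the explicit bit minimization the title refers to.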
  • A rapidly adaptive lossless compression algorithm for high fidelity audio coding

    Publication Year: 1994, Page(s):430 - 439
    Cited by:  Papers (2)  |  Patents (26)

    A new adaptive algorithm for lossless compression of digital audio is presented. The algorithm is derived from ideas from both dictionary coding and source-modeling. An adaptive Lempel-Ziv (1977) style fixed dictionary coder is used to build a source model that fuels an arithmetic coder. As a result, variable length strings drawn from the source alphabet are mapped onto variable length strings tha...

  • Enhancement of block transform coded images using residual spectra adaptive postfiltering

    Publication Year: 1994, Page(s):321 - 330
    Cited by:  Papers (5)

    Image block transform techniques usually introduce several types of spatial periodic distortion which are mostly noticeable at low bit rates. One way to reduce these artifacts to obtain an acceptable visual quality level is to postfilter the decoded images using nonlinear space-variant adaptive filters derived from the structural relationships and residual spectral information provided by the disc...

  • Filter evaluation and selection in wavelet image compression

    Publication Year: 1994, Page(s):351 - 360
    Cited by:  Papers (15)  |  Patents (50)

    Choice of filter bank in wavelet compression is a critical issue that affects image quality as well as system design. Although regularity is sometimes used in filter evaluation, its success at predicting compression performance is only partial. A more reliable evaluation can be obtained by considering an L-level synthesis/analysis system as a single-input, single-output, linear shift-variant syste...

  • Data compression techniques for stock market prediction

    Publication Year: 1994, Page(s):72 - 82
    Cited by:  Patents (3)

    Presents advanced data compression techniques for predicting stock market behavior under widely accepted market models in finance. The techniques are applicable to technical analysis, portfolio theory, and nonlinear market models. The authors find that lossy and lossless compression techniques are well suited for predicting stock prices as well as market modes such as strong trends and major adju...

  • On lattice quantization noise

    Publication Year: 1994, Page(s):380 - 389
    Cited by:  Papers (6)

    Presents several results regarding the properties of a random vector, uniformly distributed over a lattice cell. This random vector is the quantization noise associated with dithered lattice quantization, and at high resolution it is the noise generated in regular lattice quantization of “smooth” sources. The authors find that the noise associated with the optimal lattice quantizers is...

  • A new multiple path search technique for residual vector quantizers

    Publication Year: 1994, Page(s):42 - 51
    Cited by:  Papers (4)

    Multiple path searching can provide varying degrees of joint search optimization of residual vector quantizer encoder stages. A shortcoming of the conventional multiple path M-search algorithm, however, is that joint search optimization of encoder stages is limited to consecutive stages. A new iterated multipath (IM)-search algorithm is introduced that is not subject to any particular ordering of...

  • A hybrid approach to text compression

    Publication Year: 1994, Page(s):225 - 233
    Cited by:  Papers (3)  |  Patents (3)

    Text compression schemes have sometimes been divided into two classes: symbolwise methods, which form a source model, typically using a finite context to predict symbols; and dictionary methods, which replace phrases (groups of symbols) in the input with a code. It is possible to decompose some dictionary methods into equivalent symbolwise methods. The decomposed method gives identical compression...

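    The "finite context to predict symbols" idea can be illustrated with an order-1 model; this is a generic sketch of symbolwise context modeling, not the decomposition the paper performs:

```python
from collections import defaultdict

def order1_probabilities(text):
    """Order-1 symbolwise model: estimate P(next char | previous char)
    from bigram counts. Symbolwise text compressors feed predictions of
    this kind to an entropy coder."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, cur in zip(text, text[1:]):
        counts[prev][cur] += 1
    model = {}
    for prev, nxt in counts.items():
        total = sum(nxt.values())
        # normalize counts into conditional probabilities
        model[prev] = {c: n / total for c, n in nxt.items()}
    return model
```

    A dictionary method decomposed into an equivalent symbolwise method would produce conditional predictions of exactly this shape, which is what makes the two classes comparable.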
  • Static compression for dynamic texts

    Publication Year: 1994, Page(s):126 - 135
    Cited by:  Papers (2)

    The authors have explored the particular needs of large information retrieval systems, in which hundreds of megabytes of data are stored, retrieval is non-sequential, and new text is continually being appended. It has been shown that the word-based model can be adapted to cope well both with dynamic environments, and with situations in which decode-time memory is limited. In the latter case as lit...

  • Fast bintree-structured image coder for high subjective quality

    Publication Year: 1994, Page(s):284 - 293
    Cited by:  Papers (2)

    A practical, fast, bintree-structured variable size block image coder is proposed. The inability of current segmentation-based image coding methods using piecewise functional approximation to represent textures is overcome by a generalized BTC encoding of approximation errors in texture areas. This new texture encoding technique significantly enhances subjective image quality at small extra costs ...

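    BTC itself is simple enough to sketch. This is the absolute-moment variant (block mean as threshold, group means as the two reconstruction levels), shown only as background; it is not the paper's generalized error-encoding scheme:

```python
import numpy as np

def btc_block(block):
    """AMBTC-style block truncation coding: reduce a block to a one-bit
    bitmap plus two reconstruction levels (the means of the pixels below
    and at-or-above the block mean)."""
    mean = block.mean()
    bitmap = block >= mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    high = block[bitmap].mean() if bitmap.any() else mean
    return bitmap, low, high

def btc_reconstruct(bitmap, low, high):
    """Rebuild the block: high where the bitmap is set, low elsewhere."""
    return np.where(bitmap, high, low)
```

    A two-level block like [[1, 1], [9, 9]] is reconstructed exactly; textured blocks incur a quantization error, which is what the paper's generalized BTC stage encodes.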
  • Lossless image compression with lossy image using adaptive prediction and arithmetic coding

    Publication Year: 1994, Page(s):166 - 174
    Cited by:  Papers (8)  |  Patents (1)

    Lossless gray scale image compression is necessary for many purposes, such as medical imaging, image databases and so on. Lossy images are important as well, because of the high compression ratio. The authors propose a lossless image compression scheme using a lossy image generated with the JPEG-DCT scheme. The concept is: send a JPEG-compressed lossy image first, then send residual informatio...

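    The send-lossy-first idea reduces to a split/merge pair: the coarse version goes out first and the residual restores the original exactly. In this sketch a uniform quantizer stands in for the JPEG-DCT stage, purely for illustration:

```python
import numpy as np

def split_lossy_residual(image, quantstep):
    """Split an image into a coarse (lossy) version plus the residual
    that makes reconstruction exact. A uniform quantizer plays the role
    of the lossy coder here."""
    lossy = np.round(image / quantstep) * quantstep
    residual = image - lossy
    return lossy, residual

def merge(lossy, residual):
    """Lossless reconstruction: lossy base plus residual."""
    return lossy + residual
```

    The compression win comes from the residual being small and peaked around zero, so it entropy-codes cheaply — the scheme in the paper uses adaptive prediction and arithmetic coding for that stage.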
  • Entropy-constrained tree-structured vector quantizer design by the minimum cross entropy principle

    Publication Year: 1994, Page(s):12 - 21
    Cited by:  Papers (2)

    The authors address the variable rate tree-structured vector quantizer design problem, wherein the rate is measured by the quantizer's entropy. For this problem, tree pruning via the generalized Breiman-Friedman-Olshen-Stone (1980) algorithm obtains solutions which are optimal over the restricted solution space consisting of all pruned trees derivable from an initial tree. However, the restriction...
