Data Compression Conference, 1996. DCC '96. Proceedings

March 31 – April 3, 1996

Displaying Results 1 - 25 of 85
  • Enhancing Lempel-Ziv codes using an on-line variable length binary encoding

    Publication Year: 1996

    Summary form only given. The LZW algorithm is the most popular dictionary-based adaptive text compression scheme [Welch 1984]. In the LZW algorithm, a changing dictionary contains common strings that have been encountered so far in the text. The motivation for the present research is to explore an on-line variable-length binary encoding. We apply this encoding to LZW codes for remedy of the proble...
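    The dictionary mechanism the abstract describes can be sketched in a few lines. This is a minimal, illustrative LZW encoder, not the paper's scheme; the `lzw_encode` name and the (index, bit-width) output format are assumptions of this sketch, with the code width simply tracking ceil(log2(dictionary size)) as an example of an on-line variable-length binary encoding.

```python
import math

def lzw_encode(text):
    """Plain LZW: emit (index, bit-width) pairs; the code width grows
    on-line with the dictionary, using ceil(log2(dict size)) bits."""
    dictionary = {chr(i): i for i in range(256)}  # start with single bytes
    w, out = "", []
    for c in text:
        wc = w + c
        if wc in dictionary:
            w = wc                                # extend the current match
        else:
            width = math.ceil(math.log2(len(dictionary)))
            out.append((dictionary[w], width))    # emit longest known prefix
            dictionary[wc] = len(dictionary)      # learn the new string
            w = c
    if w:
        out.append((dictionary[w], math.ceil(math.log2(len(dictionary)))))
    return out
```

    For example, `lzw_encode("abab")` emits three codes: the first is 8 bits wide, and the later two are 9 bits wide once the dictionary grows past 256 entries.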
  • Lossless compression using inversions on multiset permutations

    Publication Year: 1996

    Summary form only given. Linear prediction schemes, such as JPEG or BJPEG, are simple and normally result in a significant reduction in source entropy. Occasionally the entropy of the prediction error becomes greater than that of the original image. Such situations frequently occur when the image data has discrete gray-levels located within certain intervals. To alleviate this problem, various aut...
  • Combining tree and feature classification in fractal encoding of images

    Publication Year: 1996

    Summary form only given. One of the main problems with fractal compression of images is the long encoding time, due to the repeated search of the domain block pool. Faster search can be achieved through block classification. This is done by grouping the domain blocks independently and online into predefined classes. Only the class of a range block is then searched for a matching domain. Bani-Eqbal...
  • Efficient lossless compression of trees and graphs

    Publication Year: 1996
    Cited by:  Papers (3)

    Summary form only given, as follows. Data compression algorithms have been widely used in many areas to meet the demands of storing and transferring large volumes of data. Most data compression algorithms regard the input as a sequence of binary numbers and represent the compressed data also as a binary sequence. However, in many areas such as programming languages (e.g. LISP and C) and compiler de...
  • A low complexity wavelet based audio compression method

    Publication Year: 1996

    Summary form only given. Multimedia computing is becoming increasingly important to modern telecommunications. Audio compression is an integral part of multimedia applications. The unique operating environment of multimedia computing imposes many unique requirements on the audio compression algorithm, including a high compression rate, low complexity, and the ability to handle different audio ...
  • Combination coding: a new entropy coding technique

    Publication Year: 1996
    Cited by:  Patents (1)

    Summary form only given. Entropy coding is defined to be the compression of a stream of symbols taken from a known symbol set where the probability of occurrence of any symbol from the set at any given point in the stream is constant and independent of any known occurrences of any other symbols. Shannon and Fano showed that the information of such a sequence could be calculated. When measured in b...
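    The information bound the abstract refers to can be computed directly for such a memoryless source. A small sketch (function name is illustrative, not from the paper):

```python
import math
from collections import Counter

def shannon_entropy(stream):
    """Bits per symbol of a memoryless source: H = -sum p_i * log2(p_i).
    No entropy code can do better on average, which is the bound the
    abstract attributes to Shannon and Fano."""
    counts, n = Counter(stream), len(stream)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

    For instance, a stream of four equiprobable symbols has an entropy of exactly 2 bits per symbol.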
  • Global and local distortion inference during embedded zerotree wavelet decompression

    Publication Year: 1996

    Summary form only given. Local distortion inference is proposed as an alternative to assuming a spatially uniform error for image compression applications requiring careful error analysis. This paper presents an algorithm for inferring estimates of the L/sub 2/-norm distortion (root-mean-square or RMS error) for the embedded zerotree wavelet (EZW) algorithm while maintaining its good rate-distorti...
  • A high speed motion estimator using 2-D log search algorithm

    Publication Year: 1996

    Summary form only given. This paper describes the design of a high speed motion estimator using the 2-D log search algorithm. The architecture consists of 5 simple processing elements (PE) where each PE is capable of computing the sum-of-absolute-difference (SAD) to exploit the parallelism. For each step in the 2-D log search procedure, the 5 SADs of the 5 search points are computed in parallel. T...
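    The 5-point search step the abstract describes can be sketched in software. This is a hedged, sequential illustration of the 2-D log search (the paper's contribution is a parallel hardware architecture); the function names and the frame representation as nested lists are assumptions of this sketch.

```python
def sad(cur, ref, cx, cy, rx, ry, n):
    """Sum of absolute differences between the n x n block of cur at (cx, cy)
    and the candidate block of ref at (rx, ry)."""
    return sum(abs(cur[cy + j][cx + i] - ref[ry + j][rx + i])
               for j in range(n) for i in range(n))

def log2d_search(cur, ref, cx, cy, n, step=4):
    """2-D logarithmic search: evaluate 5 points (centre plus 4 axis
    neighbours) per step, recentre on the best one, then halve the step."""
    h, w = len(ref), len(ref[0])
    bx, by = cx, cy
    while step >= 1:
        cand = [(bx, by), (bx + step, by), (bx - step, by),
                (bx, by + step), (bx, by - step)]
        # keep candidate blocks inside the reference frame
        cand = [(x, y) for x, y in cand if 0 <= x <= w - n and 0 <= y <= h - n]
        bx, by = min(cand, key=lambda p: sad(cur, ref, cx, cy, p[0], p[1], n))
        step //= 2
    return bx - cx, by - cy  # motion vector
```

    In hardware, the 5 SAD evaluations inside each step are exactly what the paper's 5 processing elements compute in parallel.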
  • Efficient chain-code encoding for segmentation-based image compression

    Publication Year: 1996
    Cited by:  Papers (1)  |  Patents (1)

    Summary form only given. This paper presents a new and efficient method of encoding uniform image regions and lines. Regions and lines are obtained as the result of image segmentation, split and merge image compression, or as the output of line and polygon drawing algorithms. Lines and contours of uniform regions are encoded using chain-code. The chain-code is obtained in a way that is efficient w...
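    As background for the abstract above, a standard 8-direction (Freeman) chain code can be sketched as follows. This is the conventional scheme, not the paper's specific efficient variant; the function names are illustrative.

```python
# Freeman 8-direction chain code: map each step (dx, dy) between
# successive contour pixels to a direction symbol 0..7.
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    """Encode a contour (list of adjacent pixel coordinates) as symbols."""
    return [DIRS[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

def differential(codes):
    """Relative (turn) codes: smooth contours concentrate probability mass
    near 0, which an entropy coder can then exploit."""
    return codes[:1] + [(b - a) % 8 for a, b in zip(codes, codes[1:])]
```

    For a contour running right, right, then down, `chain_code` yields `[0, 0, 2]`.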
  • A lossy image codec based on index coding

    Publication Year: 1996
    Cited by:  Papers (16)

    Summary form only. We propose a new lossy image codec based on index coding. Both Shapiro's embedded zerotree wavelet algorithm, and Said and Pearlman's codetree algorithm use spatial orientation tree structures to implicitly locate the significant wavelet transform coefficients. A direct approach to find the positions of these significant coefficients is presented. The new algorithm combines the ...
  • High performance wavelet image compression optimized for MSE and HVS metrics

    Publication Year: 1996

    Summary form only. Wavelet still image compression has been a focus of intense research. Considerable coding gains over older DCT-based methods have been achieved, while the computational complexity has been made very competitive. We report on a high performance wavelet still image compression algorithm optimized for both mean-squared error (MSE) and human visual system (HVS) characteristics. We p...
  • Author index

    Publication Year: 1996
    Freely Available from IEEE
  • Image compression with iterated function systems, finite automata and zerotrees: grand unification

    Publication Year: 1996
    Cited by:  Papers (1)

    The paper deals with analysis, generalizations and unifications of the latest group of powerful image compression techniques: fractal image compression with iterated function systems (IFS), Culik's compression with finite automata and Shapiro's embedded coding of wavelet coefficients using zerotrees. All three techniques achieve premium results by exploiting properties of self-similarity of typica...
  • Hybrid block-based/segment-based video compression at very low bit rate

    Publication Year: 1996

    An algorithmic architecture of video compression at very low bit rate is presented. This architecture incorporates block-based as well as segment-based coding algorithms and manages them in a cooperative way in which different approaches are applied to different image frames iteratively: I and P frames are coded by a block-based algorithm and O frames are coded by a segment-based algorithm. An extra d...
  • Pattern matching image compression

    Publication Year: 1996
    Cited by:  Patents (3)

    We propose a non-transform image compression scheme based on approximate pattern matching, that we name pattern matching image compression (PMIC). The main idea behind it is a lossy extension of the Lempel-Ziv data compression scheme in which one searches for the longest prefix of an uncompressed image that approximately occurs in the already processed image. We consider both the Hamming distance...
  • Designing vector quantizers in the presence of source noise or channel noise

    Publication Year: 1996, Page(s):33 - 42

    The problem of vector quantizer empirical design for noisy channels or for noisy sources is studied. It is shown that the average squared distortion of a vector quantizer designed optimally from observing clean i.i.d. training vectors converges in expectation, as the training set size grows, to the minimum possible mean-squared error obtainable for quantizing the clean source and transmitting acro...
  • Bi-level document image compression using layout information

    Publication Year: 1996
    Cited by:  Patents (1)

    Most bi-level images stored on computers today comprise scanned text, and are stored using generic bi-level image technology based either on classical run-length coding, such as the CCITT Group 4 method, or on modern schemes such as JBIG that predict pixels from their local image context. However, image compression methods that are tailored specifically for images known to contain printed text can...
  • Compression of aerial ortho images based on image denoising

    Publication Year: 1996
    Cited by:  Papers (6)

    Abstract only given. Discusses the compression of an important class of computer images, called aerial ortho images, that result from geodetic transformation computations [Kinsner, 1994]. The computations introduce numerical noise, making the images nearly incompressible losslessly because of their high entropy. The use of classical lossy compression schemes is also not desirable because their eff...
  • Lossless and lossy compression of text images by soft pattern matching

    Publication Year: 1996, Page(s):210 - 219
    Cited by:  Papers (28)  |  Patents (1)

    We present a method for both lossless and lossy compression of bilevel images that consist mostly of printed or typed text. The key feature of the method is soft pattern matching, a way of making use of the information in previously encountered characters without risking the introduction of character substitution errors. We can obtain lossless compression about 20 percent better than that of the J...
  • Vector quantizer design using genetic algorithms

    Publication Year: 1996

    The design of vector quantizers (VQs) that yield minimal distortion is one of the most challenging problems in source coding. The problem of VQ design is to find a codebook that gives the least overall distortion (or equivalently, the largest signal-to-noise ratio (SNR)) for a given set of input vectors. This problem is known to be difficult as there are no known closed-form solutions. The general...
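    The objective the abstract describes is easy to state in code. A minimal sketch of the quantities involved (function names are illustrative; the genetic search itself is not shown):

```python
def quantize(vec, codebook):
    """Index of the nearest codeword under squared Euclidean distortion."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(vec, codebook[i])))

def total_distortion(vectors, codebook):
    """Overall distortion of a codebook on a training set: the quantity a
    design procedure (LBG iterations, or the paper's genetic algorithm)
    tries to minimize over all possible codebooks."""
    return sum(min(sum((a - b) ** 2 for a, b in zip(v, c)) for c in codebook)
               for v in vectors)
```

    The difficulty the abstract alludes to is that `total_distortion` is a non-convex function of the codebook, so local search can stall in poor minima; randomized global methods such as genetic algorithms are one response.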
  • Joint image classification and compression using hierarchical table-lookup vector quantization

    Publication Year: 1996, Page(s):23 - 32
    Cited by:  Papers (5)  |  Patents (2)

    Classification and compression play important roles today in communicating digital information and their combination is useful in many applications. The aim is to produce image classification without any further signal processing on the compressed image. This paper presents techniques for the design of block based joint classifier and quantizer classifiers/encoders implemented by table lookups. In...
  • Arbitrary tilings of the time-frequency plane using local bases

    Publication Year: 1996, Page(s):369 - 378

    We show how to obtain arbitrary tilings of the time-frequency plane using local orthogonal bases. These bases were recently constructed as a generalization of the cosine-modulated filter banks in discrete time, and local sine and cosine bases in continuous time. Due to the fact that they use a single prototype window, these bases also lead to time-varying tilings. Moreover, they have a fast implem...
  • An adaptive data compression method based on context sorting

    Publication Year: 1996, Page(s):160 - 169
    Cited by:  Papers (5)

    Every symbol in the data can be predicted by taking its immediately preceding symbols, or context, into account. This paper proposes a new adaptive data compression method based on a technique of context sorting. The aim of context sorting is to sort a set of contexts in order to find previous contexts similar to the current one. The proposed method predicts the next symbol by ranking the previous...
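    The ranking idea in the abstract can be illustrated with a toy predictor: score each earlier position by how long a suffix it shares with the current context, and predict the symbol that followed the best match. This is a hedged simplification of context sorting, not the paper's method; the function name and the brute-force scan are assumptions of this sketch.

```python
def predict_by_context(history, order=3):
    """Rank earlier positions by the length of the context (suffix) they
    share with the current position; predict the symbol that followed the
    best-matching context."""
    ctx = history[-order:]
    best_len, best_sym = -1, None
    for i in range(len(history) - 1):
        j = 0
        # common suffix length between history[..i] and the current context
        while j < len(ctx) and j <= i and history[i - j] == ctx[-1 - j]:
            j += 1
        if j > best_len:
            best_len, best_sym = j, history[i + 1]
    return best_sym
```

    For example, given the history `"abcab"`, the best match for the current context is the earlier `"ab"`, which was followed by `"c"`, so `"c"` is predicted.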
  • Audio coding using variable-depth multistage quantizers

    Publication Year: 1996
    Cited by:  Papers (1)

    Digital coding of high-fidelity audio signals has become a key technology in the development of cost-effective multimedia systems. Most audio coding algorithms rely on (i) removal of statistical redundancies in the signal and (ii) exploitation of masking properties of the human auditory system to hide distortions in the coded signal. Transform and subband coders provide a convenient framework for ...
  • Fast reconstruction of subband decomposed signals for progressive transmission

    Publication Year: 1996, Page(s):230 - 239
    Cited by:  Patents (3)

    We propose a fast reconstruction method for a progressive subband-decomposed signal coding system. It is shown that unlike the normal approach which contains a fixed computational complexity, the computational complexity of the proposed approach is proportional to the number of refined coefficients. Therefore, using the proposed approach in image coding applications, we can update the image after ...