Data Compression Conference, 1995. DCC '95. Proceedings

28-30 March 1995

Displaying Results 1 - 25 of 117
  • Proceedings DCC '95 Data Compression Conference [table of contents]

    Publication Year: 1995
    PDF (349 KB)
    Freely Available from IEEE
  • Generalized Lempel-Ziv parsing scheme and its preliminary analysis of the average profile

    Publication Year: 1995, Page(s): 262-271
    Cited by: Papers (4) | Patents (1)
    PDF (514 KB)

    The goal of this contribution is twofold: (i) to introduce a generalized Lempel-Ziv parsing scheme, and (ii) to analyze second-order properties of some compression schemes based on the above parsing scheme. We consider a generalized Lempel-Ziv parsing scheme that partitions a sequence of length n into variable phrases (blocks) such that a new block is the longest substring seen in the past by at m...

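    As a rough Python illustration of this family of schemes (my own sketch, with hypothetical function names; not the paper's generalized parser), the snippet below greedily splits a string so that each new phrase is the longest substring already seen in the processed prefix, plus one fresh symbol:

        def lz_parse(s: str):
            """Greedy LZ-style parse: each phrase is the longest substring
            occurring in the already-processed prefix, plus one new symbol."""
            phrases = []
            i = 0
            while i < len(s):
                seen = s[:i]
                length = 0
                # Grow the match while the candidate still occurs in the past.
                while i + length < len(s) and s[i:i + length + 1] in seen:
                    length += 1
                end = min(i + length + 1, len(s))  # add one innovation symbol
                phrases.append(s[i:end])
                i = end
            return phrases

        print(lz_parse("aababcbabcbc"))  # ['a', 'ab', 'abc', 'babcb', 'c']
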
  • A high performance block compression algorithm for small systems-software and hardware implementations

    Publication Year: 1995
    PDF (60 KB)

    Summary form only given. A new algorithmic approach to block data compression, using a highly contextual codification of the dictionary, that gives substantial compression-rate advantages over existing technologies, is described. The algorithm takes into account the limitations and characteristics of small systems, such as a low consumption of memory, high speed and short latency, as required by c...

  • Application of single-pass adaptive VQ to bilevel images

    Publication Year: 1995
    Cited by: Papers (1)
    PDF (73 KB)

    Summary form only given; substantially as follows. Constantinescu and Storer (1994) introduced a new single pass adaptive vector quantization algorithm that maintains a constantly changing dictionary of variable sized rectangles by "learning" larger rectangles from smaller ones as an image is processed. For lossy compression of gray scale images, this algorithm with no advance information or train...

  • Correction of fixed pattern background and restoration of JPEG compressed CCD images

    Publication Year: 1995
    PDF (76 KB)

    Summary form only given; substantially as follows. The present paper addresses the problem of the removal of the sensor background patterns, dark current and responsivity, from CCD images, when the uncorrected image was transmitted through a JPEG like block transform coding system. The work is of particular interest for imaging systems which operate under severe hardware restrictions, and require ...

  • Histogram analysis of JPEG compressed images as an aid in image deblocking

    Publication Year: 1995
    Cited by: Papers (1) | Patents (1)
    PDF (84 KB)

    Summary form only given, substantially as follows. Transform coded images suffer from specific image degradations. In the case of standard JPEG compression/decompression the image quality losses are known to be blocking effects resulting from mean value discontinuities along the 8×8 pixel block boundaries as well as ringing artifacts due to the limited precision of the reconstruction from linear c...

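    A hedged sketch of the kind of observation such histogram analysis can exploit (my own illustration; the truncated summary above does not spell out the method): after uniform quantization a DCT coefficient lies on a lattice, so its histogram is spiky and the spike spacing recovers the quantizer step, which can then constrain a deblocking filter.

        import numpy as np

        rng = np.random.default_rng(0)
        delta = 12.0  # hypothetical quantizer step
        # Laplacian-looking DCT coefficients after uniform quantization.
        coeff = np.round(rng.laplace(scale=20.0, size=10_000) / delta) * delta

        values = np.unique(coeff)
        print("estimated step:", np.diff(values).min())  # spikes sit delta apart
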
  • A posteriori restoration of block transform-compressed data

    Publication Year: 1995
    Cited by: Papers (2)
    PDF (70 KB)

    Summary form only given; substantially as follows. The NASA/JPL Galileo spacecraft uses lossy data compression for the transmission of its science imagery over the low-bandwidth communication system. The technique for image compression is a block transform technique based on the integer cosine transform (ICT), a derivative of the JPEG image compression standard where a computationally efficient in...

  • Quantization distortion in block transform-compressed data

    Publication Year: 1995
    Cited by: Papers (1) | Patents (8)
    PDF (68 KB)

    Summary form only given, as follows. The JPEG image compression standard is an example of a block transform-based compression scheme; the image is systematically subdivided into blocks that are individually transformed, quantized, and encoded. The compression is achieved by quantizing the transformed data, reducing the data entropy and thus facilitating efficient encoding. Block transform compress...

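    A minimal sketch of the generic pipeline described above (an 8x8 orthonormal DCT with a uniform coefficient quantizer, assumed for illustration; not the paper's analysis of the resulting distortion):

        import numpy as np

        N = 8
        k = np.arange(N)
        # Orthonormal DCT-II matrix: C[u, n] = a_u * cos(pi * (2n + 1) * u / (2N)).
        C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
        C[0] /= np.sqrt(2)

        def quantize_block(block, step=16):
            coeffs = C @ block @ C.T            # 2-D DCT of one block
            q = np.round(coeffs / step) * step  # uniform quantization
            return C.T @ q @ C                  # inverse transform

        rng = np.random.default_rng(0)
        block = rng.integers(0, 256, size=(N, N)).astype(float)
        rec = quantize_block(block)
        print("RMS quantization error:", np.sqrt(np.mean((block - rec) ** 2)))
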
  • Efficient handling of large sets of tuples with sharing trees

    Publication Year: 1995
    Cited by: Papers (1)
    PDF (78 KB)

    Summary form only given; substantially as follows. Computing with sets of tuples (n-ary relations) is often required in programming, while being a major cause of performance degradation as the size of sets increases. The authors present a new data structure dedicated to the manipulation of large sets of tuples, dubbed a sharing tree. The main idea to reduce memory consumption is to share some sub-...

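    The sketch below illustrates the sharing idea under my own assumptions (hash-consing identical sub-trees of a tuple trie so that common suffixes are stored once); the authors' sharing trees may differ in detail:

        def build(tuples):
            """Store a set of equal-length tuples as a DAG whose identical
            sub-trees are shared (hash-consed)."""
            table = {}  # hash-consing table: children signature -> (id, children)

            def intern(children):
                key = tuple(sorted(children.items()))
                if key not in table:
                    table[key] = (len(table), children)
                return table[key]

            def node(suffixes):
                if () in suffixes:  # every tuple fully consumed
                    return intern({})
                children = {}
                for head in sorted({t[0] for t in suffixes}):
                    rest = frozenset(t[1:] for t in suffixes if t[0] == head)
                    children[head] = node(rest)[0]
                return intern(children)

            return node(frozenset(tuples)), table

        root, table = build({(1, 2, 3), (1, 4, 3), (2, 2, 3)})
        print(len(table), "shared nodes")  # fewer nodes than a plain trie
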
  • Real-time VLSI compression for high-speed wireless local area networks

    Publication Year: 1995
    Cited by: Papers (2)
    PDF (67 KB)

    Summary form only given; substantially as follows. Presents a new compact, power-efficient, and scalable VLSI array for the first Lempel-Ziv (LZ) algorithm to be used in high-speed wireless data communication systems. Lossless data compression can be used to inexpensively halve the amount of data to be transmitted, thus improving the effective bandwidth of the communication channel and in turn...

  • On the performance of affine index assignments for redundancy free source-channel coding

    Publication Year: 1995
    Cited by: Papers (2)
    PDF (60 KB)

    Summary form only given. Many popular redundancy free codes are linear or affine, including the natural binary code (NBC), the folded binary code (FBC), the Gray code (GC), and the two's complement code (TCC). A theorem which considers the channel distortion of a uniform 2^n-level scalar quantizer with stepsize Δ, which uses an affine index assignment with generator matrix G to tran...

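    As a concrete instance of a linear index assignment, the check below (my own sketch, assuming only the standard definitions) verifies that the Gray code is the map b -> bG over GF(2), with G the identity plus the superdiagonal:

        import numpy as np

        n = 4
        G = (np.eye(n, dtype=int) + np.eye(n, k=1, dtype=int)) % 2

        def bits(x):
            """MSB-first bit vector of x."""
            return np.array([(x >> (n - 1 - i)) & 1 for i in range(n)])

        def val(b):
            """Bit vector back to an integer."""
            return int("".join(str(int(v)) for v in b), 2)

        for b in range(2 ** n):
            assert (b ^ (b >> 1)) == val(bits(b) @ G % 2)
        print("Gray code = bG over GF(2) for all", 2 ** n, "indices")
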
  • VQ-based model design algorithms for text compression

    Publication Year: 1995
    PDF (62 KB)

    Summary form only given. We propose a new approach for text compression where fast decoding is more desirable than encoding. An example of such a requirement is an information retrieval system. For efficient compression, high-order conditional probability information of text data is analyzed and modeled by utilizing the vector quantization concept. Generally, vector quantization (VQ) has been used for...

  • Classified conditional entropy coding of LSP parameters

    Publication Year: 1995
    Cited by: Papers (2)
    PDF (64 KB)

    Summary form only given. A new LSP speech parameter compression scheme is proposed which uses conditional probability information through classification. For efficient compression of speech LSP parameter vectors it is essential that higher order correlations are exploited. The use of conditional probability information has been hindered by the high complexity of the information. For example, an LSP vec...

  • Lattice-based designs of direct sum codebooks for vector quantization

    Publication Year: 1995
    Cited by: Papers (1)
    PDF (67 KB)

    Summary form only given. A direct sum codebook (DSC) has the potential to reduce both memory and computational costs of vector quantization. A DSC consists of several sets or stages of vectors. An equivalent code vector is made from the direct sum of one vector from each stage. Such a structure, with p stages containing m vectors each, has m^p equivalent code vectors, while requiring the stor...

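    A minimal sketch of the direct-sum idea (my own illustration with made-up sizes): p stages of m stored vectors implicitly define m**p equivalent code vectors, here searched exhaustively; a cheaper stage-by-stage search is the usual practical refinement.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(0)
        p, m, dim = 3, 4, 2                                     # p stages, m vectors each
        stages = [rng.normal(size=(m, dim)) for _ in range(p)]  # stores only p*m vectors

        def nearest(x):
            """Exhaustive search over the m**p equivalent code vectors."""
            best_d, best_idx = float("inf"), None
            for idx in product(range(m), repeat=p):
                c = sum(stages[s][i] for s, i in enumerate(idx))
                d = float(np.sum((x - c) ** 2))
                if d < best_d:
                    best_d, best_idx = d, idx
            return best_idx, best_d

        print(nearest(rng.normal(size=dim)))  # one index per stage
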
  • Extending Huffman coding for multilingual text compression

    Publication Year: 1995
    Cited by: Papers (2)
    PDF (66 KB)

    Summary form only given. We propose two new algorithms that are based on the 16-bit or 32-bit sampling character set and on the unique features of languages with a large number of distinct characters to improve the data compression ratios for multilingual text documents. We choose Chinese language using 16 bit character sampling as the representative language in our study. The first approach, call...

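    A minimal sketch of the starting point (plain Huffman coding with each character, e.g. a 16-bit Chinese character, treated as a single symbol; the paper's two algorithms add language-specific modeling that is not reproduced here):

        import heapq
        from collections import Counter

        def huffman_codes(text):
            """Huffman code over whole characters, however wide."""
            freq = Counter(text)
            heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
            heapq.heapify(heap)
            tick = len(heap)  # tie-breaker so dicts are never compared
            while len(heap) > 1:
                f1, _, c1 = heapq.heappop(heap)
                f2, _, c2 = heapq.heappop(heap)
                merged = {ch: "0" + code for ch, code in c1.items()}
                merged.update({ch: "1" + code for ch, code in c2.items()})
                heapq.heappush(heap, (f1 + f2, tick, merged))
                tick += 1
            return heap[0][2]

        codes = huffman_codes("数据压缩会议 data compression")
        print(sorted(codes.items(), key=lambda kv: len(kv[1]))[:5])
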
  • Winning-weighted competitive learning for image compression

    Publication Year: 1995
    PDF (74 KB)

    Summary form only given. The LBG algorithm and Kohonen learning algorithm (KLA) both have the problem of being trapped in local minima, the empty cell in LBG and the never winning codevector in KLA. Although compared to the LBG algorithm, Kohonen learning actually relaxes the initial limitation in design, it still relies on initial conditions. We point out the principle of maximum information pres...

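    A hedged sketch of the general family this belongs to (frequency-sensitive competitive learning, where win counts weight the distance so no code vector is starved; the authors' winning-weighted rule may differ in detail):

        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.normal(size=(1000, 2))
        codebook = rng.normal(size=(16, 2))
        wins = np.ones(len(codebook))

        for epoch in range(5):
            for x in data:
                d = wins * np.sum((codebook - x) ** 2, axis=1)  # win-weighted
                j = int(np.argmin(d))
                codebook[j] += (x - codebook[j]) / wins[j]      # shrinking step
                wins[j] += 1

        print("win counts:", wins.astype(int))  # no code vector left unused
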
  • An image segmentation method based on a color space distribution model

    Publication Year: 1995
    PDF (86 KB)

    Summary form only given. The use of image segmentation methods to perform second generation image coding has received considerable research attention because homogenized partial image data can be efficiently coded on a separate basis. Regarding color image coding, conventional segmentation techniques are especially useful when applied to a uniform color space, e.g., Miyahara et al. (see IEICE Tra...

  • FFT based fast architecture & algorithm for discrete wavelet transforms

    Publication Year: 1995
    PDF (65 KB)

    Summary form only given. A non-recursive (unlike classical dyadic decomposition) and fast Fourier transform based architecture for computing discrete wavelet transforms (DWT) of a one dimensional sequence is presented. The DWT coefficients at all resolutions can be generated simultaneously without waiting for generation of coefficients at a lower octave level. This architecture is faster than arch...

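    A minimal sketch of the FFT-convolution ingredient (one Haar DWT level computed via FFT-based circular convolution; the paper's architecture is more general and produces all octaves without recursion):

        import numpy as np

        def circ_conv_fft(x, f):
            """Circular convolution of x with filter f via the FFT."""
            fpad = np.concatenate([f, np.zeros(len(x) - len(f))])
            return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(fpad)))

        x = np.arange(8, dtype=float)
        h = np.array([1.0, 1.0]) / np.sqrt(2)   # Haar lowpass
        g = np.array([1.0, -1.0]) / np.sqrt(2)  # Haar highpass

        approx = circ_conv_fft(x, h)[1::2]      # filter, then keep every 2nd sample
        detail = circ_conv_fft(x, g)[1::2]
        print(approx, detail)
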
  • FFT based fast architecture & algorithm for discrete wavelet transforms

    Publication Year: 1995
    PDF (86 KB)

  • Context coding of parse trees

    Publication Year: 1995
    Cited by: Papers (4)
    PDF (64 KB)

    Summary form only given. General-purpose text compression works normally at the lexical level assuming that symbols to be encoded are independent or they depend on preceding symbols within a fixed distance. Traditionally such syntactical models have been focused on compression of source programs, but also other areas are feasible. The compression of a parse tree is an important and challenging par...

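    A minimal sketch of one plausible reading (my own assumptions: predict each node's symbol from its parent's symbol with adaptively updated counts, i.e. the model that would drive an arithmetic coder):

        import math
        from collections import defaultdict

        counts = defaultdict(lambda: defaultdict(int))
        alphabet = set()

        def code_length(tree, parent="ROOT"):
            """Total bits for a (symbol, children) tree under a parent-context model."""
            symbol, children = tree
            alphabet.add(symbol)
            ctx = counts[parent]
            total = sum(ctx.values()) + len(alphabet)  # Laplace smoothing
            p = (ctx[symbol] + 1) / total
            ctx[symbol] += 1
            return -math.log2(p) + sum(code_length(c, symbol) for c in children)

        tree = ("expr", [("expr", [("term", [])]), ("+", []), ("term", [])])
        print(f"{code_length(tree):.2f} bits")
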
  • Trade-off and applications of source-controlled channel decoding to still images

    Publication Year: 1995
    Cited by: Papers (3)
    PDF (75 KB)

    Summary form only given, as follows. For image transmission, using a new channel decoder, we present improvements in image quality leading to a much more graceful degradation in case of degrading channel conditions. The APRI-SOVA based on the Viterbi algorithm exploits the residual redundancy and correlation in the source bit stream without changing the transmitter. For three different quantizers ...

  • Algorithm evaluation for synchronous data compression

    Publication Year: 1995
    PDF (79 KB)

    Summary form only given. As part of an industry standardization effort, we have evaluated compression algorithms for throughput enhancement in a synchronous communication environment. Synchronous data compression systems are link layer compressors used between digital transmission devices in internetworks to increase effective throughput. Compression is capable of speeding such links, but achievab...

  • Queuing models of synchronous compressors

    Publication Year: 1995
    PDF (69 KB)

    Summary form only given. In synchronous compression, a lossless data compressor attempts to equalize the rates of two synchronous communication channels. Synchronous compression is of broad applicability in improving the efficiency of internetwork links over public digital networks. The most notable features of the synchronous compression application are the mixed traffic it must tolerate and the ...

  • Adaptive bidirectional time-recursive interpolation for deinterlacing

    Publication Year: 1995
    PDF (68 KB)

    Summary form only given. There exists a need for finding a good deinterlacing (scan format conversion) system, since, for example, currently available cameras are interlaced and the US HDTV Grand Alliance has put forward a proposal containing both interlaced and progressive scanning formats. On the other hand, over the next few years, the local broadcasting stations will find themselves in the posit...

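    For contrast with the adaptive method summarized above, a baseline intra-field deinterlacer (simple vertical line averaging, my own illustration) looks like this; the paper's approach additionally blends bidirectional temporal estimates:

        import numpy as np

        rng = np.random.default_rng(0)
        frame = rng.integers(0, 256, size=(8, 8)).astype(float)
        field = frame[0::2]               # one interlaced field: even lines only

        out = np.zeros_like(frame)
        out[0::2] = field
        for j in range(1, 8, 2):          # fill odd lines from vertical neighbours
            below = out[j + 1] if j + 1 < 8 else out[j - 1]
            out[j] = (out[j - 1] + below) / 2

        print("RMS vs. original:", np.sqrt(np.mean((frame - out) ** 2)))
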
  • Subband coding methods for seismic data compression

    Publication Year: 1995
    Cited by: Papers (4)
    PDF (80 KB)

    Summary form only given. A typical seismic analysis scenario involves collection of data by an array of seismometers, transmission over a channel offering limited data rate, and storage of data for analysis. Seismic data analysis is performed for monitoring earthquakes and for planetary exploration as in the planned study of seismic events on Mars. Seismic data compression systems are required to ...

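    A hedged sketch of the generic two-band approach (Haar analysis/synthesis with a coarser quantizer on the high band; the paper evaluates more elaborate filter banks and rate constraints on real seismic data):

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.cumsum(rng.normal(size=64))     # a rough seismic-like trace

        lo = (x[0::2] + x[1::2]) / np.sqrt(2)  # Haar analysis
        hi = (x[0::2] - x[1::2]) / np.sqrt(2)

        q_lo = np.round(lo / 0.1) * 0.1        # fine step for the low band
        q_hi = np.round(hi / 1.0) * 1.0        # coarse step for the high band

        y = np.empty_like(x)                   # Haar synthesis
        y[0::2] = (q_lo + q_hi) / np.sqrt(2)
        y[1::2] = (q_lo - q_hi) / np.sqrt(2)

        print("RMS error:", np.sqrt(np.mean((x - y) ** 2)))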