Data Compression Conference, 2003. Proceedings. DCC 2003

27 March 2003

Displaying Results 1 - 25 of 89
  • Proceedings DCC 2003. Data Compression Conference

    Publication Year: 2003

    The following topics are dealt with: low complexity code design; distributed source coding; side information source coding; error resilience; MAP criterion; video coding; video compression; linear programming optimization; unequal loss protection; predictive coding; joint source-channel rate allocation scheme; JPEG codestream compression; image compression; constrained quantization; universal mult...

  • Video compression with intra/inter mode switching and a dual frame buffer

    Publication Year: 2003, Page(s):63 - 72
    Cited by:  Papers (22)  |  Patents (1)

    Video codecs that use motion compensation have achieved performance improvements from the use of intra/inter mode switching decisions within a rate-distortion framework. A separate development has involved the use of multiple frame prediction, in which more than one past reference frame is available for motion estimation. It is shown that using a dual frame buffer, together with intra/inter mode s...
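
    As a toy illustration of the rate-distortion mode decision this entry builds on, the sketch below picks the macroblock mode minimizing J = D + lambda*R; the mode names, distortions, and rates are invented, and this is not the paper's codec.

```python
# Hypothetical sketch of a rate-distortion mode decision between intra and
# inter coding for one macroblock, assuming per-mode (distortion, rate)
# measurements are available.

def choose_mode(candidates, lam):
    """candidates: dict mode_name -> (distortion, rate_bits)."""
    def cost(item):
        _, (d, r) = item
        return d + lam * r          # Lagrangian cost J = D + lambda * R
    mode, _ = min(candidates.items(), key=cost)
    return mode

# Toy numbers: inter prediction from the short-term or long-term reference
# frame of a dual frame buffer, versus intra coding.
candidates = {
    "intra":           (120.0, 600),   # high rate, no drift
    "inter_short_ref": ( 40.0, 250),
    "inter_long_ref":  ( 55.0, 180),
}
print(choose_mode(candidates, lam=0.1))   # -> inter_short_ref
```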

  • Wyner-Ziv coding for video: applications to compression and error resilience

    Publication Year: 2003, Page(s):93 - 102
    Cited by:  Papers (50)  |  Patents (3)

    Two separate applications of Wyner-Ziv coding to motion video are considered. The Wyner-Ziv theorem on source coding with side information available only at the decoder suggests that an asymmetric video codec, where individual frames are encoded separately but decoded conditionally, could achieve efficiency comparable to current interframe video compression systems. First results on the Wyn...
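
    A minimal toy sketch of decoding with side information in the Wyner-Ziv spirit (all parameters invented; this is not the paper's codec): the encoder sends only the coset index of a quantized sample, and the decoder resolves the ambiguity with a correlated side-information value.

```python
# Toy Wyner-Ziv-style binning: the encoder transmits log2(NBINS) bits per
# sample; the decoder picks the coset member closest to its side information.

STEP = 1        # quantizer step (invented)
NBINS = 4       # number of cosets; encoder sends 2 bits per sample

def encode(x):
    q = round(x / STEP)
    return q % NBINS                 # coset (bin) index only

def decode(coset, side_info):
    # Pick the reconstruction level in this coset closest to the side info.
    base = round(side_info / STEP)
    best = min((base + d for d in range(-NBINS, NBINS + 1)
                if (base + d) % NBINS == coset),
               key=lambda q: abs(q * STEP - side_info))
    return best * STEP

x, y = 7.2, 6.9                      # source and correlated side information
print(decode(encode(x), y))          # -> 7, close to x
```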

  • An efficient joint source-channel rate allocation scheme for JPEG2000 codestreams

    Publication Year: 2003, Page(s):113 - 122
    Cited by:  Papers (3)

    A two-level, hybrid-optimization scheme is proposed for rate allocation of JPEG2000 (J2K) transmission over noisy channels. It combines forward error correction (FEC) and the J2K error resilience modes to minimize the expected end-to-end image distortion. Subject to a total target bit rate, it forms fixed-length channel packets in a straightforward manner. The proposed scheme utilizes the source d...
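
    A hedged sketch of the kind of expected-distortion optimization involved (the FEC levels, loss probabilities, and distortion gains below are invented, and the paper's two-level hybrid scheme is more involved): pick per-packet FEC levels that minimize expected distortion under a total bit budget, assuming an embedded stream where a packet helps only if all earlier packets arrive.

```python
# Exhaustive joint source-channel rate allocation over a tiny invented example.

from itertools import product

# Packet i reduces distortion by GAIN[i] if it and all earlier packets arrive.
# FEC level f costs COST[f] bits and leaves residual loss probability P_LOSS[f].
GAIN   = [50.0, 30.0, 15.0]
COST   = {0: 100, 1: 130, 2: 170}
P_LOSS = {0: 0.20, 1: 0.05, 2: 0.01}
BUDGET = 420                            # total target bits

def expected_distortion(levels, d0=100.0):
    d, p_prefix_ok = d0, 1.0
    for gain, f in zip(GAIN, levels):
        p_prefix_ok *= 1.0 - P_LOSS[f]  # embedded: need all earlier packets
        d -= gain * p_prefix_ok
    return d

best = min((lv for lv in product(P_LOSS, repeat=len(GAIN))
            if sum(COST[f] for f in lv) <= BUDGET),
           key=expected_distortion)
print(best, expected_distortion(best))
```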

  • PPM model cleaning

    Publication Year: 2003, Page(s):163 - 172
    Cited by:  Papers (3)

    The prediction by partial matching (PPM) algorithm uses cumulative frequency counts of input symbols in different contexts to estimate its probability distribution. The compression ratios achieved by PPM have not led to broader use of the scheme, mainly because of its high demand for computational resources. An algorithm that improves the memory usage of the PPM model is presented. T...
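
    A minimal sketch of the memory problem being addressed (the pruning rule is invented, not the paper's algorithm): an order-2 PPM-style frequency model whose least-used contexts are dropped whenever the context table grows past a cap.

```python
# Tiny PPM-style context model with naive "model cleaning" by pruning.

from collections import Counter, defaultdict

ORDER, LIMIT = 2, 4          # context length; max contexts kept (tiny demo)
model = defaultdict(Counter) # context (tuple of symbols) -> next-symbol counts

def update(text):
    for i in range(ORDER, len(text)):
        model[tuple(text[i - ORDER:i])][text[i]] += 1
        if len(model) > LIMIT:
            clean()

def clean():
    # "Model cleaning": drop the contexts with the smallest total counts.
    by_use = sorted(model, key=lambda c: sum(model[c].values()))
    for ctx in by_use[:len(model) - LIMIT]:
        del model[ctx]

update("abracadabra")
print({''.join(c): dict(v) for c, v in model.items()})
```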

  • High rate mismatch in entropy constrained quantization

    Publication Year: 2003, Page(s):173 - 182

    It is shown that if an asymptotically optimal sequence of variable rate codes is designed for a k-dimensional probability density function (pdf) g and then applied to another pdf f for which f/g is bounded, then the resulting mismatch, or loss of performance relative to the optimum, is given by the relative entropy or Kullback-Leibler divergence I(f||g). It is also shown that under the same...
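
    To make the quantity concrete, the snippet below evaluates the relative entropy I(f||g) for two invented 1-D Gaussians, both in closed form and by Monte Carlo; it only illustrates the divergence itself, not the quantizer-mismatch derivation.

```python
# Relative entropy between Gaussians: closed form vs. Monte Carlo estimate.

import math, random

mf, sf = 0.0, 1.0        # true pdf f = N(0, 1)
mg, sg = 0.5, 1.5        # design pdf g = N(0.5, 1.5^2)

closed = math.log(sg / sf) + (sf**2 + (mf - mg)**2) / (2 * sg**2) - 0.5

def log_pdf(x, m, s):
    return -0.5 * ((x - m) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))

random.seed(0)
xs = [random.gauss(mf, sf) for _ in range(200_000)]
mc = sum(log_pdf(x, mf, sf) - log_pdf(x, mg, sg) for x in xs) / len(xs)

print(f"I(f||g): closed form {closed:.4f} nats, Monte Carlo {mc:.4f}")
```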

  • Codelet parsing: quadratic-time, sequential, adaptive algorithms for lossy compression

    Publication Year: 2003, Page(s):223 - 232
    Cited by:  Papers (3)  |  Patents (1)

    The codelet parsing algorithms were proposed for lossy compression. The algorithms sequentially parse a given source sequence into phrases, called sourcelets, and map each sourcelet to a distorted phrase, called a codelet, such that the per-letter distortion between the two phrases does not exceed the desired distortion. The algorithms adaptively maintain a codebook and do not require any a priori kn...
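
    A loose sketch of the sourcelet/codelet idea (fixed-length phrases, Hamming distortion, and greedy codebook growth are all invented simplifications of the paper's algorithms):

```python
# Greedy lossy parsing: reuse a codelet when one is within the distortion
# bound, otherwise adapt by adding the new phrase to the codebook.

L, DMAX = 4, 1                    # phrase length; allowed mismatches per phrase

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def parse(seq):
    codebook, out = [], []
    for i in range(0, len(seq) - L + 1, L):
        phrase = seq[i:i + L]     # the next "sourcelet"
        near = [c for c in codebook if hamming(c, phrase) <= DMAX]
        if near:                  # reuse an existing codelet within distortion
            out.append(min(near, key=lambda c: hamming(c, phrase)))
        else:                     # otherwise adapt: add a new codelet
            codebook.append(phrase)
            out.append(phrase)
    return out, codebook

out, cb = parse("abcdabcxabcdzzzz")
print(out, cb)   # 'abcx' is mapped onto the earlier codelet 'abcd'
```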

  • In-place differential file compression

    Publication Year: 2003, Page(s):263 - 272
    Cited by:  Papers (2)  |  Patents (2)

    Algorithms for in-place differential file compression were presented, where a target file of size n is compressed with respect to a source file of size m using no additional space; that is, the space for the source file is overwritten by the decompressed target file so that at no time is more than MAX(m,n) total space used. From a theoretical point of view, an optimal solution (best possib...
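
    For reference, differential file compression is commonly expressed with COPY/ADD commands against the source file; the sketch below decodes such a delta (the command format is a generic one, and the in-place scheduling that is the paper's actual contribution is not modeled).

```python
# Generic delta decoding with COPY/ADD commands (not in-place).

def apply_delta(source: bytes, commands) -> bytes:
    target = bytearray()
    for cmd in commands:
        if cmd[0] == "COPY":              # ("COPY", offset_in_source, length)
            _, off, n = cmd
            target += source[off:off + n]
        else:                             # ("ADD", literal_bytes)
            target += cmd[1]
    return bytes(target)

src = b"the quick brown fox"
delta = [("COPY", 0, 10), ("ADD", b"red"), ("COPY", 15, 4)]
print(apply_delta(src, delta))            # b'the quick red fox'
```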

  • Searchable compressed representation of very sparse bitmaps

    Publication Year: 2003, Page(s):353 - 361

    Very sparse bitmaps are used in a wide variety of applications, ranging from adjacency matrices of large sparse graphs and representations of sparse space occupancy to book-keeping in databases. A method for coding very sparse bitmaps was proposed, based on pruning the binary space partition (BSP) tree under the minimum description length (MDL) principle. This new method for coding of...
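
    A toy recursive coder in the spirit of coding sparse bitmaps by space partition (the halving rule and bit alphabet are invented, and the MDL-based pruning itself is not modeled): an all-zero block costs a single bit, and any other block is split in half.

```python
# Recursive zero-block coding of a 1-D bitmap.

def code_bitmap(bits):
    if not any(bits):
        return "0"                       # whole block is empty: one bit
    if len(bits) == 1:
        return "1"                       # single set bit
    mid = len(bits) // 2
    return "1" + code_bitmap(bits[:mid]) + code_bitmap(bits[mid:])

bitmap = [0] * 16
bitmap[3] = bitmap[12] = 1               # a very sparse 16-bit map
code = code_bitmap(bitmap)
print(code, f"({len(code)} bits for {len(bitmap)} positions)")
```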

  • Code compression using variable-to-fixed coding based on arithmetic coding

    Publication Year: 2003, Page(s):382 - 391
    Cited by:  Papers (4)  |  Patents (1)

    Embedded computing systems are space and cost sensitive. Memory is one of the most restricted resources, posing serious constraints on program size. Code compression, a special case of data compression in which the input consists of machine instructions, has been proposed as a solution to this problem. Previous work in code compression has focused on either fixed-to-variable coding or dic...
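
    The paper builds its V2F code on arithmetic coding; purely as a baseline illustration of the variable-to-fixed idea, here is the classic Tunstall construction, which grows a parse dictionary by repeatedly expanding the most probable word until 2^k words exist, each then indexed by a fixed k-bit codeword.

```python
# Classic Tunstall variable-to-fixed code construction.

import heapq

def tunstall(probs, k):
    """probs: dict symbol -> probability. Returns the list of parse words."""
    # Max-heap of leaves keyed by word probability (negated for heapq).
    heap = [(-p, s) for s, p in probs.items()]
    heapq.heapify(heap)
    # Each expansion replaces one leaf with len(probs) leaves.
    while len(heap) + len(probs) - 1 <= 2 ** k:
        negp, word = heapq.heappop(heap)          # most probable leaf
        for s, p in probs.items():                # expand it by every symbol
            heapq.heappush(heap, (negp * p, word + s))
    return sorted(word for _, word in heap)

words = tunstall({"a": 0.7, "b": 0.3}, k=3)
print(words)    # 8 parse words, each mapped to a fixed 3-bit codeword
```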

  • On the optimality of embedded deadzone scalar-quantizers for wavelet-based L-infinite-constrained image coding

    Publication Year: 2003
    Cited by:  Papers (1)

    Summary form only given. Several methods for L-infinite-distortion constrained compression have been proposed that target a set of fixed reconstruction-error bounds. Recently, a wavelet-based L-infinite-constrained embedded image-coding technique was proposed that guarantees the required distortion bound while retaining the coding performance and scalability options of state-of-the-art...
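
    A small sketch of the deadzone scalar quantizer family in question, in its common JPEG2000-style form with an invented step size; the paper's optimality analysis of such quantizers under an L-infinite constraint is not reproduced.

```python
# Deadzone scalar quantization with midpoint reconstruction.

import math

DELTA = 2.0                       # invented step size

def quantize(x):
    # Deadzone: inputs with |x| < DELTA map to 0 (zero bin is 2*DELTA wide).
    return int(math.copysign(math.floor(abs(x) / DELTA), x))

def dequantize(q):
    # Midpoint reconstruction within the selected bin.
    return 0.0 if q == 0 else math.copysign((abs(q) + 0.5) * DELTA, q)

xs = [-3.7, -0.4, 0.9, 1.2, 4.9]
rec = [dequantize(quantize(x)) for x in xs]
print([f"{x} -> {r}" for x, r in zip(xs, rec)])
# Reconstruction error stays below DELTA (at most DELTA/2 outside the deadzone).
```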

  • Stack-run adaptive wavelet image coding

    Publication Year: 2003

    Summary form only given. An image coder based on an adaptive wavelet transform is presented, which employs a stack-run representation for quantized transform coefficients and exploits intra-subband redundancies. The compression algorithm can be divided into four parts. First, an adaptive wavelet packet basis is selected for representing the given image using certain entropy-based cost funct...
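
    A rough sketch of the representation involved, simplified to plain (zero-run, value) pairs rather than the full stack-run symbol alphabet: long zero runs left by wavelet quantization are captured by the run lengths.

```python
# Run-value pairing of quantized subband coefficients.

def stack_run(coeffs):
    out, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            out.append((run, c))     # run of zeros, then the significant value
            run = 0
    if run:
        out.append((run, None))      # trailing zeros
    return out

print(stack_run([0, 0, 5, 0, 0, 0, -2, 0, 1, 0, 0]))
# [(2, 5), (3, -2), (1, 1), (2, None)]
```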

  • The transport of reversible and unreversible embedded wavelets (TRUEW)

    Publication Year: 2003

    Summary form only given. Transport of reversible and unreversible embedded wavelets (TRUEW) is a system for the partial delivery of compressed JPEG 2000 files over a network. A discussion on the development of the TRUEW architecture, the benefits of the TRUEW architecture, an efficiency analysis of tile-part and precinct access to JPEG 2000 compressed images, and some comments on potential applicat...

  • Estimating and comparing entropies across written natural languages using PPM compression

    Publication Year: 2003
    Cited by:  Papers (2)

    Summary form only given. The measurement of the entropy of written English is extended to include the following written natural languages: Arabic, Chinese, French, Japanese, Korean, Russian, and Spanish. It was observed that translations of the same document have approximately the same size when compressed even though they have widely varying uncompressed sizes. In the experiment, an efficient com...
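
    The same experiment in miniature, with zlib standing in for the paper's PPM compressor and two toy strings standing in for document translations (the numbers are only illustrative):

```python
# Estimate entropy as compressed bits per character of the original text.

import zlib

def bits_per_char(text: str) -> float:
    data = text.encode("utf-8")
    return 8 * len(zlib.compress(data, 9)) / len(text)

samples = {
    "english": "the cat sat on the mat " * 40,
    "spanish": "el gato se sentó en la alfombra " * 40,
}
for lang, text in samples.items():
    print(f"{lang}: {bits_per_char(text):.2f} bits/char")
```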

  • An error-resilient blocksorting compression algorithm

    Publication Year: 2003

    Summary form only given. Susceptibility to errors in the compressed bit stream is a key limitation of adaptive lossless compression systems. The inherent design of these systems often requires that they discard all data subsequent to the error. This is especially problematic in the Burrows-Wheeler blocksorting transform (BWT), with 1 MB suffix-sorted blocks. Error-correcting codes, su...

  • A new visual masking tool for JPEG2000

    Publication Year: 2003

    Summary form only given. A nonuniform quantization scheme, based on the perceptual relevance of each channel signal component, has been developed to exploit human visual system (HVS) properties. This strategy exploits the visual masking effect by performing a subband decomposition. An evaluation of the quality was performed using psychophysical measures for validation. The obtained results denote...

  • Compression of RADARSAT data with block adaptive wavelets

    Publication Year: 2003

    Summary form only given. A new algorithm, referred to as the wavelet packet-based embedded blocking code (WPEB), was developed for synthetic aperture radar (SAR) data compression. This algorithm combines the different properties of wavelet packet decomposition, block coding, and speckle reduction. Better results were obtained by using the wavelet packet transform that decomposes the higher frequen...

  • Binary combinatorial coding

    Publication Year: 2003
    Cited by:  Papers (2)  |  Patents (4)

    Summary form only given. A novel binary entropy code, called combinatorial coding (CC), is presented. The theoretical basis for CC has been described previously under the context of universal coding, enumerative coding, and minimum description length. The code described in these references works as follows: assume the source data are binary of length M, memoryless, and generated with an unknown pa...
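
    A compact sketch of the enumerative idea described (generic enumerative coding; the paper's exact scheme may differ): send the weight k of an M-bit block, then the block's rank among all C(M, k) strings of that weight, costing about log2 C(M, k) bits.

```python
# Lexicographic rank of a binary block among equal-weight blocks.

from math import comb

def rank(bits):
    """Lexicographic index of `bits` among equal-weight strings."""
    r, k = 0, sum(bits)
    for i, b in enumerate(bits):
        if b:
            # Count strings that put a 0 here instead (they come first).
            r += comb(len(bits) - i - 1, k)
            k -= 1
    return r

block = [0, 1, 0, 0, 1, 1, 0, 0]
print(rank(block), "of", comb(len(block), sum(block)), "possible blocks")
```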

  • A novel RAM architecture for bit-plane based coding

    Publication Year: 2003

    Summary form only given. An optimized memory organization has been designed for the hierarchical coding of wavelet subbands. Changing the RAM access pattern and using multiple-location access at each clock instant can achieve better economy in time and resources. The bit-planes are distributed along the z-direction, and the x-y plane contains 256 × 256 memory elements. The ...

  • Almost work-optimal PRAM EREW decoders of LZ compressed text

    Publication Year: 2003

    Summary form only given. The parallel complexity of LZ compression and decompression has been studied. Parallel algorithms have been designed for LZ1 compression and decompression. LZ2 compression is hardly parallelizable, since it is P-complete. A nearly work-optimal parallel decoding algorithm was shown, which runs on the EREW PRAM in O(log n) time with O(n/(log n)^(1/2)) processors for text c...

  • Rate-distortion bound for joint compression and classification

    Publication Year: 2003
    Cited by:  Papers (1)

    Summary form only given. Rate-distortion theory is applied to the problem of joint compression and classification. A Lagrangian distortion measure is used to consider both the Euclidean error in reconstructing the original data as well as the classification performance. The bound is calculated based on an alternating-minimization procedure, representing an extension of the Blahut-Arimoto algorithm...
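
    For reference, a plain Blahut-Arimoto iteration for the classical rate-distortion function with squared error on an invented toy source; the Lagrangian classification term that is the paper's extension is not reproduced.

```python
# Blahut-Arimoto for R(D) of a small discrete source under squared error.

import math

X  = [-1.0, 0.0, 1.0]          # source alphabet
PX = [0.25, 0.5, 0.25]
Y  = X                          # reproduction alphabet
D  = [[(x - y) ** 2 for y in Y] for x in X]

def blahut_arimoto(beta, iters=300):
    q = [1 / len(Y)] * len(Y)                    # output marginal
    for _ in range(iters):
        # Optimal conditional p(y|x) for current q, then re-derive q.
        w = [[q[j] * math.exp(-beta * D[i][j]) for j in range(len(Y))]
             for i in range(len(X))]
        p = [[wij / sum(wi) for wij in wi] for wi in w]
        q = [sum(PX[i] * p[i][j] for i in range(len(X))) for j in range(len(Y))]
    rate = sum(PX[i] * p[i][j] * math.log2(p[i][j] / q[j])
               for i in range(len(X)) for j in range(len(Y)) if p[i][j] > 0)
    dist = sum(PX[i] * p[i][j] * D[i][j]
               for i in range(len(X)) for j in range(len(Y)))
    return rate, dist

print("R = %.3f bits, D = %.3f" % blahut_arimoto(beta=2.0))
```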

  • Optimal variable rate multiplexing of scalable code streams

    Publication Year: 2003
    Cited by:  Papers (1)

    Summary form only given. The problem of multiplexing of several rate-distortion scalable multimedia code streams to be transmitted via a variable-rate channel is investigated. The aim is to minimize the expected distortion at the receiver, weighted by the probability distribution of the truncation point, P(l). Firstly, a very simple algorithm is presented which solves the problem in the case when ...

  • Enhanced Sequitur for finding structure in data

    Publication Year: 2003

    Summary form only given. The enhancements made to Sequitur for two specific kinds of input, DNA strings and digitized music, are described. Because of the natural orientation of the double-helix DNA structure, DNA sequences bond to their reverse complements. Sequitur is enhanced to recognize reverse complements. These enhancements improved Sequitur's ability to compress and discover str...

  • Image foveation based on vector quantization

    Publication Year: 2003
    Cited by:  Papers (2)

    Summary form only given. The perceptual resolution of vision is highly space variant: it is highest at the point of fixation and decreases rapidly away from that point. Novel unstructured and structured vector quantization (VQ) schemes are proposed to take advantage of this property of the human visual system (HVS) by providing the best image quality around the fixation point. As a foveation tech...

  • Network source coding using entropy constrained dithered quantization

    Publication Year: 2003
    Cited by:  Papers (6)

    Summary form only given. Assuming the squared-error distortion measure, performance bounds are derived for multi-resolution (MR), multiple-access (MA), and broadcast-system (BS) source codes built from scalar entropy-constrained dithered quantization (SECDQ). The resulting performance for arbitrary source distributions is discussed.
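
    A small demonstration of the building block named here, subtractive dithered uniform quantization (the standard textbook construction; step size and source are invented): with dither U ~ Uniform(-DELTA/2, DELTA/2) shared by encoder and decoder, the reconstruction error is uniform and independent of the source, so the MSE is DELTA^2/12 for any source distribution.

```python
# Subtractive dithered uniform quantization: MSE matches DELTA^2/12.

import random

DELTA = 1.0

def ecdq(x, u):
    q = round((x + u) / DELTA) * DELTA    # quantize the dithered value
    return q - u                          # decoder subtracts the same dither

random.seed(1)
errs = []
for _ in range(100_000):
    x = random.gauss(0, 3)
    u = random.uniform(-DELTA / 2, DELTA / 2)
    errs.append(ecdq(x, u) - x)

mse = sum(e * e for e in errs) / len(errs)
print(f"MSE = {mse:.4f} (theory: DELTA^2/12 = {DELTA**2 / 12:.4f})")
```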
