
Data Compression Conference, 2003. Proceedings. DCC 2003

Date 27 March 2003

  • Proceedings DCC 2003. Data Compression Conference

    Publication Year: 2003

    The following topics are dealt with: low complexity code design; distributed source coding; side information source coding; error resilience; MAP criterion; video coding; video compression; linear programming optimization; unequal loss protection; predictive coding; joint source-channel rate allocation scheme; JPEG codestream compression; image compression; constrained quantization; universal mult...

  • Video compression with intra/inter mode switching and a dual frame buffer

    Publication Year: 2003, Page(s):63 - 72
    Cited by:  Papers (21)  |  Patents (1)

    Video codecs that use motion compensation have achieved performance improvements from the use of intra/inter mode switching decisions within a rate-distortion framework. A separate development has involved the use of multiple frame prediction, in which more than one past reference frame is available for motion estimation. It is shown that using a dual frame buffer, together with intra/inter mode s...

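    In a rate-distortion framework, the intra/inter switching decision reduces to a per-block Lagrangian cost comparison. The sketch below illustrates that comparison only; the mode names, the candidate numbers, and the idea of probing both references of the dual frame buffer are assumptions for illustration, not the paper's codec.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    mode: str          # e.g. "intra", "inter-short-term", "inter-long-term"
    distortion: float  # reconstruction error for the block in this mode
    rate: float        # bits needed to code the block in this mode

def best_mode(candidates, lam):
    """Lagrangian mode decision: pick the candidate minimizing J = D + lam * R."""
    return min(candidates, key=lambda c: c.distortion + lam * c.rate)

# With a dual frame buffer, inter prediction is evaluated against both the
# short-term and the long-term reference before being compared with intra.
cands = [
    Candidate("intra", distortion=120.0, rate=96.0),
    Candidate("inter-short-term", distortion=80.0, rate=40.0),
    Candidate("inter-long-term", distortion=95.0, rate=30.0),
]
print(best_mode(cands, lam=1.0).mode)  # -> inter-short-term
```
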
  • Wyner-Ziv coding for video: applications to compression and error resilience

    Publication Year: 2003, Page(s):93 - 102
    Cited by:  Papers (48)  |  Patents (3)

    Two separate applications of Wyner-Ziv coding to motion video are considered. The Wyner-Ziv theorem on source coding with side information available only at the decoder suggests that an asymmetric video codec, where individual frames are encoded separately but decoded conditionally, could achieve efficiency comparable to current interframe video compression systems. First results on the Wyn...

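    For reference, the Wyner-Ziv rate-distortion function for coding a source X with side information Y available only at the decoder can be stated as follows (the textbook form of the theorem, not notation taken from this paper):

```latex
R_{WZ}(D) \;=\; \min_{p(z \mid x),\; f}\; \bigl[\, I(X;Z) - I(Y;Z) \,\bigr]
\quad \text{subject to} \quad \mathbb{E}\, d\bigl(X, f(Y,Z)\bigr) \le D .
```
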
  • An efficient joint source-channel rate allocation scheme for JPEG2000 codestreams

    Publication Year: 2003, Page(s):113 - 122
    Cited by:  Papers (3)

    A two-level, hybrid-optimization scheme is proposed for rate allocation of JPEG2000 (J2K) transmission over noisy channels. It combines forward error correction (FEC) and the J2K error resilience modes to minimize the expected end-to-end image distortion. Subject to a total target bit rate, it forms fixed-length channel packets in a straightforward manner. The proposed scheme utilizes the source d...

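    Stripped of the paper's specifics, a joint source-channel rate allocation of this kind can be read as the following optimization, where R_s is the source-coding rate, R_c the FEC rate, and the expectation is over channel behavior (notation assumed here for illustration):

```latex
\min_{R_s,\, R_c}\; \mathbb{E}\bigl[\, D(R_s, R_c) \,\bigr]
\qquad \text{subject to} \qquad R_s + R_c \le R_{\text{total}} .
```
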
  • PPM model cleaning

    Publication Year: 2003, Page(s):163 - 172
    Cited by:  Papers (2)

    The prediction by partial matching (PPM) algorithm uses cumulative frequency counts of input symbols in different contexts to estimate its probability distribution. The compression ratios yielded by the PPM algorithm have nevertheless not led to broader use of the scheme, mainly because of its high demand for computational resources. An algorithm that improves the memory usage of the PPM model is presented. T...

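    A minimal sketch of the context-counting idea behind PPM, assuming nothing about the paper's memory-cleaning algorithm itself: per-context frequency tables drive the probability estimate, backing off from the longest matching context. Real PPM adds escape probabilities and arithmetic coding; all names here are illustrative.

```python
from collections import defaultdict

class PPMCounts:
    """Per-context symbol frequency counts, as in the PPM family (toy version)."""

    def __init__(self, max_order=2):
        self.max_order = max_order
        self.counts = defaultdict(lambda: defaultdict(int))

    def update(self, history, symbol):
        # Credit the symbol in every context of order 0..max_order.
        for k in range(min(self.max_order, len(history)) + 1):
            ctx = history[len(history) - k:]
            self.counts[ctx][symbol] += 1

    def estimate(self, history, symbol):
        # Back off from the longest matching context to shorter ones.
        for k in range(min(self.max_order, len(history)), -1, -1):
            ctx = history[len(history) - k:]
            table = self.counts.get(ctx)
            if table and symbol in table:
                return table[symbol] / sum(table.values())
        return None  # unseen everywhere; real PPM escapes to a wider model

model = PPMCounts(max_order=2)
text = "abracadabra"
for i, ch in enumerate(text):
    model.update(text[:i], ch)
print(model.estimate("abr", "a"))  # -> 1.0: "br" was always followed by "a"
```
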
  • High rate mismatch in entropy constrained quantization

    Publication Year: 2003, Page(s):173 - 182

    It is shown that if an asymptotically optimal sequence of variable rate codes is designed for a k-dimensional probability density function (pdf) g and then applied to another pdf f for which f/g is bounded, then the resulting mismatch or loss of performance from the optimal possible is given by the relative entropy or Kullback-Leibler divergence I(f‖g). It is also shown that under the same...

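    The relative entropy in question is the Kullback-Leibler divergence; for densities f and g,

```latex
I(f \,\|\, g) \;=\; \int f(x)\, \log \frac{f(x)}{g(x)}\, dx ,
```

    which, per the result above, is exactly the high-rate performance loss from using a code designed for g on a source distributed as f.
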
  • Codelet parsing: quadratic-time, sequential, adaptive algorithms for lossy compression

    Publication Year: 2003, Page(s):223 - 232
    Cited by:  Papers (3)  |  Patents (1)

    Codelet parsing algorithms were proposed for lossy compression. The algorithms sequentially parse a given source sequence into phrases, called sourcelets, and map each sourcelet to a distorted phrase, called a codelet, such that the per-letter distortion between the two phrases does not exceed the desired distortion. The algorithms adaptively maintain a codebook, and do not require any a priori kn...

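    A very loose sketch of the parse-and-map idea, using per-letter Hamming distortion on strings; the greedy longest-match rule and the codebook growth rule here are invented for illustration and are not the paper's codelet-parsing algorithms.

```python
def within_distortion(a, b, d_max):
    """Per-letter Hamming distortion between equal-length strings a and b."""
    return sum(x != y for x, y in zip(a, b)) / len(a) <= d_max

def parse(source, codebook, d_max):
    out, i = [], 0
    while i < len(source):
        # Longest codelet representing the upcoming sourcelet within d_max.
        best = max(
            (c for c in codebook
             if i + len(c) <= len(source)
             and within_distortion(source[i:i + len(c)], c, d_max)),
            key=len, default=None)
        if best is None:
            best = source[i]                       # fall back to a literal
            codebook.add(best)
        out.append(best)
        codebook.add(source[i:i + len(best) + 1])  # grow the codebook adaptively
        i += len(best)
    return out

print(parse("10110100", {"0", "1", "10"}, d_max=0.0))
# at d_max = 0 the parse is lossless: ['10', '1', '101', '0', '0']
```
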
  • In-place differential file compression

    Publication Year: 2003, Page(s):263 - 272
    Cited by:  Papers (2)  |  Patents (2)

    Algorithms for in-place differential file compression were presented, where a target file of size n is compressed with respect to a source file of size m using no additional space; that is, the space for the source file is overwritten by the decompressed target file so that at no time is more than a total of MAX(m,n) space used. From a theoretical point of view, an optimal solution (best possib...

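    The differential setting above can be made concrete with the usual COPY/ADD delta model: the target is emitted as copies out of the source plus literal additions. This toy version ignores the in-place ordering constraint that is the paper's actual subject; the names and the greedy matcher are illustrative.

```python
def delta_encode(source, target, min_match=4):
    """Greedy COPY(offset, length) / ADD(char) delta of target against source."""
    ops, i = [], 0
    while i < len(target):
        best_off, best_len = -1, 0
        for off in range(len(source)):          # longest match in the source
            l = 0
            while (off + l < len(source) and i + l < len(target)
                   and source[off + l] == target[i + l]):
                l += 1
            if l > best_len:
                best_off, best_len = off, l
        if best_len >= min_match:
            ops.append(("COPY", best_off, best_len)); i += best_len
        else:
            ops.append(("ADD", target[i])); i += 1
    return ops

def delta_decode(source, ops):
    return "".join(source[o[1]:o[1] + o[2]] if o[0] == "COPY" else o[1]
                   for o in ops)

src, tgt = "the quick brown fox", "the quiet brown fox"
ops = delta_encode(src, tgt)
print(ops)                                      # two COPYs plus two ADDs
assert delta_decode(src, ops) == tgt
```
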
  • Searchable compressed representation of very sparse bitmaps

    Publication Year: 2003, Page(s):353 - 361

    Very sparse bitmaps are used in a wide variety of applications, ranging from adjacency matrices representing large sparse graphs and representations of sparse space occupancy to book-keeping in databases. A method based on pruning of the binary space partition (BSP) tree under the minimum description length (MDL) principle for coding very sparse bitmaps was proposed. This new method for coding of...

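    The BSP idea can be illustrated with a one-dimensional toy: halve the index range recursively, spend a single '0' on any all-zero region, and recurse otherwise. The paper's contribution, MDL-driven pruning of the partition tree, is not modeled here.

```python
def bsp_encode(bits, lo=0, hi=None, out=None):
    """Recursive zero-pruned description of a bitmap (toy BSP coder)."""
    if out is None:
        out, hi = [], len(bits)
    if not any(bits[lo:hi]):
        out.append("0")          # whole region is empty: one symbol suffices
    elif hi - lo == 1:
        out.append("1")          # a single set bit
    else:
        out.append("1")          # region is occupied: split and recurse
        mid = (lo + hi) // 2
        bsp_encode(bits, lo, mid, out)
        bsp_encode(bits, mid, hi, out)
    return "".join(out)

bitmap = [0] * 16
bitmap[5] = 1
print(bsp_encode(bitmap))  # '110110100': 9 symbols instead of 16
```
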
  • Code compression using variable-to-fixed coding based on arithmetic coding

    Publication Year: 2003, Page(s):382 - 391
    Cited by:  Papers (4)  |  Patents (1)

    Embedded computing systems are space and cost sensitive. Memory is one of the most restricted resources and poses serious constraints on program size. Code compression, a special case of data compression in which the input consists of machine instructions, has been proposed as a solution to this problem. Previous work in code compression has focused on either fixed-to-variable coding or dic...

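    For contrast with the arithmetic-coding-based scheme described above, the classic variable-to-fixed construction is Tunstall's: grow a parse tree by repeatedly splitting the most probable leaf until 2^L phrases exist, then assign each phrase a fixed L-bit codeword. A compact sketch:

```python
import heapq

def tunstall(probs, codeword_bits):
    """Tunstall V2F codebook: phrases for fixed-length codewords (toy version)."""
    heap = [(-p, sym) for sym, p in probs.items()]   # leaves: (neg prob, phrase)
    heapq.heapify(heap)
    # Each split removes one leaf and adds |alphabet| children.
    while len(heap) + len(probs) - 1 <= 2 ** codeword_bits:
        neg_p, phrase = heapq.heappop(heap)          # most probable leaf
        for sym, p in probs.items():
            heapq.heappush(heap, (neg_p * p, phrase + sym))
    return sorted(phrase for _, phrase in heap)

print(tunstall({"a": 0.7, "b": 0.3}, codeword_bits=2))
# -> ['aaa', 'aab', 'ab', 'b']: each phrase maps to one fixed 2-bit codeword
```
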
  • Estimating and comparing entropies across written natural languages using PPM compression

    Publication Year: 2003
    Cited by:  Papers (2)

    Summary form only given. The measurement of the entropy of written English is extended to include the following written natural languages: Arabic, Chinese, French, Japanese, Korean, Russian, and Spanish. It was observed that translations of the same document have approximately the same size when compressed even though they have widely varying uncompressed sizes. In the experiment, an efficient com...

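    The experimental idea above reduces to a one-liner: compressed size in bits divided by character count approximates the entropy rate. The paper uses a PPM compressor; bz2 stands in below only because it ships with Python, so the absolute numbers will differ.

```python
import bz2

def bits_per_char(text):
    """Compression-based upper estimate of entropy in bits per character."""
    return 8 * len(bz2.compress(text.encode("utf-8"), 9)) / len(text)

sample = "the quick brown fox jumps over the lazy dog " * 200
print(f"{bits_per_char(sample):.2f} bits/char")
# Translations of one document should give similar totals despite differing raw sizes.
```
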
  • Image foveation based on vector quantization

    Publication Year: 2003
    Cited by:  Papers (2)

    Summary form only given. The perceptual resolution of vision is highly space-variant: it is highest at the point of fixation and decreases rapidly away from that point. Novel unstructured and structured vector quantization (VQ) schemes are proposed to take advantage of this property of the human visual system (HVS) by providing the best image quality around the fixation point. As a foveation tech...

  • An asymptotically optimal predictor for stereo lossless audio compression

    Publication Year: 2003
    Cited by:  Papers (3)

    Summary form only given. A new generalized stereo decorrelation algorithm and optimal predictor for lossless audio compression is proposed. The proposed algorithm yields around 1.5% better compression than the previous state-of-the-art algorithms. With this approach, the stereo decorrelation and the prediction can be merged into a single step, which can lead to a global minimum for the combined pr...

  • Recurrence relations on transfer matrices yield good lower and upper bounds on the channel capacity of some 2-dimensional constrained systems

    Publication Year: 2003

    Summary form only given. Two classes of constrained systems are discussed: the generation of read/write isolated memory and two-dimensional run length limited constrained systems. The procedure for using the recurrence relations on the A_n and '1'-counting to derive recurrence inequalities on the λ_n is shown. This procedure has been found to yield good upper and lower bou...

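    The transfer-matrix machinery can be shown on the classic hard-square constraint (no two adjacent 1s, horizontally or vertically): states are the valid rows of an n-wide strip, the matrix links vertically compatible rows, and (log2 λ_n)/n upper-bounds the capacity, tightening as n grows. The recurrence-based bounds of the paper sharpen this brute-force computation.

```python
import numpy as np

def hard_square_rate(n):
    """(log2 lambda_n)/n for the n-wide hard-square strip (upper bound on capacity)."""
    rows = [r for r in range(1 << n) if r & (r >> 1) == 0]  # no adjacent 1s in a row
    T = np.array([[1 if a & b == 0 else 0 for b in rows] for a in rows])
    lam = max(abs(np.linalg.eigvals(T)))                    # Perron eigenvalue
    return float(np.log2(lam)) / n

for n in (4, 6, 8):
    print(n, round(hard_square_rate(n), 4))
# approaches the hard-square capacity (about 0.5879) from above as n grows
```
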
  • Arc-length compression

    Publication Year: 2003
    Cited by:  Papers (1)

    Summary form only given. A novel method for lossy compression of two-dimensional curves, based on arc-length parameterization, is introduced. The method has a number of advantages: it is progressive, converges uniformly, and requires a number of bits proportional to the total arc-length of the curve. The method is applied to the compression of handwritten letters and scanlines of ...

  • On the average redundancy rate of adaptive block codes under mixed sources

    Publication Year: 2003
    Cited by:  Patents (1)

    Summary form only given. The average redundancy rate of Krichevsky's sample-based universal block codes was analyzed, in a situation wherein the samples and block codes for compression were produced by two different memoryless sources. It was proven that the average redundancy rate of adaptive block codes O_{ℓ,T}, constructed using samples of length ℓ from a source T, was u...

  • Image classification using GMM with context information and with a solution of singular covariance problem

    Publication Year: 2003
    Cited by:  Papers (5)

    Summary form only given. Averaging the feature vectors of the center block being coded and its neighboring blocks is proposed as a method of incorporating context information into block classification. The algorithm has the advantage of low complexity. Gauss mixture models (GMM) are adopted to extract features from image blocks, including an algorithm to handle singular covariance matrices....

  • Author index

    Publication Year: 2003, Page(s):459 - 461
  • Binary combinatorial coding

    Publication Year: 2003
    Cited by:  Papers (2)  |  Patents (4)

    Summary form only given. A novel binary entropy code, called combinatorial coding (CC), is presented. The theoretical basis for CC has been described previously in the context of universal coding, enumerative coding, and minimum description length. The code described in these references works as follows: assume the source data are binary of length M, memoryless, and generated with an unknown pa...

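    The code sketched above is, in its classic enumerative form, a two-part description: first the number k of ones in the length-M block, then the block's rank among all C(M, k) arrangements, costing about log2(M+1) + log2 C(M, k) bits for any unknown p. A minimal version:

```python
from math import comb, log2

def cc_length_bits(M, k):
    """Two-part enumerative code length for a length-M block with k ones."""
    return log2(M + 1) + log2(comb(M, k))

def rank(bits):
    """Lexicographic index of `bits` among equal-length, equal-weight strings."""
    k, r = sum(bits), 0
    for i, b in enumerate(bits):
        if b:
            r += comb(len(bits) - i - 1, k)  # strings that put a 0 here instead
            k -= 1
    return r

print(rank([1, 0, 1]))               # -> 1: 101 follows 011 among weight-2 strings
print(round(cc_length_bits(1024, 100), 1), "bits")
```
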
  • Compression of RADARSAT data with block adaptive wavelets

    Publication Year: 2003

    Summary form only given. A new algorithm, referred to as the wavelet packet-based embedded blocking code (WPEB), was developed for synthetic aperture radar (SAR) data compression. This algorithm combines the different properties of wavelet packet decomposition, block coding, and speckle reduction. Better results were obtained by using the wavelet packet transform that decomposes the higher frequen...

  • A new visual masking tool for JPEG2000

    Publication Year: 2003

    Summary form only given. A nonuniform quantization scheme, based on the perceptual relevance of each channel signal component, has been developed to exploit HVS properties. This strategy exploits the visual masking effect through a subband decomposition. An evaluation of the quality was performed using psychophysical measures for validation. The obtained results denote...

  • An error-resilient blocksorting compression algorithm

    Publication Year: 2003

    Summary form only given. Error susceptibility of the compressed bit stream is considered a key limitation of adaptive lossless compression systems. The inherent design of these systems often requires that they discard all data subsequent to an error. This is especially problematic in the Burrows-Wheeler blocksorting transform (BWT), with 1 MB suffix-sorted blocks. Error-correcting codes, su...

  • The distributed, partial, and conditional Karhunen-Loeve transforms

    Publication Year: 2003, Page(s):283 - 292
    Cited by:  Papers (14)

    The Karhunen-Loeve transform (KLT) is a key element of many signal processing tasks, including approximation, compression, and classification. Many recent applications involve distributed signal processing where it is not generally possible to apply the KLT to the signal; the KLT must be approximated in a distributed fashion. Investigations were carried out on the distributed approximations to the...

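    For context, the (non-distributed) KLT of a zero-mean random vector X diagonalizes its covariance; in the notation below (the standard definition, not the paper's distributed variant), the transform coefficients Y are uncorrelated, and truncating to the top eigenvectors gives the best linear approximation:

```latex
\Sigma \;=\; \mathbb{E}\bigl[X X^{\mathsf T}\bigr] \;=\; U \Lambda U^{\mathsf T},
\qquad Y \;=\; U^{\mathsf T} X .
```

    The distributed problem above arises when no single terminal sees all of X, so each can transform only its own subvector.
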
  • Performance of universal codes over infinite alphabets

    Publication Year: 2003, Page(s):402 - 410
    Cited by:  Papers (15)

    It is known that universal compression of strings generated by independent and identically distributed sources over infinite alphabets entails infinite per-symbol redundancy. Alternative compression schemes, which decompose the description of such strings into a description of the symbols appearing in the string and a description of the arrangement in which they appear, were presented. Two descrip...

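    The "arrangement" component of such a decomposition is commonly formalized as the pattern of a string: each symbol is replaced by the index of its first appearance, and the identity of the symbols is described separately. A minimal computation (illustrative, not the paper's code):

```python
def pattern(s):
    """Replace each symbol by the order in which it first appeared."""
    first_seen = {}
    return [first_seen.setdefault(ch, len(first_seen) + 1) for ch in s]

print(pattern("abracadabra"))  # -> [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```
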
  • Pattern matching by means of multi-resolution compression

    Publication Year: 2003

    Summary form only given. The problem of compressed pattern matching deals with ways to find a pattern within a compressed file, without decompressing it. Techniques for solving the problem fall into two major categories: creating a unique compression scheme that enables efficient pattern matching, or using some known compression scheme and developing algorithms to search the files being produced...
