
IEEE Transactions on Information Theory

Issue 4 • Jul 1993


Displaying Results 1 - 25 of 41
  • Vector quantization with complexity costs

    Publication Year: 1993, Page(s):1133 - 1145
    Cited by:  Papers (55)  |  Patents (2)

    Vector quantization is a data compression method by which a set of data points is encoded by a reduced set of reference vectors: the codebook. A vector quantization strategy is discussed that jointly optimizes distortion errors and the codebook complexity, thereby determining the size of the codebook. A maximum entropy estimation of the cost function yields an optimal number of reference vectors, ...
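    The joint distortion-plus-complexity idea can be sketched as an entropy-constrained quantization loop. This is a generic illustration, not the paper's maximum entropy estimation; the function `ecvq` and the penalty weight `lam` are hypothetical names introduced here.

```python
import numpy as np

def ecvq(data, codebook, lam=0.01, iters=20):
    """Entropy-constrained VQ sketch: assign each point to the codevector
    minimizing squared distortion plus lam * codeword length (-log2 of the
    cell's usage probability), then re-estimate each codevector as the
    centroid of its cell."""
    cb = codebook.astype(float).copy()
    p = np.full(len(cb), 1.0 / len(cb))              # cell usage probabilities
    assign = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        sqdist = ((data[:, None, :] - cb[None, :, :]) ** 2).sum(-1)
        assign = (sqdist + lam * -np.log2(p)[None, :]).argmin(1)
        for j in range(len(cb)):
            cell = data[assign == j]
            if len(cell):
                cb[j] = cell.mean(0)                 # centroid update
        counts = np.bincount(assign, minlength=len(cb))
        p = np.maximum(counts / counts.sum(), 1e-12) # avoid log2(0)
    return cb, assign

# Illustrative run on two well-separated 1-D clusters:
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.1, (50, 1)),
                       rng.normal(10.0, 0.1, (50, 1))])
cb, assign = ecvq(data, np.array([[0.0], [1.0]]))
```

    With a larger `lam`, rarely used cells become expensive and the effective codebook shrinks, which is the sense in which the complexity cost determines codebook size.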
  • On the nonperiodic cyclic equivalence classes of Reed-Solomon codes

    Publication Year: 1993, Page(s):1431 - 1434
    Cited by:  Papers (14)

    Picking up exactly one member from each of the nonperiodic cyclic equivalence classes of an (n, k+1) Reed-Solomon code E over GF(q) gives a code, E″, which has bounded Hamming correlation values and the self-synchronizing property. The exact size of E″ is shown to be (1/n) Σ_{d|n} μ(d) q^(1+kd...)
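    The size expression rests on Möbius inversion over the divisors of n. A generic instance of the same counting pattern (not the paper's exact formula) is the number of nonperiodic cyclic equivalence classes of arbitrary q-ary strings of length n, namely (1/n) Σ_{d|n} μ(d) q^(n/d):

```python
def mobius(n):
    """Möbius function mu(n) via trial factorization."""
    res, d = 1, 2
    while d * d <= n:
        if n % d == 0:
            n //= d
            if n % d == 0:
                return 0          # squared prime factor
            res = -res
        d += 1
    if n > 1:
        res = -res                # one remaining prime factor
    return res

def aperiodic_classes(q, n):
    """Number of nonperiodic cyclic equivalence classes (Lyndon words)
    of length n over a q-letter alphabet: (1/n) * sum_{d|n} mu(d) q^(n/d)."""
    total = sum(mobius(d) * q ** (n // d)
                for d in range(1, n + 1) if n % d == 0)
    return total // n
```

    For example, there are 9 nonperiodic cyclic classes of binary strings of length 6.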
  • Common randomness in information theory and cryptography. I. Secret sharing

    Publication Year: 1993, Page(s):1121 - 1132
    Cited by:  Papers (531)  |  Patents (11)

    As the first part of a study of problems involving common randomness at distant locations, information-theoretic models of secret sharing (generating a common random key at two terminals, without letting an eavesdropper obtain information about this key) are considered. The concept of key-capacity is defined. Single-letter formulas of key-capacity are obtained for several models, and bounds to ke...
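    For the basic source-type model with public discussion and no eavesdropper side information, the key capacity is the mutual information I(X;Y) of the correlated observations. A minimal sketch of that single-letter quantity for a binary pair, assuming (as an illustration) a crossover probability of 0.1:

```python
import math

def mutual_information(pxy):
    """I(X;Y) in bits for a joint pmf given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# X is a uniform bit; Y equals X flipped with probability eps:
eps = 0.1
pxy = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
       (1, 0): 0.5 * eps,       (1, 1): 0.5 * (1 - eps)}
rate = mutual_information(pxy)   # 1 - h2(0.1), about 0.531 bits per observation
```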
  • A linear algebra approach to minimal convolutional encoders

    Publication Year: 1993, Page(s):1219 - 1233
    Cited by:  Papers (49)  |  Patents (2)

    The authors review the work of G.D. Forney, Jr., on the algebraic structure of convolutional encoders, upon which some new results regarding minimal convolutional encoders rest. An example is given of a basic convolutional encoding matrix whose number of abstract states is minimal over all equivalent encoding matrices. However, this encoding matrix cannot be realized with a minimal number of memory el...
  • Nonbinary codes correcting localized errors

    Publication Year: 1993, Page(s):1413 - 1416
    Cited by:  Papers (9)

    A recursive construction of asymptotically dense codes correcting a constant number of localized errors is examined. The codes overcome difficulties, in a particular case, with an asymptotic Hamming bound in which the number of errors increases linearly with the code length. It is shown that the method applies to both the binary and nonbinary cases.
  • Noncausal Gauss Markov random fields: parameter structure and estimation

    Publication Year: 1993, Page(s):1333 - 1355
    Cited by:  Papers (51)

    The parameter structure of noncausal homogeneous Gauss Markov random fields (GMRF) defined on finite lattices is studied. For first-order (nearest neighbor) and a special class of second-order fields, a complete characterization of the parameter space and a fast implementation of the maximum likelihood estimator of the field parameters are provided. For general higher order fields, tight bounds fo...
  • Random infinite trees and supercritical behavior of collision resolution algorithms

    Publication Year: 1993, Page(s):1460 - 1465
    Cited by:  Papers (3)

    An analytical evaluation is given of the behavior of the free-access stack algorithm when the input load, a Poisson flow of λ packets per slot, is above the maximum throughput achievable by the protocol (approximately 0.360177 packets per slot) under an infinite population model. In particular, the marginal output stream that the system sustains on the channel is analytically and quantitatively deri...
  • Further asymptotic upper bounds on the minimum distance of trellis codes

    Publication Year: 1993, Page(s):1428 - 1431

    Asymptotic upper bounds on the minimum distance of trellis codes are derived. A universal bound and bounds specific to phase-shift keying (PSK) and quadrature amplitude modulation (QAM) signal sets are obtained.
  • On the density of phase-space expansions

    Publication Year: 1993, Page(s):1152 - 1156
    Cited by:  Papers (24)

    Phase-space decompositions for signals of finite energy have been used to formulate the intuitive but elusive idea that the frequency content of a signal can vary with time as the signal evolves. The decompositions consist of subdividing the time and frequency axes into certain ranges and expanding signals stably in a two-parameter basis of fixed functions of time, in which the (k,m)...
  • A coding scheme for single peak-shift correction in (d, k)-constrained channels

    Publication Year: 1993, Page(s):1444 - 1450
    Cited by:  Papers (9)

    A two-step coding scheme for peak-shift correction in (d, k)-constrained sequences is described. The first step is based on q-ary (q = k-d+1 is a prime) block codes that allow correction of specific types of double errors caused by single peak-shifts. The second step is a simple conversion of q-ary symbols to binary strings of the type 0...
  • Asymptotic results for maximum likelihood estimation with an array of sensors

    Publication Year: 1993, Page(s):1374 - 1385
    Cited by:  Papers (4)

    In many cases, the maximum likelihood (ML) estimator is consistent and asymptotically normal with covariance equal to the inverse of the Fisher information matrix. It does not follow, though, that the covariance of the ML estimator approaches the Cramer-Rao lower bound as the sample size increases. However, it is possible to draw such a conclusion for the adaptive array problem in which directio...
  • Multistage decoding of multilevel block M-PSK modulation codes and its performance analysis

    Publication Year: 1993, Page(s):1204 - 1218
    Cited by:  Papers (26)

    Multistage decoding of multilevel block M-ary phase-shift keying (M-PSK) modulation codes for the additive white Gaussian noise (AWGN) channel is investigated. Several types of multistage decoding, including a suboptimum soft-decision decoding scheme, are devised and analyzed. Upper bounds on the probability of an incorrect decoding of a code are derived for the proposed multistage d...
  • Nonparametric maximum entropy

    Publication Year: 1993, Page(s):1409 - 1413
    Cited by:  Papers (2)

    The standard maximum entropy method developed by J.P. Burg (1967) and the resulting autoregressive model have been widely applied to spectrum estimation and prediction. A generalization of the maximum entropy formalism in a nonparametric setting is presented, and the class of the resulting solutions is identified to be a class of Markov processes. The proof is based on a string of information theo...
  • Partition function estimation of Gibbs random field images using Monte Carlo simulations

    Publication Year: 1993, Page(s):1322 - 1332
    Cited by:  Papers (13)

    A Monte Carlo simulation technique for estimating the partition function of a general Gibbs random field image is proposed. By expressing the partition function as an expectation, an importance sampling approach for estimating it using Monte Carlo simulations is developed. As expected, the resulting estimators are unbiased and consistent. Computations can be performed iteratively by using simple M...
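    The estimator expresses Z as an expectation under a proposal distribution. A minimal sketch of the idea on a toy Gibbs field — a 1-D Ising ring rather than an image, with a uniform proposal; the coupling and the helper names are illustrative assumptions, not the paper's sampler:

```python
import itertools, math, random

def energy(cfg, J=1.0):
    """Energy of a 1-D ring of +/-1 spins with nearest-neighbor coupling J."""
    n = len(cfg)
    return -J * sum(cfg[i] * cfg[(i + 1) % n] for i in range(n))

def exact_Z(n, beta):
    """Brute-force partition function over all 2^n configurations."""
    return sum(math.exp(-beta * energy(c))
               for c in itertools.product((-1, 1), repeat=n))

def mc_Z(n, beta, samples=50000, seed=0):
    """Importance-sampling estimate: with a uniform proposal over the 2^n
    configurations, Z = 2^n * E_uniform[exp(-beta * E(cfg))]."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        cfg = [rng.choice((-1, 1)) for _ in range(n)]
        acc += math.exp(-beta * energy(cfg))
    return (2 ** n) * acc / samples

z_exact = exact_Z(6, 0.5)
z_mc = mc_Z(6, 0.5)
```

    The estimator is unbiased by construction; its variance depends on how well the proposal matches the Gibbs measure, which is why the paper's importance sampling design matters for realistic fields.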
  • On the design of an optimal quantizer

    Publication Year: 1993, Page(s):1180 - 1194
    Cited by:  Papers (11)  |  Patents (21)

    The problem of designing an optimal quantizer with a fixed number of levels for a wide class of error weighting functions and an arbitrary distribution function is discussed. The existence of an optimal quantizer is proved, and a two-stage algorithm for its design is suggested. In this algorithm, at the first stage, Lloyd's iterative Method I is applied for reducing the region where, at the second...
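    Lloyd's Method I referenced above alternates two optimality conditions: nearest-level assignment and conditional-mean (centroid) update. A minimal sketch for a scalar MSE quantizer — a standard rendition, not the paper's two-stage algorithm:

```python
import numpy as np

def lloyd_max(samples, levels, iters=50):
    """Lloyd's Method I for a scalar MSE quantizer: alternate
    (1) nearest-level assignment and (2) centroid update per cell."""
    q = np.linspace(samples.min(), samples.max(), levels)
    for _ in range(iters):
        idx = np.abs(samples[:, None] - q[None, :]).argmin(1)
        for j in range(levels):
            cell = samples[idx == j]
            if len(cell):
                q[j] = cell.mean()       # conditional mean of the cell
    return np.sort(q)

# For a uniform source on [0, 1], the optimal 2-level quantizer
# places its levels near 0.25 and 0.75:
rng = np.random.default_rng(1)
q = lloyd_max(rng.uniform(0.0, 1.0, 20000), 2)
```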
  • Distributed estimation and quantization

    Publication Year: 1993, Page(s):1456 - 1459
    Cited by:  Papers (119)

    An algorithm is developed for the design of a nonlinear, n-sensor, distributed estimation system subject to communication and computation constraints. The algorithm uses only bivariate probability distributions and yields locally optimal estimators that satisfy the required system constraints. It is shown that the algorithm is a generalization of the classical Lloyd-Max results.
  • A relation between Levenshtein-type distances and insertion-and-deletion correcting capabilities of codes

    Publication Year: 1993, Page(s):1424 - 1427
    Cited by:  Papers (12)

    A code is a collection of words or strings, not necessarily all of the same length, over some fixed alphabet. A relation is established between the insertion-and-deletion correcting capability of a code and its minimum distance for suitable Levenshtein-type distance measures.
  • Kullback-Leibler information measure for studying convergence rates of densities and distributions

    Publication Year: 1993, Page(s):1401 - 1404
    Cited by:  Papers (10)

    The Kullback-Leibler (KL) information measure I(f1:f2) is proposed as an index for studying rates of convergence of densities and distribution functions. To this end, upper bounds in terms of I(f1:f2) for several distance functions for densities and for distribution functions are obtained. Many illus...
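    The best-known bound of this kind is Pinsker's inequality, which dominates the total variation distance by the KL measure, TV(f1, f2) ≤ sqrt(I(f1:f2)/2). A small sketch for finite pmfs:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler information I(p:q) in nats for finite pmfs."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance between two finite pmfs."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Pinsker's inequality: TV(p, q) <= sqrt(I(p:q) / 2)
p, q = [0.5, 0.5], [0.8, 0.2]
tv, kl = total_variation(p, q), kl_divergence(p, q)
```

    So if I(p:q) → 0 along a sequence of densities, convergence in total variation (and hence of the distribution functions) follows, which is the mechanism behind the convergence-rate results.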
  • Sliding block codes between constrained systems

    Publication Year: 1993, Page(s):1303 - 1309
    Cited by:  Papers (1)

    The construction of finite-state codes between constrained systems called sofic systems, introduced by R. Karabed and B. Marcus (1988), is continued. It is shown that if Σ is a shift of finite type and S is a sofic system with k/n = h(S)/h(Σ), where h denotes entropy, there is a noncatastrophic finite-state invertible code fro...
  • Strong universal consistency of neural network classifiers

    Publication Year: 1993, Page(s):1146 - 1151
    Cited by:  Papers (27)

    In statistical pattern recognition, a classifier is called universally consistent if its error probability converges to the Bayes risk as the size of the training data grows, for all possible distributions of the random variable pair of the observation vector and its class. It is proven that if a one-layer neural network with a properly chosen number of nodes is trained to minimize the empirical ri...
  • Maximum likelihood decoding of the Leech lattice

    Publication Year: 1993, Page(s):1435 - 1444
    Cited by:  Papers (25)  |  Patents (1)

    An algorithm for maximum likelihood decoding of the Leech lattice is presented. The algorithm involves projecting the points of the Leech lattice directly onto the codewords of the (6,3,4) quaternary code, the hexacode. Projection on the hexacode induces a partition of the Leech lattice into four cosets of a certain sublattice Q24. Such a partition into cosets enables maximum likelihood decoding of ...
  • Multilevel codes for unequal error protection

    Publication Year: 1993, Page(s):1234 - 1248
    Cited by:  Papers (121)  |  Patents (18)

    Two combined unequal error protection (UEP) coding and modulation schemes are proposed. The first method multiplexes different coded signal constellations, with each coded constellation providing a different level of error protection. In this method, a codeword specifies the multiplexing rule and the choice of the codeword from a fixed codebook is used to convey additional important information. T...
  • The maximal error capacity of arbitrarily varying channels for constant list sizes

    Publication Year: 1993, Page(s):1416 - 1417
    Cited by:  Papers (7)

    The capacity of an arbitrarily varying channel (AVC) for list codes of arbitrarily small list rate under the maximal error criterion has previously been determined. The authors prove that for an AVC A, any rate R below the list code capacity C1(A) is achievable with a particular list size dependent on A, R and Y, where ...
  • Markov modeling for Bayesian restoration of two-dimensional layered structures

    Publication Year: 1993, Page(s):1356 - 1373
    Cited by:  Papers (12)

    Bayesian estimation of two-dimensional stratified structures is described. The major point addressed is the derivation of a statistical prior model that adequately describes such layered media. This problem is of interest in applications in which the data are generally processed in one dimension only. In order to take local interactions into account, a Markovian description is used. The model is d...
  • Wavelet transforms associated with finite cyclic groups

    Publication Year: 1993, Page(s):1157 - 1166
    Cited by:  Papers (42)  |  Patents (1)

    Multiresolution analysis via decomposition on wavelet bases has emerged as an important tool in the analysis of signals and images when these objects are viewed as sequences of complex or real numbers. An important class of multiresolution decompositions are the Laplacian pyramid schemes, in which the resolution is successively halved by recursively low-pass filtering the signal under analysis and...
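    The successive-halving step of such a pyramid can be illustrated with one level of an (unnormalized) Haar decomposition — a standard textbook example, not the finite-cyclic-group construction of the paper:

```python
def haar_step(x):
    """One level of a Haar decomposition: pairwise averages (low-pass,
    half resolution) and pairwise differences (detail). len(x) must be even."""
    lo = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    hi = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return lo, hi

def haar_inverse(lo, hi):
    """Perfect reconstruction from the average/difference pair."""
    out = []
    for a, d in zip(lo, hi):
        out += [a + d, a - d]
    return out

x = [1, 2, 3, 4, 5, 6, 7, 8]
lo, hi = haar_step(x)   # lo is the half-resolution signal, hi the detail
```

    Recursing on `lo` halves the resolution again at each level, which is the pyramid structure the abstract refers to.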

Aims & Scope

IEEE Transactions on Information Theory publishes papers concerned with the transmission, processing, and utilization of information.


Meet Our Editors

Editor-in-Chief
Prakash Narayan 

Department of Electrical and Computer Engineering