
IEEE Transactions on Information Theory

Issue 4 • July 1991


Displaying Results 1 - 25 of 37
  • Multivariate probability density deconvolution for stationary random processes

    Publication Year: 1991, Page(s):1105 - 1115
    Cited by:  Papers (42)

    The kernel-type estimation of the joint probability density functions of stationary random processes from noisy observations is considered. Precise asymptotic expressions and bounds on the mean-square estimation error are established, along with rates of mean-square convergence, for processes satisfying a variety of mixing conditions. The dependence of the convergence rates on the joint density of...

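As background for this entry, the plain kernel density estimator that deconvolution methods build on can be sketched in a few lines. This is a generic illustration, not the paper's estimator; the Gaussian kernel, the bandwidth `h`, and the sample values are assumptions made for the example:

```python
import math

def gaussian_kernel_density(samples, x, h):
    """Plain Gaussian kernel density estimate at point x with bandwidth h.

    Illustrative only: the paper above analyzes a deconvolution variant of
    this estimator for noisy observations of stationary processes."""
    norm = len(samples) * h * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / norm

# With a single sample at 0 and h = 1, the estimate at 0 is the standard
# normal peak value 1/sqrt(2*pi), about 0.3989.
print(round(gaussian_kernel_density([0.0], 0.0, 1.0), 4))
```
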
  • New bounds on the redundancy of Huffman codes

    Publication Year: 1991, Page(s):1095 - 1104
    Cited by:  Papers (27)

    Upper and lower bounds are obtained for the redundancy of binary Huffman codes for a memoryless source whose least likely source letter probability is known. Tight upper bounds on redundancy in terms of the most and least likely source letter probabilities are provided.

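The redundancy being bounded here, average Huffman codeword length minus source entropy, is straightforward to compute for any concrete distribution. A minimal sketch (the distribution is a made-up example, not one from the paper):

```python
import heapq
import math

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, unique tiebreak, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1  # merging two subtrees deepens every leaf in both
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

def redundancy(probs):
    """Average Huffman codeword length minus source entropy (bits/symbol)."""
    L = huffman_lengths(probs)
    avg = sum(p * l for p, l in zip(probs, L))
    H = -sum(p * math.log2(p) for p in probs if p > 0)
    return avg - H

print(round(redundancy([0.5, 0.25, 0.15, 0.1]), 4))  # prints 0.0073
```

For this example the Huffman code has average length 1.75 bits while the entropy is about 1.7427 bits, so the redundancy is small but nonzero.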
  • The zero-frequency problem: estimating the probabilities of novel events in adaptive text compression

    Publication Year: 1991, Page(s):1085 - 1094
    Cited by:  Papers (196)  |  Patents (7)

    Approaches to the zero-frequency problem in adaptive text compression are discussed. This problem concerns estimating the probability that a novel event will occur. Although several methods have been used, their suitability has rested on empirical evaluation rather than on a well-founded model. The authors propose the application of a Poisson process model of novelty. Its ability to predict novel ...

  • Theory of lattice-based fine-coarse vector quantization

    Publication Year: 1991, Page(s):1072 - 1084
    Cited by:  Papers (7)

    The performance of a lattice-based fast vector quantization (VQ) method, which yields rate-distortion performance close to that of an optimal VQ, is analyzed. The method, which is a special case of fine-coarse vector quantization (FCVQ), uses the cascade of a fine lattice quantizer and a coarse optimal VQ to encode a given source vector. The second stage is implemented in the form of a lookup table, whi...

  • Weight enumerators of self-dual codes

    Publication Year: 1991, Page(s):1222 - 1225
    Cited by:  Papers (46)

    Some construction techniques for self-dual codes are investigated, and the authors construct a singly-even self-dual [48,24,10]-code with a weight enumerator that was not known to be attainable. It is shown that there exists a singly-even self-dual code C' of length n = 48 and minimum weight d = 10 whose weight enumerator is prescribed in the work of J.H. Conway et al. (see...

  • Zero-crossings of a wavelet transform

    Publication Year: 1991, Page(s):1019 - 1033
    Cited by:  Papers (384)  |  Patents (11)

    The completeness, stability, and application to pattern recognition of a multiscale representation based on zero-crossings are discussed. An alternative projection algorithm is described that reconstructs a signal from a zero-crossing representation, which is stabilized by keeping the value of the wavelet transform integral between each pair of consecutive zero-crossings. The reconstruction algorit...

  • On (k, t)-subnormal covering codes

    Publication Year: 1991, Page(s):1203 - 1206
    Cited by:  Papers (11)

    The concept of a (k, t)-subnormal covering code is defined, and it is discussed how an amalgamated-direct-sum-like construction can be used to combine such codes. The existence of optimal (q, n, M)1 codes C is discussed such that puncturing the first coordinate of C yields a code with (q, 1)-subnorm 2.

  • Limiting efficiencies of burst-correcting array codes

    Publication Year: 1991, Page(s):976 - 982
    Cited by:  Papers (2)

    The author evaluates the limiting efficiencies e(-s) of burst-correcting array codes A(n1, n2, -s) for all negative readouts -s as n2 tends to infinity and n1 is properly chosen to maximize the efficiency. Specializing the result to the products of the first i prim...

  • Reusable memories in the light of the old arbitrarily varying and a new outputwise varying channel theory

    Publication Year: 1991, Page(s):1143 - 1150
    Cited by:  Papers (5)

    Arbitrarily varying channels have been introduced as a model for transmission in cases of jamming. It is shown that this theory applies naturally to memories and yields, in a unified way, some new and old capacity theorems for write-unidirectional memories with side information. The role of cycles via outputwise varying channels is discussed. Exact conditions for memories to have positive capacity...

  • A Bayesian approach for classification of Markov sources

    Publication Year: 1991, Page(s):1067 - 1071
    Cited by:  Papers (4)

    A Bayesian approach for classification of Markov sources whose parameters are not explicitly known is developed and studied. A universal classifier is derived and shown to achieve, within a constant factor, the minimum error probability in a Bayesian sense. The proposed classifier is based on sequential estimation of the parameters of the sources, and it is closely related to earlier proposed univ...

  • Explicit formulas for self-complementary normal bases in certain finite fields

    Publication Year: 1991, Page(s):1220 - 1222
    Cited by:  Papers (2)

    Explicit formulas are given for sets of p elements forming a self-complementary normal basis of GF(q^p) over GF(q), where p is the characteristic of GF(q). Using these formulas, a straightforward construction of self-complementary bases for GF(q^α) (where α = p^m) over GF(q) is also pre...

  • Optimally near-far resistant multiuser detection in differentially coherent synchronous channels

    Publication Year: 1991, Page(s):1006 - 1018
    Cited by:  Papers (40)  |  Patents (4)

    The noncoherent demodulation of differentially phase-shift keyed signals transmitted simultaneously via a synchronous code-division multiple-access (CDMA) channel is studied under the assumption of white Gaussian background noise. A class of noncoherent linear detectors is defined with the objective of obtaining the optimal one. The performance criterion considered is near-far resistance that deno...

  • Decoding binary 2-D cyclic codes by the 2-D Berlekamp-Massey algorithm

    Publication Year: 1991, Page(s):1200 - 1203
    Cited by:  Papers (17)

    A method of decoding two-dimensional (2-D) cyclic codes by applying the 2-D Berlekamp-Massey algorithm is proposed. To explain this decoding method, the author introduces a subclass of 2-D cyclic codes, which are called 2-D BCH codes due to their similarity with BCH codes. It is shown that there are some short 2-D cyclic codes with a better cost parameter value. The merit of the approach is verifi...

  • On multilevel block modulation codes

    Publication Year: 1991, Page(s):965 - 975
    Cited by:  Papers (50)  |  Patents (5)

    The multilevel technique for combining block coding and modulation is investigated. A general formulation is presented for multilevel modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing multilevel block modulation codes with interdependency among component codes is proposed. Given a multilevel block modulation code C with no ...

  • Two-dimensional harmonic retrieval and its time-domain analysis technique

    Publication Year: 1991, Page(s):1185 - 1188
    Cited by:  Papers (10)

    Two-dimensional harmonic retrieval is examined theoretically by confirming that 2-D sinusoids in white noise can be modeled as a special 2-D autoregressive moving average (ARMA) process whose AR parameters are identical to the MA ones. A new analysis technique for resolving 2-D sinusoids in white noise is proposed.

  • Modified Q-functions and their use in detection analysis

    Publication Year: 1991, Page(s):1123 - 1142
    Cited by:  Papers (4)

    Marcum's Q-function is extended to complex arguments. In particular, a set of modified generalized Q-functions for imaginary values of one or both arguments is defined and their properties are investigated. An extended probabilistic interpretation of the modified Q-functions is given by appealing to a proposed generalization of the concept of a random variable. Complex ...

  • Internal models and recursive estimation for 2-D isotropic random fields

    Publication Year: 1991, Page(s):1055 - 1066
    Cited by:  Papers (4)

    Efficient recursive smoothing algorithms are developed for isotropic random fields that can be obtained by passing white noise through rational filters. The estimation problem is shown to be equivalent to a countably infinite set of 1-D separable two-point boundary value smoothing problems. The 1-D smoothing problems are solved using a Markovianization approach followed by a standard 1-D smoothing...

  • A systolic Reed-Solomon encoder

    Publication Year: 1991, Page(s):1217 - 1220
    Cited by:  Papers (13)  |  Patents (8)

    An architecture for a Reed-Solomon (RS) encoder is presented, consisting of r+1 systolic cells, where r is the redundancy of the code. The systolic encoder is systematic, does not contain any feedback or other global signals, its systolic cells are of low complexity, and it is easily reconfigurable for variable redundancy and changes in the choice of generator polynomial of the c...

  • Worst-case interactive communication. II. Two messages are not optimal

    Publication Year: 1991, Page(s):995 - 1005
    Cited by:  Papers (27)  |  Patents (1)

    For pt.I see ibid., vol.36, no.5, p.1111-26 (1990). The author defines the chromatic-decomposition number of a hypergraph and shows that, under general conditions, it determines the two-message complexity. This result is then used to prove that two messages are not optimal. Protocols, complexities, and the characteristic hypergraph of (X,Y) are defined. The playoffs problem is describe...

  • Reduced lists of error patterns for maximum likelihood soft decoding

    Publication Year: 1991, Page(s):1194 - 1200
    Cited by:  Papers (28)

    A method whereby a substantially reduced family of error patterns, called survivors, may be created for maximum likelihood soft decoding is introduced. The family of survivors depends on the received word. A decoder based on this approach first forms the survivors, then scores them. Rather than obtaining the survivors by online elimination of error patterns, the use of predetermined lists that repr...

  • Note on 'The calculation of the probability of detection and the generalized Marcum Q-function'

    Publication Year: 1991
    Cited by:  Papers (5)

    The author presents corrections to his original paper (see ibid., vol.35, no.2, p.389-400, 1989). The corrections concern computational cases using the steepest descent integration technique. It is pointed out that, for certain specific parameter ranges, the calculation error is too large to be accounted for by accumulated round-off error.

  • Strengthening Simmons' bound on impersonation

    Publication Year: 1991, Page(s):1182 - 1185
    Cited by:  Papers (11)

    Simmons' lower bound on impersonation, P_I ⩾ 2^(-I(M;E)), where M and E denote the message and the encoding rule, respectively, is strengthened by maximizing over the source statistics and by allowing dependence between the message and the encoding rule. The authors show that a refinement of their argument, which removes the assumption of independence ...

  • Quickest detection of a time-varying change in distribution

    Publication Year: 1991, Page(s):1116 - 1122
    Cited by:  Papers (14)

    A practical algorithm for quickest detection of time-varying arbitrary one-parameter changes in a sequence of independent random variables is developed. The amplitude of the parameter need not be known. This model can be applied to the problem of coherent detection of sampled sinusoidal signals of known frequency but unknown phase and amplitude. The tests are designed according to a maximum al...

  • Bounds on the redundancy of binary alphabetical codes

    Publication Year: 1991, Page(s):1225 - 1229
    Cited by:  Papers (17)

    An alphabetical code is a code in which the numerical binary order of the codewords corresponds to the alphabetical order of the encoded symbols. A necessary and sufficient condition for the existence of a binary alphabetical code is presented. The redundancy of the optimum binary alphabetical code is given in comparison with the Huffman code and its upper bound, which is tighter than bounds previ...

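The defining property stated in this abstract, that codewords listed in the symbols' alphabetical order are themselves in increasing numerical binary order, can be checked mechanically. A minimal sketch (the function name and example codes are illustrative, not from the paper):

```python
def is_alphabetical(codewords):
    """Check that binary codewords, listed in their symbols' alphabetical
    order, are in increasing lexicographic order.  For a prefix-free code
    this coincides with numerical binary order of the codewords."""
    return all(a < b for a, b in zip(codewords, codewords[1:]))

print(is_alphabetical(["00", "01", "10", "11"]))  # ordered: True
print(is_alphabetical(["0", "11", "10"]))         # "11" > "10": False
```
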
  • Minimum complexity density estimation

    Publication Year: 1991, Page(s):1034 - 1054
    Cited by:  Papers (95)

    The authors introduce an index of resolvability that is proved to bound the rate of convergence of minimum complexity density estimators as well as the information-theoretic redundancy of the corresponding total description length. The results on the index of resolvability demonstrate the statistical effectiveness of the minimum description-length principle as a method of inference. The minimum co...


Aims & Scope

IEEE Transactions on Information Theory publishes papers concerned with the transmission, processing, and utilization of information.


Meet Our Editors

Editor-in-Chief
Frank R. Kschischang

Department of Electrical and Computer Engineering