
IEE Proceedings - Vision, Image and Signal Processing

Issue 2 • Apr 1995


Contents: 7 papers
  • Peano scanning based classified vector quantiser

    Publication Year: 1995, Page(s): 111-119
    Cited by:  Papers (1)

    The authors present a classified vector quantiser (CVQ) based on Peano scanning. Peano scanning, which is used to reduce the dimensionality of the data, provides a one-dimensional algorithm for classifying an image block. The class of a block is determined from its Peano scanning value via a lookup table (LUT) of representative Peano scanning values and their associated classes. The Peano scanning algorithm is easily implemented in hardware, and with a binary search over the sorted LUT the class is found in time logarithmic in the number of LUT entries, so the class lookup can be performed in real time. An effective algorithm generates the codebooks of all classes simultaneously and systematically by growing a greedy tree for each class in an interconnected way. Monochromatic images encoded in the range of 0.625-0.813 bits/pixel, with a 16-dimensional vector size, are shown to preserve edge integrity and quality, as determined by subjective and objective measures.
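    The LUT classification step described in the abstract can be sketched directly: with the representative Peano scanning values kept sorted, a binary search locates the nearest entry in O(log n) time. The values and class labels below are hypothetical stand-ins for the trained LUT; only the search structure follows the abstract.

```python
from bisect import bisect_left

# Hypothetical sorted LUT of representative Peano scanning values and their
# classes; the real entries come from the paper's training stage.
LUT_VALUES = [0.10, 0.35, 0.52, 0.71, 0.90]
LUT_CLASSES = ["shade", "edge0", "edge45", "edge90", "texture"]

def classify(scan_value):
    """Return the class of the nearest representative Peano scanning value,
    using binary search (logarithmic in the number of LUT entries)."""
    i = bisect_left(LUT_VALUES, scan_value)
    if i == 0:
        return LUT_CLASSES[0]
    if i == len(LUT_VALUES):
        return LUT_CLASSES[-1]
    # Pick the closer of the two neighbouring representatives.
    if scan_value - LUT_VALUES[i - 1] <= LUT_VALUES[i] - scan_value:
        return LUT_CLASSES[i - 1]
    return LUT_CLASSES[i]
```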

  • Mode filtering to reduce ultrasound speckle for feature extraction

    Publication Year: 1995, Page(s): 87-94
    Cited by:  Papers (5)  |  Patents (4)

    The authors investigate the use of filtering techniques to reduce speckle in ultrasound images, to improve their suitability for later feature extraction. The maximum likelihood estimator for a speckle-corrupted image is shown to correspond to the statistical mode, but the mode is difficult to determine for small populations such as those contained within a filter mask. The truncated median filter approximates the mode by using the ordering of known image statistics and provides a fully automated image processing technique for speckle filtering. The filter's performance is established using a new quantitative evaluation scheme that closely considers the effect of filtering on edges, a key factor when applying feature extraction in automated image interpretation. Application to in vivo and phantom test images shows that the truncated median filter provides clear images with strong edges, of quality exceeding that of other techniques. These benefits are confirmed by the application of feature extraction in arterial wall labelling.
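    As a rough illustration of the mode-approximation idea, a truncated median over a small sample set (such as a filter mask) can be computed by repeatedly cutting off the tail that lies further from the median until the remaining distribution is symmetric. This sketch assumes the standard truncated-median recipe and is not taken verbatim from the paper.

```python
from statistics import median

def truncated_median(samples, iterations=3):
    """Approximate the mode of a small population by the truncated-median
    method: truncate the tail further from the median, then re-median."""
    vals = sorted(samples)
    for _ in range(iterations):
        if len(vals) < 3:
            break
        m = median(vals)
        lo, hi = vals[0], vals[-1]
        if hi - m > m - lo:            # long upper tail: cut it off
            limit = m + (m - lo)
            vals = [v for v in vals if v <= limit]
        elif m - lo > hi - m:          # long lower tail: cut it off
            limit = m - (hi - m)
            vals = [v for v in vals if v >= limit]
        else:
            break                      # distribution already symmetric
    return median(vals)
```

With a skewed mask such as [1, 2, 2, 2, 3, 9, 10], the plain median is 2 but the bright outliers 9 and 10 are discarded before the final median, pulling the estimate toward the mode.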

  • Fixed-point error analysis of radix-4 FHT algorithm with optimised scaling schemes

    Publication Year: 1995, Page(s): 65-70
    Cited by:  Papers (2)

    The fixed-point error performance of the various fast Hartley transform (FHT) algorithms has been investigated, and scaling schemes have been proposed for each of them. Owing to their better error performance, only the decimation-in-time (DIT) FHT algorithms are examined in detail. The fixed-point error analysis of the radix-4 DIT algorithm is discussed first and is shown to agree closely with simulation results. These results are then compared with simulation results for the radix-2 and split-radix algorithms. The scaling schemes are then optimised and the simulation results of the three algorithms are compared. It is concluded that the radix-4 DIT algorithm has the best error performance.
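    The kind of fixed-point experiment described can be mimicked in miniature: a direct discrete Hartley transform, and the same transform with quantised data and quantised cas-kernel values plus a simple 1/N input scaling as an overflow guard. This is an illustrative stand-in only, not the paper's radix-4 butterfly analysis or its optimised per-stage scaling schemes.

```python
import math

def dht(x):
    """Direct discrete Hartley transform:
    H[k] = sum_n x[n] * cas(2*pi*n*k/N), where cas(t) = cos(t) + sin(t)."""
    n = len(x)
    return [sum(x[m] * (math.cos(2 * math.pi * m * k / n) +
                        math.sin(2 * math.pi * m * k / n))
                for m in range(n)) for k in range(n)]

def quantize(v, bits=8):
    """Round to a fixed-point grid with 'bits' fractional bits."""
    step = 2.0 ** -bits
    return round(v / step) * step

def dht_fixed(x, bits=8):
    """Same transform with quantised data and kernel values, plus a 1/N
    input scaling to keep the accumulator in range (a crude stand-in for
    per-stage scaling)."""
    n = len(x)
    xq = [quantize(v / n, bits) for v in x]
    out = []
    for k in range(n):
        acc = 0.0
        for m in range(n):
            c = quantize(math.cos(2 * math.pi * m * k / n) +
                         math.sin(2 * math.pi * m * k / n), bits)
            acc += xq[m] * c
        out.append(acc)
    return out
```

Comparing `n * dht_fixed(x)` against `dht(x)` over random inputs gives a simple measure of the quantisation error that the scaling scheme trades against overflow headroom.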

  • Generalised scheme for optimal learning in recurrent neural networks

    Publication Year: 1995, Page(s): 71-77
    Cited by:  Papers (2)

    A new learning scheme is proposed for neural network architectures such as the Hopfield network and the bidirectional associative memory. This scheme, which replaces the commonly used learning rules, follows from a proof that learning in these connectivity architectures is equivalent to learning in the two-state perceptron; consequently, optimal learning algorithms for the perceptron can be applied directly to learning in these architectures. Similar results are established for learning in the multistate perceptron, leading to an optimal learning algorithm. Experimental results show the superiority of the proposed method.
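    The equivalence stated in the abstract can be sketched as follows: each row of the Hopfield weight matrix is trained as an independent two-state perceptron whose inputs are the other units' states and whose target is that unit's state in each stored pattern. The classic perceptron rule is used below for concreteness; an optimal perceptron algorithm could be substituted, which is the abstract's point.

```python
# Hypothetical sketch: train a Hopfield weight matrix row-by-row as
# two-state perceptrons over the patterns to be stored (states in {-1, +1}).

def sign(x):
    return 1 if x >= 0 else -1

def train_hopfield_perceptron(patterns, epochs=50, lr=1.0):
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for _ in range(epochs):
        for p in patterns:
            for i in range(n):
                # Perceptron for unit i: inputs are all other units.
                h = sum(W[i][j] * p[j] for j in range(n) if j != i)
                if sign(h) != p[i]:        # misclassified -> update row i
                    for j in range(n):
                        if j != i:
                            W[i][j] += lr * p[i] * p[j]
    return W

def recall(W, state, steps=5):
    """Synchronous recall: stored patterns should be fixed points."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [sign(sum(W[i][j] * s[j] for j in range(n) if j != i))
             for i in range(n)]
    return s
```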

  • AR modelling of skewed signals using third-order cumulants

    Publication Year: 1995, Page(s): 78-86
    Cited by:  Papers (1)

    Algorithms are presented for selecting the order and estimating the parameters of an AR process, driven by noise with an underlying non-Gaussian distribution, from an observed noisy time series. The order selection algorithm makes use of the growing memory covariance predictive least-squares (GMCPLS) criterion together with diagonal slices of the third-order cumulant plane, while a triangular region of the third-order cumulant plane is used to estimate the model parameters. Extensive simulation results are presented and, based on the observed trends (one of which has been verified using real data from a rotating machine), recommendations are made on the efficacy of the methods for the AR order selection and parameter estimation problems.
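    The diagonal slice of the third-order cumulant plane that the order selection relies on has a simple sample estimate for a stationary series: c3(tau, tau) = E[x(n) x(n+tau)^2] after mean removal. The sketch below shows only that estimate, not the GMCPLS criterion itself.

```python
def third_order_cumulant_diag(x, max_lag):
    """Sample estimate of the diagonal slice c3(tau, tau) =
    E[x(n) * x(n + tau)^2] of the third-order cumulant of a series
    (mean-removed). Third-order cumulants vanish for symmetric
    distributions, which is what makes them useful for skewed signals."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    out = []
    for tau in range(max_lag + 1):
        s = sum(x[i] * x[i + tau] ** 2 for i in range(n - tau))
        out.append(s / n)
    return out
```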

  • Pulsed residual excited linear prediction

    Publication Year: 1995, Page(s): 105-110
    Cited by:  Papers (2)

    Linear predictive coding of speech has been widely used from 16 kb/s, in the form of adaptive predictive coding (APC), down to 4.8 kb/s, in the form of code-excited linear prediction (CELP). Since its invention in 1984 there have been many variations of CELP, differing mainly in the way the final excitation signal (codebook) is produced and quantised; these variations either improve speech quality or lower complexity. Three new excitation types, all based on a pulsed residual, are proposed and are shown to improve speech quality significantly. In addition, a novel, mathematically equivalent codebook search method that significantly reduces search complexity is described.
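    For context, the core of any CELP-style codebook search is selecting the codevector that maximises the standard match score <t, c>^2 / <c, c> against a target vector t, with gain <t, c> / <c, c>. The brute-force version is sketched below; the paper's mathematically equivalent fast search, whose details the abstract does not give, would produce the same selection at lower cost.

```python
def best_codevector(target, codebook):
    """Exhaustive CELP-style search: pick the codevector maximising
    <t,c>^2 / <c,c>, and return its index and optimal gain <t,c>/<c,c>."""
    best_i, best_score, best_gain = 0, float("-inf"), 0.0
    for i, c in enumerate(codebook):
        cc = sum(v * v for v in c)          # codevector energy
        if cc == 0:
            continue
        tc = sum(t * v for t, v in zip(target, c))  # correlation with target
        score = tc * tc / cc
        if score > best_score:
            best_i, best_score, best_gain = i, score, tc / cc
    return best_i, best_gain
```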

  • Fourier descriptor based technique for reconstructing 3D contours from stereo images

    Publication Year: 1995, Page(s): 95-104
    Cited by:  Papers (2)

    A Fourier descriptor (FD) based hierarchy for stereo correspondence was suggested by Wu and Sheu (see Proceedings of the IPPR Conference on Computer Vision, Graphics and Image Processing, pp. 263-270, 1993). However, that work left unresolved the offset in starting points and the correspondence between the constrained points on the matched contours. An iterative algorithm is proposed for computing the 3D FDs from two sets of 2D FDs associated with corresponding contours. For precise reconstruction, both the starting-point offset and the constrained point correspondence problems are also solved. Experiments with a cylindrical object and a bottle gourd show that the iterative procedure is both fast in 3D contour reconstruction and indispensable for precise representation when the sizes of the point sets associated with the two corresponding contours deviate significantly.
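    As background for the starting-point offset problem, the 2D Fourier descriptors of a closed contour are the DFT coefficients of the contour sampled as complex points z = x + iy; shifting the starting point forward by s samples multiplies coefficient k by the unit-modulus phase factor exp(2j*pi*k*s/N), which is exactly the offset a reconstruction must resolve. A minimal stdlib sketch:

```python
import cmath

def fourier_descriptors(contour):
    """2D Fourier descriptors: normalised DFT of a closed contour sampled
    as complex points z = x + iy. Shifting the starting point forward by
    s samples multiplies coefficient k by exp(2j*pi*k*s/N)."""
    z = [complex(x, y) for x, y in contour]
    n = len(z)
    return [sum(z[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                for m in range(n)) / n
            for k in range(n)]
```

For four points on the unit circle, all energy lands in coefficient k = 1, and rotating the starting point changes only that coefficient's phase.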
