
Proceedings of the 7th Nordic Signal Processing Symposium (NORSIG 2006)

Date: 7-9 June 2006


Displaying results 1-25 of 89
  • Harmonic All-Pole Modelling for Glottal Inverse Filtering

    Publication Year: 2006, Page(s): 182-185

    Glottal inverse filtering is a process in which the acoustic effect of the human vocal tract is removed from the speech signal to obtain the flow through the vocal folds, the glottal flow. In this article we present a fully automatic algorithm for inverse filtering, harmonic all-pole inverse filtering, which employs the harmonic structure of speech to obtain the impulse responses of both the vocal tract and the glottal flow. The method assumes that the vocal tract and the glottal flow waveform can be modelled with minimum- and maximum-phase AR models, respectively, and that each glottal cycle has a distinct excitation. We present results for male and female speakers for normal, breathy and pressed voices.
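
    As an illustration of the core operation (inverse filtering speech through an estimated all-pole model), here is a minimal sketch using ordinary autocorrelation LPC rather than the authors' harmonic estimator; the frame, sampling rate and model order are illustrative assumptions:

        import numpy as np
        from scipy.linalg import solve_toeplitz
        from scipy.signal import lfilter

        def lpc(frame, p):
            """All-pole (AR) coefficients by the autocorrelation method."""
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
            a = solve_toeplitz(r[:p], -r[1:p + 1])   # Toeplitz normal equations
            return np.concatenate(([1.0], a))        # A(z) = 1 + a1 z^-1 + ... + ap z^-p

        # Illustrative 8 kHz "voiced" frame (stand-in for real speech)
        fs = 8000
        t = np.arange(400) / fs
        frame = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)

        A = lpc(frame * np.hamming(len(frame)), p=10)
        residual = lfilter(A, [1.0], frame)   # inverse filtering: approximates the excitation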

  • Log Likelihood Ratio Based Annotation Verification of a Norwegian Speech Synthesis Database

    Publication Year: 2006, Page(s): 186-189
    Cited by: Papers (1)

    Accurate labeling and segmentation of the unit inventory database is of vital importance to the quality of unit selection text-to-speech synthesis. Misalignments and mismatches between the predicted and pronounced unit sequences require manual correction to achieve natural-sounding synthesis. In this paper we use log likelihood ratio based utterance verification to automatically detect annotation errors in a Norwegian two-speaker synthesis database. Each sentence is assigned a confidence score, and those falling below a threshold can be discarded or manually inspected and corrected. Using the equal reject number as a criterion, the transcription sentence error rate was reduced from 9.8% to 2.7%. Insertions are the largest error category, and 95.6% of these were detected. A closer inspection of false rejections was performed to assess (and improve) the phoneme prediction system.
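
    A schematic of the confidence scoring described above; the per-frame log-likelihoods are randomly generated stand-ins and the threshold is an invented example value:

        import numpy as np

        def utterance_confidence(ll_forced, ll_free):
            """Log likelihood ratio confidence: forced alignment to the transcript
            vs. an unconstrained recogniser, normalised by utterance length."""
            return (np.sum(ll_forced) - np.sum(ll_free)) / len(ll_forced)

        # Hypothetical per-frame log-likelihoods for one utterance
        rng = np.random.default_rng(0)
        ll_forced = rng.normal(-5.0, 1.0, 300)   # forced alignment score per frame
        ll_free = rng.normal(-4.8, 1.0, 300)     # free-phone-loop score per frame

        score = utterance_confidence(ll_forced, ll_free)
        flagged = score < -0.5   # below a tuned threshold: manual inspection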

  • A Preliminary Study on Applying the Conditional Modeling to Automatic Dialect Classification

    Publication Year: 2006, Page(s): 190-193

    This paper addresses advances in unsupervised dialect classification, where no transcripts are available for either the training or the testing data. In this study, we view the classification problem in a recognition-based way instead of the conventional generative-model-based approach, and try to bypass the unknown-transcript problem. The new algorithm is based on a conditional model and has two notable advantages: first, it can train a statistical model without transcripts, so it works in our transcript-free classification problem; second, the conditional model allows arbitrary feature representations, so it can encode more discriminative features than generative models such as the hidden Markov model (HMM), which must use independent, local features due to model restrictions. The conditional model used in this study is the conditional random field (CRF). A further study on combining the generative and conditional models is presented. In a Spanish dialect classification evaluation, the CRF and the combined modeling technique show some interesting results.
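
    For readers unfamiliar with conditional random fields, the following shows generic linear-chain CRF training with the sklearn-crfsuite package; it illustrates the arbitrary, overlapping per-frame features the abstract contrasts with HMM assumptions, not the paper's transcript-free training scheme. The features and labels are invented:

        import sklearn_crfsuite  # pip install sklearn-crfsuite

        # Each frame is a dict of arbitrary (possibly overlapping) features;
        # a CRF places no independence restrictions on them.
        X_train = [[{"mfcc_mean": 0.2, "delta": 0.1},
                    {"mfcc_mean": 0.5, "delta": -0.2}]]
        y_train = [["dialect_A", "dialect_A"]]

        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
        crf.fit(X_train, y_train)
        print(crf.predict([[{"mfcc_mean": 0.4, "delta": 0.0}]]))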

  • Analysis of Harmonic and Subharmonic Effects in a Transversal Flute Made from Heracleum Laciniatum

    Publication Year: 2006, Page(s): 194-197

    In this paper, we present an analysis of recordings of a transversal flute made from the dried stem of Heracleum laciniatum (Tromsoe palm or hogweed). While the lower octave exhibits conventional harmonic spectra, the upper octave surprisingly includes subharmonic components. We believe that the subharmonic contributions are due to nonlinear oscillations of the flute material. A time-frequency analysis of the onset of the sound shows that the subharmonic oscillations are present from the very beginning, as opposed to the higher harmonics. A bispectral analysis indicates that triplets of frequency components are phase locked.
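
    The kind of onset analysis described can be reproduced in outline with a short-time Fourier transform; the synthetic tone below (fundamental, one harmonic, and a subharmonic at f0/2) is an invented stand-in for the flute recordings:

        import numpy as np
        from scipy.signal import spectrogram

        fs = 44100
        t = np.arange(int(0.5 * fs)) / fs
        f0 = 880.0
        x = (np.sin(2 * np.pi * f0 * t)
             + 0.4 * np.sin(2 * np.pi * 2 * f0 * t)       # harmonic
             + 0.3 * np.sin(2 * np.pi * 0.5 * f0 * t))    # subharmonic at f0/2

        f, tt, Sxx = spectrogram(x, fs=fs, nperseg=2048, noverlap=1536)
        # Ridges in Sxx along the frequency axis reveal the harmonic and
        # subharmonic components and when each appears after the onset.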

  • A model for measuring Quality of Experience

    Publication Year: 2006, Page(s): 198-201
    Cited by: Papers (9)

    In this paper we present a model for measuring the quality of experience (QoE) of multimedia services. Our model consists of measurable and non-measurable parameters and describes a framework for quantifying the model parameters by transformation into an XML document with discrete values. The model has been validated in a four-week field trial using 3G mobile phones for a video-on-demand service and mobile TV.
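
    A minimal sketch of the quantification step, assuming invented parameter names and bin thresholds: raw measurements are mapped onto discrete levels and serialized as XML:

        import xml.etree.ElementTree as ET

        def to_level(value, thresholds):
            """Quantise a raw measurement onto a discrete level 1..len(thresholds)+1."""
            return 1 + sum(value >= t for t in thresholds)

        # Hypothetical measurements for one session (names and bins invented)
        session = {
            "video_bitrate_kbps": to_level(384.0, [128, 256, 384]),  # higher is better
            "startup_delay_s": to_level(-3.2, [-8.0, -5.0, -2.0]),   # negated: lower is better
            "mos_rating": 4,                                         # already discrete
        }

        root = ET.Element("qoe_session")
        for name, level in session.items():
            ET.SubElement(root, "parameter", name=name, level=str(level))
        print(ET.tostring(root, encoding="unicode"))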

  • Automatic Removal of Ocular Artifacts in the EEG without an EOG Reference Channel

    Publication Year: 2006, Page(s): 130-133
    Cited by: Papers (8)

    This paper presents a method for removing electro-ocular (EOG) artifacts from the electroencephalogram (EEG). The procedure is based on blind source separation (BSS) and, in contrast to methods already available in the literature, it is completely automated and does not require peri-ocular EOG electrodes. The proposed approach removed most EOG artifacts in six long-term EEG recordings containing epileptic seizures without distorting the recorded ictal activity.
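
    A minimal BSS-based sketch of the idea, using FastICA and kurtosis as a stand-in criterion for flagging ocular components automatically (the paper's actual component-selection rule is not reproduced here); the EEG data is random:

        import numpy as np
        from scipy.stats import kurtosis
        from sklearn.decomposition import FastICA

        # Random stand-in EEG: 8 channels x 10000 samples
        eeg = np.random.default_rng(0).standard_normal((8, 10000))

        ica = FastICA(n_components=8, random_state=0)
        sources = ica.fit_transform(eeg.T)          # (samples, components)

        # Stand-in criterion: ocular components tend to be heavy-tailed
        bad = kurtosis(sources, axis=0) > 5.0
        sources[:, bad] = 0.0                       # zero the flagged components
        cleaned = ica.inverse_transform(sources).T  # back to channel space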

  • Unsupervised Change-Detection in Color Fundus Images of the Human Retina

    Publication Year: 2006, Page(s): 134-137
    Cited by: Papers (1) | Patents (1)

    The aim of this paper is to develop an automatic method for detecting the changes, in terms of white and red spots, that occur in multitemporal digital images of the fundus of the human retina. The images are acquired from the same patient at different times by a color fundus camera. The proposed approach is unsupervised and based on a minimum-error thresholding technique. This technique is applied both to separate the "change" and "no-change" classes in a suitably defined difference image, and to distinguish among different typologies of change. The algorithm is tested on 10 multitemporal pairs of images. A quantitative assessment of the change-detection performance suggests that the method provides accurate change maps, although they may be affected by misregistration errors or calibration/acquisition artifacts.
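
    Minimum-error thresholding in the Kittler-Illingworth sense can be sketched as follows; the difference image is random stand-in data:

        import numpy as np

        def kittler_illingworth(image, bins=256):
            """Minimum-error thresholding: model the histogram as two Gaussians
            and pick the threshold minimising the classification-error criterion."""
            hist, edges = np.histogram(image, bins=bins)
            hist = hist.astype(float) / hist.sum()
            centers = (edges[:-1] + edges[1:]) / 2
            best_T, best_J = None, np.inf
            for k in range(1, bins - 1):
                p1, p2 = hist[:k].sum(), hist[k:].sum()
                if p1 < 1e-9 or p2 < 1e-9:
                    continue
                m1 = (hist[:k] * centers[:k]).sum() / p1
                m2 = (hist[k:] * centers[k:]).sum() / p2
                s1 = (hist[:k] * (centers[:k] - m1) ** 2).sum() / p1
                s2 = (hist[k:] * (centers[k:] - m2) ** 2).sum() / p2
                if s1 < 1e-9 or s2 < 1e-9:
                    continue
                J = (1 + 2 * (p1 * np.log(np.sqrt(s1)) + p2 * np.log(np.sqrt(s2)))
                       - 2 * (p1 * np.log(p1) + p2 * np.log(p2)))
                if J < best_J:
                    best_J, best_T = J, centers[k]
            return best_T

        # Difference image between two registered fundus images (stand-in data)
        rng = np.random.default_rng(1)
        diff = np.abs(rng.random((64, 64)) - rng.random((64, 64)))
        change_map = diff > kittler_illingworth(diff)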

  • Feature Selection for Morphological Feature Extraction using Random Forests

    Publication Year: 2006, Page(s): 138-141

    Morphological feature extraction (MFE) has been successfully used to increase classification accuracy and reduce the noise level in the classification of aerial images. In this paper we explore feature selection and extraction for MFE using random forests (RFs) for classification and feature selection. The approach is compared to MFE from principal components extracted from the data by principal component analysis (PCA), which has been successful in the past. The experimental results presented in this paper show that estimating the most important features of the data set using RFs and selecting a few of them for MFE yields equal or better accuracies than using PCs.
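
    A minimal sketch of the RF-based selection step with scikit-learn; the data and the choice of keeping five features are illustrative assumptions:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Stand-in data: 500 pixels x 20 spectral/morphological features
        rng = np.random.default_rng(1)
        X = rng.standard_normal((500, 20))
        y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int)  # labels driven by features 3 and 7

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        order = np.argsort(rf.feature_importances_)[::-1]
        X_selected = X[:, order[:5]]   # keep the few most important features for MFE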

  • Hyperspectral Algorithm for Mapping Tissue Oxygen Saturation

    Publication Year: 2006, Page(s): 142-145

    A simple algorithm based on computational estimates of spectral feature amplitudes in hyperspectral images is described for mapping oxygen delivery in tissue. Features are determined from contiguous image bands spanning the prominent absorption wavelengths of hemoglobin. Pixel values along the z-axis at each image coordinate establish a spectral curve. A reference baseline is obtained by linear interpolation between isosbestic points on the curve, which are insensitive to changes in the oxygen saturation of the blood. An oxygen-sensitive vector is determined from the areas between the recorded curve and the baseline, which estimates the relative amounts of oxyhemoglobin and deoxyhemoglobin that contribute to the blood spectrum. Corrections are applied to account for variation in total light between recordings, and for the volume of blood at each pixel location in the recording. Here we apply the method to map relative oxygen saturation in the optic nerve head and overlying retinal vessels at normal and elevated intraocular pressure.
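
    A sketch of the baseline-and-areas computation for one pixel, assuming approximate hemoglobin isosbestic wavelengths and an invented spectral curve:

        import numpy as np

        # Spectral curve for one pixel over 500-600 nm (stand-in data)
        wl = np.arange(500, 601)                           # wavelengths in nm
        spectrum = 0.5 + 0.1 * np.sin((wl - 500) / 12.0)

        # Approximate hemoglobin isosbestic wavelengths (nm)
        iso = [500, 529, 545, 570, 584]
        idx = [np.searchsorted(wl, w) for w in iso]

        # Reference baseline: linear interpolation between the isosbestic points
        baseline = np.interp(wl, wl[idx], spectrum[idx])

        # Oxygen-sensitive vector: signed area between curve and baseline
        # per inter-isosbestic segment (1 nm spacing, so a plain sum suffices)
        areas = [np.sum((spectrum - baseline)[a:b])
                 for a, b in zip(idx[:-1], idx[1:])]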

  • Eukaryotic Gene Prediction by Spectral Analysis and Pattern Recognition Techniques

    Publication Year: 2006, Page(s): 146-149

    The problem of computational gene prediction in eukaryotic DNA is investigated. The discrete Fourier transform is used to reveal the period-3 periodicity that is present in the essential subregions of a gene. We introduce a novel method that predicts the position of genes in an optimal way (in the sense of minimal error probability) based on the complex Fourier values at the frequency 1/3. Our method is based on training and testing a Bayesian classifier. We simulate gene sequences for training, apply the Fourier transform to the sequences, extract feature vectors from the spectral representation of the binary sequences, and train classifiers to discriminate coding from non-coding regions in the sequence. The classifier is tested on a real gene sequence where the coding and non-coding regions are known.
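
    The period-3 measure at the heart of the method (the squared DFT magnitude at frequency 1/3, summed over the four binary indicator sequences) can be computed as follows; the window length, step and test sequence are illustrative:

        import numpy as np

        def period3_spectrum(seq, win=351, step=12):
            """Sliding-window power at DFT frequency 1/3, summed over the
            binary indicator sequence of each base: the classic coding measure."""
            scores = []
            for start in range(0, len(seq) - win + 1, step):
                w = seq[start:start + win]
                s = 0.0
                for base in "ACGT":
                    u = np.array([c == base for c in w], dtype=float)
                    X = np.sum(u * np.exp(-2j * np.pi * np.arange(win) / 3))
                    s += abs(X) ** 2
                scores.append(s)
            return np.array(scores)

        # Illustrative sequence with an artificial period-3 region in the middle
        rng = np.random.default_rng(2)
        seq = "".join(rng.choice(list("ACGT"), 3000))
        seq = seq[:1000] + ("ATG" * 300) + seq[2000:]   # crude "coding" insert
        scores = period3_spectrum(seq)                  # peaks over the insert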

  • Buried Tag Identification with a new RBF Classifier

    Publication Year: 2006, Page(s): 150-153
    Cited by: Papers (1)

    This article presents a new neural classifier based on an RBF network. The classifier improves the recognition rate while markedly reducing the number of hidden-layer neurons. It is a very general and very simple RBF classifier that requires no tuning parameters and offers an excellent ratio of performance to neuron count. A comparative study of its performance is presented and illustrated with examples on real databases.
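
    The paper's specific classifier is not reproduced here, but a generic RBF network (Gaussian hidden layer plus a least-squares linear output layer) can be sketched as follows on invented two-class data:

        import numpy as np

        def rbf_features(X, centers, gamma):
            """Gaussian RBF activations of X w.r.t. the hidden-layer centres."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)

        centers = X[::10]                       # a small set of hidden neurons
        H = rbf_features(X, centers, gamma=0.5)
        W, *_ = np.linalg.lstsq(H, np.eye(2)[y], rcond=None)  # linear output layer
        pred = rbf_features(X, centers, 0.5) @ W
        accuracy = np.mean(pred.argmax(1) == y)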

  • Anomaly detection by auto-association

    Publication Year: 2006, Page(s): 154-157

    Anomaly detectors (or novelty detectors) are systems for detecting behaviour that deviates from "normality", and are useful in a wide range of surveillance, monitoring and diagnosis applications. Feed-forward auto-associative neural networks have, in several studies, been shown to be effective anomaly detectors, although they have a tendency to produce false negatives. Existing methods rely on anomalous examples (counter-examples) during training to prevent this problem. However, counter-examples may be hard to obtain in practical anomaly detection scenarios. We therefore propose a training scheme based on regularisation, which both reduces the problem of false negatives and speeds up the training process, without relying on counter-examples. Experimental results on benchmark machine learning problems verify the potential of the proposed approach.
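
    A minimal sketch of an auto-associator anomaly detector with L2 regularisation (scikit-learn's alpha parameter), trained on invented "normal" data only; the percentile threshold is an illustrative choice:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)
        X_train = rng.normal(0, 1, (500, 10))   # "normal" data, no counter-examples

        # Auto-associator: learn to reproduce the input through a bottleneck
        ae = MLPRegressor(hidden_layer_sizes=(4,), alpha=1e-2, max_iter=2000,
                          random_state=0).fit(X_train, X_train)

        def anomaly_score(X):
            """Reconstruction error; large error = far from 'normality'."""
            return np.mean((ae.predict(X) - X) ** 2, axis=1)

        threshold = np.percentile(anomaly_score(X_train), 99)
        is_anomaly = anomaly_score(rng.normal(3, 1, (10, 10))) > threshold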

  • Wavelets and Support Vector Machine for Forecasting the Meteorological Pollution

    Publication Year: 2006, Page(s): 158-161

    The paper presents a method for daily air pollution forecasting using a support vector machine (SVM) and wavelet decomposition. The considerations are presented for NO2, CO, SO2 and dust concentrations. The prediction is made on the basis of past pollution observations as well as meteorological parameters such as wind, temperature, humidity and pressure. We propose a forecasting approach applying a neural network of the SVM type, working in regression mode, to a wavelet decomposition of the measured time-series data. The paper presents the results of numerical experiments based on measurements made by meteorological stations situated in the northern region of Poland.
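
    A sketch of the pipeline, assuming PyWavelets and scikit-learn, an invented pollutant series, and illustrative window and kernel settings: wavelet coefficients of a history window serve as the SVR regression features:

        import numpy as np
        import pywt
        from sklearn.svm import SVR

        # Stand-in daily pollutant series (e.g. NO2 concentration)
        rng = np.random.default_rng(5)
        series = np.sin(np.arange(400) / 7.0) + 0.3 * rng.standard_normal(400)

        def features(window):
            """Wavelet decomposition of the history window as a feature vector."""
            return np.concatenate(pywt.wavedec(window, "db4", level=2))

        lag = 32
        X = np.array([features(series[i:i + lag]) for i in range(len(series) - lag)])
        y = series[lag:]                         # next-day value as the target

        model = SVR(kernel="rbf", C=10.0).fit(X[:-50], y[:-50])
        pred = model.predict(X[-50:])            # one-day-ahead forecasts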

  • Forward Segmented Wavelet Transform

    Publication Year: 2006, Page(s): 162-165

    The new method of the segmented wavelet transform (SegWT) makes it possible to compute the discrete-time wavelet transform of a signal exactly, segment by segment. This means that the method can be utilized for wavelet-type processing of a signal in real time, or in cases where a long signal must be processed (not necessarily in real time) but there is insufficient memory capacity for it (for example, in signal processors). The signal can then be processed part by part with low memory costs by the new method. The method is suitable for universal use wherever a signal has to be processed via modification of its wavelet coefficients (e.g. signal denoising, compression, music or speech processing, alternative modulation techniques for xDSL systems, image processing and compression). In the paper, the principle of the forward segmented wavelet transform is described.
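
    For a single decomposition level, the segment-by-segment idea can be demonstrated exactly with overlap-save filtering; this is a zero-padded, one-level sketch, not the authors' full multi-level SegWT:

        import numpy as np

        h = np.array([0.48296291, 0.83651630, 0.22414387, -0.12940952])  # db2 lowpass

        def dwt_lowpass(x):
            """Reference single-level lowpass analysis: filter, keep odd samples."""
            return np.convolve(x, h)[1::2]

        def dwt_lowpass_segmented(x, seg=64):
            """Same coefficients computed segment by segment: each segment is
            extended with the last len(h)-1 samples of the previous one."""
            m, hist, out = len(h), np.zeros(len(h) - 1), []
            for s in range(0, len(x), seg):
                ext = np.concatenate([hist, x[s:s + seg]])
                valid = np.convolve(ext, h)[m - 1:len(ext)]   # y[s] .. y[s+L-1]
                out.append(valid[(s + 1) % 2::2])             # globally odd indices
                hist = ext[-(m - 1):]
            return np.concatenate(out)

        x = np.random.default_rng(10).standard_normal(512)
        ref, segd = dwt_lowpass(x), dwt_lowpass_segmented(x)
        assert np.allclose(segd, ref[:len(segd)])             # identical coefficients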

  • On the Significance of the Zerotree Hypothesis for Wavelet-Based Image Coding

    Publication Year: 2006, Page(s): 166-169

    The significance of the zerotree hypothesis and its importance for the coding efficiency of zerotree-based wavelet image coding schemes is discussed. We apply specific coefficient permutations within the JPEG2000 and SPIHT coding pipelines which allow us to control the amount of inter- and intra-subband correlation present in the visual data to be compressed. The results document and emphasize the importance of the zerotree hypothesis for image coding and suggest that it has a high likelihood of being correct.
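
    The hypothesis itself is easy to probe empirically: count how often an insignificant parent coefficient has only insignificant children. The sketch below, assuming PyWavelets, a smooth stand-in image and an illustrative threshold, measures that rate:

        import numpy as np
        import pywt

        # Smooth stand-in image, where zerotrees should be plentiful
        rng = np.random.default_rng(11)
        img = (np.add.outer(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
               + 0.01 * rng.standard_normal((256, 256)))
        coeffs = pywt.wavedec2(img, "bior4.4", level=3)

        T = 0.1                                  # significance threshold (illustrative)
        agree = total = 0
        for lev in range(1, len(coeffs) - 1):    # parent level -> child level
            for parent, child in zip(coeffs[lev], coeffs[lev + 1]):
                p_insig = np.abs(parent) < T
                c = np.abs(child) < T
                # each parent owns a 2x2 block of children
                c_all = c[0::2, 0::2] & c[0::2, 1::2] & c[1::2, 0::2] & c[1::2, 1::2]
                hh = min(p_insig.shape[0], c_all.shape[0])
                ww = min(p_insig.shape[1], c_all.shape[1])
                agree += np.sum(p_insig[:hh, :ww] & c_all[:hh, :ww])
                total += np.sum(p_insig[:hh, :ww])
        zerotree_rate = agree / total            # near 1.0 supports the hypothesis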

  • Conditioning Lofargrams Using Empirical Mode Decomposition

    Publication Year: 2006, Page(s): 170-173

    Recorded sonar data is processed in many different ways to extract target information from the source signals. Separating different sources of sound, or identifying vessels by examining the emitted tonals, is often done in the time-frequency domain, e.g. using the lofargram. Narrowband components appear as long-duration lines and transients as broadband, short-duration pulses. The transients may be generated by the same source as the tonal lines or by other sources in the vicinity, or they can be part of the ambient noise. If transients and narrowband components are mixed in the estimation window, distortion and masking of the lines may occur, especially in low-SNR regimes. A method for improving the visibility of relevant spectral lines in a lofargram is applied: instead of using filtering or outlier-rejection methods to condition the data, the signal is decomposed using empirical mode decomposition.
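
    A sketch of the conditioning idea, assuming the PyEMD package (pip install EMD-signal) and an invented tonal-plus-transient signal; which IMFs to discard is an illustrative choice:

        import numpy as np
        from PyEMD import EMD   # pip install EMD-signal

        # Stand-in sonar signal: a stable tonal plus a broadband transient
        fs = 1000
        t = np.arange(4 * fs) / fs
        x = np.sin(2 * np.pi * 57 * t)
        x[1800:1850] += 3.0 * np.random.default_rng(6).standard_normal(50)

        imfs = EMD()(x)                  # intrinsic mode functions, fastest first

        # Conditioning: drop the fastest IMFs, which capture the broadband
        # transient, then rebuild the signal before forming the lofargram
        conditioned = imfs[2:].sum(axis=0)   # cut-off index is illustrative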

  • Design of regularity constrained linear phase modulated lapped transforms

    Publication Year: 2006, Page(s): 174-177

    The regularity-constrained linear phase filter banks reported recently are based on generalized biorthogonal lapped transforms. They do not have a modulated structure, and hence all the basis functions must be optimized separately; design and implementation complexity increases with the size of the block transform. In this paper, a cosine-modulated approach for the design of orthogonal modulated lapped transforms with basis functions having regularity, arbitrary length and linear phase is presented. Due to the modulated structure, only the prototype filter needs to be optimized to satisfy all the design criteria. Besides simplifying the design process, these transforms have an efficient implementation based on polyphase structures and discrete cosine transforms. The added regularity enhances the performance of the transform in capturing dc as well as polynomial signal variations, leading to smooth reconstructed signals even at low bit rates.
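
    The essence of a cosine-modulated lapped transform, that the entire basis follows from one prototype, can be sketched as follows; the classic MLT sine window stands in for the regularity-constrained prototypes designed in the paper:

        import numpy as np

        def cosine_modulated_bank(prototype, M):
            """Analysis basis of an M-band cosine-modulated lapped transform,
            built from a single prototype lowpass window of length 2M."""
            n = np.arange(len(prototype))
            k = np.arange(M)[:, None]
            # Standard MLT/MDCT-type modulation of the prototype
            return prototype * np.cos(np.pi / M * (k + 0.5) * (n + 0.5 + M / 2))

        M = 8
        prototype = np.sin(np.pi / (2 * M) * (np.arange(2 * M) + 0.5))  # MLT sine window
        basis = cosine_modulated_bank(prototype, M)   # shape (M, 2M); only the
                                                      # prototype needs optimising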

  • Multi-frequency and Multi-channel Bio-impedance Measurement Solution

    Publication Year: 2006, Page(s): 178-181
    Cited by: Papers (3)

    A DSP-based solution for a 16-bit multi-frequency and multi-channel bio-impedance measurement unit, using an under-sampling discrete Fourier transform (DFT), has been proposed and realized. The unit has multiple outputs (8) and inputs (4), and operates at frequencies up to 5 MHz. Some performance issues of the proposed solution are discussed, mostly related to noise aliasing and to accuracy as a function of the modulation signal frequency.
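
    The under-sampling DFT idea can be sketched in a few lines: a single DFT bin recovers the amplitude and phase of an excitation deliberately sampled below its Nyquist rate. The frequencies and record length are illustrative, chosen for coherent sampling:

        import numpy as np

        fs = 100e3                  # ADC sampling rate, well below the excitation
        f_exc = 1.02e6              # excitation frequency (above Nyquist)
        N = 100                     # coherent record length

        t = np.arange(N) / fs
        x = 0.8 * np.cos(2 * np.pi * f_exc * t + 0.3)   # sampled response (stand-in)

        # Under-sampling folds f_exc to an alias inside the first Nyquist zone
        f_alias = abs(f_exc - round(f_exc / fs) * fs)   # 20 kHz here
        k = int(round(f_alias * N / fs))                # DFT bin of the alias

        X = np.sum(x * np.exp(-2j * np.pi * k * np.arange(N) / N))
        amplitude, phase = 2 * abs(X) / N, np.angle(X)  # recovers 0.8 and 0.3 rad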

  • Study of Image Retrieval Based on Feature Vectors in Compressed Domain

    Publication Year: 2006, Page(s): 202-205
    Cited by: Papers (1)

    An image retrieval method exploiting information from frequency components in compressed blocks is proposed in this article. In this method, images are first processed with a block transform and quantization. Subsequently, a binary feature vector (BFV) is formulated to represent the local visual information, and histograms are generated from the BFVs to provide a statistical description of their distribution. The BFV concept is then extended to a ternary feature vector (TFV). The BFV and TFV histograms are used for image database retrieval. Three different feature vector schemes are proposed and their performance is investigated. Good retrieval results are obtained for a standard public face image database.
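
    A sketch of the BFV construction, assuming a block DCT, sign-quantised AC coefficients, and a simple flattened coefficient ordering as a stand-in for zig-zag scanning; the block size and code length are illustrative:

        import numpy as np
        from scipy.fft import dctn

        def bfv_histogram(img, block=8, n_coeffs=9):
            """Binary feature vectors from block-DCT coefficients: each block is
            summarised by the signs of its first AC coefficients, the image by a
            histogram over the resulting binary codes."""
            codes = []
            for i in range(0, img.shape[0] - block + 1, block):
                for j in range(0, img.shape[1] - block + 1, block):
                    c = dctn(img[i:i + block, j:j + block], norm="ortho")
                    ac = c.flatten()[1:n_coeffs + 1]    # skip DC (rough ordering)
                    bits = (ac > 0).astype(int)
                    codes.append(int("".join(map(str, bits)), 2))
            hist, _ = np.histogram(codes, bins=2 ** n_coeffs, range=(0, 2 ** n_coeffs))
            return hist / len(codes)

        # Retrieval: compare query and database histograms, e.g. by L1 distance
        rng = np.random.default_rng(12)
        img_a, img_b = rng.random((64, 64)), rng.random((64, 64))
        d = np.abs(bfv_histogram(img_a) - bfv_histogram(img_b)).sum()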

  • Fast Image Retrieval Based on Attributes of the Human Visual System

    Publication Year: 2006, Page(s): 206-209
    Cited by: Papers (2)

    In this paper we present a new method for content-based image retrieval (CBIR), based on the retinal signal processing of the human visual system (HVS). A center-surround operator similar to the receptive fields of the ganglion cells of the retina is employed to create a new form of color histogram, the center-surround histogram (CSH). Unlike other proposed color histograms, the CSH takes into consideration only the visual signal surrounding the zero-crossings of an image. This reduces the amount of visual information processed and minimizes the computational burden. Furthermore, a combination of spatial and chromatic information of the image is also achieved. The method is compared to other contemporary image retrieval methods, exhibiting better results in shorter computational times.
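
    A sketch of the ingredients, assuming a difference-of-Gaussians centre-surround operator and an invented RGB image; the zero-crossing band and bin counts are illustrative, not the paper's exact CSH construction:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def center_surround(channel, sigma_c=1.0, sigma_s=3.0):
            """Difference-of-Gaussians approximation of a ganglion-cell
            centre-surround receptive field."""
            return gaussian_filter(channel, sigma_c) - gaussian_filter(channel, sigma_s)

        img = np.random.rand(128, 128, 3)          # stand-in RGB image
        resp = center_surround(img.mean(axis=2))

        # Histogram only the pixels near the zero-crossings of the response
        near_zero = np.abs(resp) < 0.01 * np.abs(resp).max()
        csh = [np.histogram(img[..., c][near_zero], bins=16, range=(0, 1))[0]
               for c in range(3)]                  # per-channel centre-surround histogram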

  • Edge Relaxation in Images using Directional Hierarchical Image Decomposition

    Publication Year: 2006, Page(s): 210-213

    This paper presents a new hierarchical method of edge relaxation using an edge confidence measure. The proposed method is adaptive and based on directional hierarchical image decomposition and an edge-connecting algorithm. It is shown that this combination has low sensitivity to noise and high robustness to outliers, and provides a high-quality edge-connection mechanism.

  • Leukemia Cell Recognition with Zernike Moments of Holographic Images

    Publication Year: 2006, Page(s): 214-217
    Cited by: Papers (3)

    In this paper, we use a digital holographic method to classify and recognize an unknown leukemia cell. This is a non-invasive way of imaging biological samples under the microscope in order to recognize them. We generate the hologram from 2D digital images of blood cells and reconstruct the leukemia cell through computer simulation, utilizing approximate Fresnel digital holography to simulate the optical diffraction patterns of the hologram. Feature selection is performed on the computer-reconstructed holographic image, using Zernike moments as the features of the digital image. We take advantage of the rotation-invariant property of the Zernike moments in recognizing the leukemia cell, whose rotational orientation is unknown, and compute the Zernike moments from scale- and translation-invariant geometric moments. To classify the leukemia cell, we use the minimum mean distance and K-nearest neighbor methods on the invariant features.
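
    A sketch of the feature-and-classify step, assuming the mahotas package for Zernike moment magnitudes and invented class means; the paper instead derives the moments from invariant geometric moments:

        import numpy as np
        import mahotas  # pip install mahotas

        # Stand-in for a reconstructed holographic cell image
        img = np.zeros((128, 128))
        img[40:90, 50:85] = 1.0

        # Rotation-invariant Zernike moment magnitudes inside a disc
        feats = mahotas.features.zernike_moments(img, radius=60, degree=8)

        # Minimum-mean-distance classification against per-class mean features
        class_means = {"healthy": feats * 0.9, "leukemia": feats * 1.1}  # hypothetical
        label = min(class_means, key=lambda c: np.linalg.norm(feats - class_means[c]))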

  • Stationarity Analysis of Ambient Noise in the Baltic Sea

    Publication Year: 2006, Page(s): 218-221
    Cited by: Papers (1)

    Most signal processing techniques are only valid if the assumption of stationarity holds; this is the basis for making reliable and consistent estimates. Stochastic processes can be categorised by their stationarity properties, ranging from stationary to non-stationary. The degree of stationarity has implications for a number of factors in signal processing, but mainly for the reliability of any estimate: estimates from highly non-stationary data can at times be so poor that the variance of the estimate is far greater than the estimate itself. In this paper, the degree of stationarity is addressed from a stationarity-length point of view, and sonar data is tested using the Kolmogorov-Smirnov two-sample test.
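
    A sketch of a stationarity-length estimate built on the Kolmogorov-Smirnov two-sample test from SciPy; the block size, significance level and test signal are illustrative:

        import numpy as np
        from scipy.stats import ks_2samp

        def stationarity_length(x, base=1024):
            """Grow the analysis window until a KS two-sample test says the
            newest block no longer follows the reference distribution."""
            ref, n = x[:base], base
            while n + base <= len(x):
                stat, p = ks_2samp(ref, x[n:n + base])
                if p < 0.05:          # distributions differ: stationarity ends here
                    break
                n += base
            return n

        # Noise record whose variance changes halfway through
        rng = np.random.default_rng(7)
        x = np.concatenate([rng.normal(0, 1, 8192), rng.normal(0, 2, 8192)])
        print(stationarity_length(x))   # roughly 8192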

  • Preconditioner structures for the CLMS adaptive filtering algorithm

    Publication Year: 2006, Page(s): 222-225
    Cited by: Papers (1)

    LMS filtering can be viewed as solving the Wiener-Hopf equation iteratively using Richardson's iteration with the identity matrix as a preconditioner. The ideal preconditioner in this situation is the inverse of the autocorrelation matrix of the input signal, which is why LMS is the optimal adaptive filter for white input signals. When the input signal is not white, one can improve the convergence of the adaptive filter by specifying a fixed preconditioning matrix other than the identity, using approximate a priori knowledge of the input signal's autocorrelation. This is the main idea behind the CLMS algorithm. We develop methods to obtain such preconditioning matrices with different structures that also make the algorithm computationally efficient, and test these matrices for convergence rate on AR(1) signals.
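
    A minimal sketch of preconditioned LMS on an AR(1) input, using the inverse of the theoretical AR(1) autocorrelation matrix as an a priori preconditioner; the unknown system and step size are invented, and the efficient structured preconditioners of the paper are not reproduced:

        import numpy as np

        def preconditioned_lms(x, d, P, mu=0.05, order=8):
            """LMS with a fixed preconditioner P: w += mu * P @ (e * u).
            P = I gives plain LMS; P ~ R^{-1} approximates the ideal case."""
            w = np.zeros(order)
            for n in range(order, len(x)):
                u = x[n - order + 1:n + 1][::-1]   # regressor, newest sample first
                e = d[n] - w @ u
                w += mu * (P @ (e * u))
            return w

        # AR(1) input: strongly coloured, where plain LMS converges slowly
        rng = np.random.default_rng(8)
        x = np.zeros(5000)
        for n in range(1, len(x)):
            x[n] = 0.9 * x[n - 1] + rng.standard_normal()

        h = np.array([1.0, -0.5, 0.25, 0, 0, 0, 0, 0])   # unknown system (invented)
        d = np.convolve(x, h)[:len(x)]

        # Theoretical AR(1) autocorrelation matrix: r[k] = 0.9^|k| / (1 - 0.81)
        R = 0.9 ** np.abs(np.subtract.outer(np.arange(8), np.arange(8))) / (1 - 0.81)
        w = preconditioned_lms(x, d, np.linalg.inv(R))   # converges towards h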

  • The Physical Meaning of Independent Components and Artifact Removal of Hyperspectral Data from Mars using ICA

    Publication Year: 2006, Page(s): 226-229

    The truth about which chemicals are to be found on the surface of Mars lies hidden in gigabytes of hyperspectral data; how to reveal it is the subject of this paper. Independent component analysis (ICA) is used for identification and classification of endmembers and for artifact removal. The classification results are compared with those of a wavelet classifier, and reference spectra are used for identification of known substances. CO2 ice, water ice and an intimate mixture of CO2 ice and dust are effectively found as independent components, but because of the high negative correlation between dust and CO2 ice, dust is not found as a separate component. ICA can be used to evaluate the atmospheric-effect removal currently in use and can help in this preprocessing. ICA can also be used for other artifacts, for example to find and clean corrupted channels and to detect the effect of sensor overlay. It is proposed to view the mixing matrix as a collection of independent component (IC) spectra, and to use this for automatic detection of known endmembers.
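
    A sketch of the basic ICA decomposition of a hyperspectral cube with scikit-learn, on random stand-in data; the columns of the mixing matrix play the role of the IC "endmember" spectra mentioned above:

        import numpy as np
        from sklearn.decomposition import FastICA

        # Stand-in hyperspectral cube: 100 x 100 pixels, 128 spectral channels
        cube = np.random.default_rng(9).random((100, 100, 128))

        X = cube.reshape(-1, cube.shape[-1])   # pixels x channels
        ica = FastICA(n_components=6, random_state=0)
        S = ica.fit_transform(X)               # abundance-like values per pixel

        maps = S.reshape(100, 100, -1)         # spatial distribution of each IC
        spectra = ica.mixing_                  # columns: IC "endmember" spectra,
                                               # comparable against reference spectra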
