
2011 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT)

14-17 Dec. 2011


Results 1 - 25 of 105
  • [Front cover]

    Publication Year: 2011 , Page(s): c1
  • Program

    Publication Year: 2011 , Page(s): 1 - 26
  • Organizers

    Publication Year: 2011 , Page(s): 1
  • Support from

    Publication Year: 2011 , Page(s): 1
  • Table of contents

    Publication Year: 2011 , Page(s): 1 - 9
  • Mechanical vibration signal compression by LOT-based subband coding

    Publication Year: 2011 , Page(s): 001 - 004

    A novel compression method for mechanical vibration signals is proposed in this paper. The vibration signal is first decomposed into subbands by means of the Lapped Orthogonal Transform (LOT). Next, adaptive bit allocation is performed on a per-subband basis and uniform quantization is applied within each subband. The method is applied to a large number of mechanical vibration signals acquired from aircraft engines and shows good results. Owing to the quality of the decoding, the reconstructed signals remain usable by post-compression processing, such as fault detection for health-monitoring purposes.

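    The pipeline in the abstract (transform, per-subband bit allocation, uniform quantization) can be sketched in a few lines. The sketch below substitutes a block DCT for the Lapped Orthogonal Transform (the LOT is not in SciPy) and uses a greedy variance-based allocator; the block size, bit budget, and test signal are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.fft import dct, idct

    def compress_subbands(signal, block=16, total_bits=128):
        # Block transform; a block DCT stands in for the LOT used in the paper.
        n = len(signal) // block
        blocks = signal[:n * block].reshape(n, block)
        subbands = dct(blocks, norm='ortho', axis=1).T   # coefficient k across blocks = subband k
        var = subbands.var(axis=1) + 1e-12

        # Greedy adaptive bit allocation: each extra bit goes to the subband
        # with the largest remaining distortion proxy (variance / 4^bits).
        bits = np.zeros(block, dtype=int)
        for _ in range(total_bits):
            k = int(np.argmax(var / 4.0 ** bits))
            bits[k] += 1

        # Uniform quantization within each subband (midpoint reconstruction).
        recon = np.zeros_like(subbands)
        for k in range(block):
            if bits[k] == 0:
                continue
            lo, hi = subbands[k].min(), subbands[k].max()
            step = (hi - lo) / 2 ** bits[k] or 1.0
            q = np.clip(np.floor((subbands[k] - lo) / step), 0, 2 ** bits[k] - 1)
            recon[k] = (q + 0.5) * step + lo
        return idct(recon.T, norm='ortho', axis=1).ravel()

    rng = np.random.default_rng(0)
    vib = np.sin(2 * np.pi * 180 * np.linspace(0, 1, 4096)) + 0.1 * rng.standard_normal(4096)
    out = compress_subbands(vib)
    err = vib[:out.size] - out
    print("SNR (dB):", round(10 * np.log10(vib[:out.size].var() / err.var()), 1))
    ```
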
  • An interactive city framework: Mobile Cloud Computing approach

    Publication Year: 2011 , Page(s): 005 - 010

    This paper presents a mobile augmented reality framework for an interactive three-dimensional city. One of the challenging issues in the design of scalable mobile augmented reality systems is the limited computational power of mobile devices. The proposed framework provides a scalable, service-based, open-source approach to integrating the real and virtual scenes. Its key strength is the adoption of an emerging and challenging approach, Mobile Cloud Computing, whose main advantage is mitigating mobile-device constraints, namely limited battery life and memory. The paper focuses on the proposed framework architecture and the rationale behind the design of its infrastructure to provide interactivity and scalability.

  • A Bayesian network-based tunable image segmentation algorithm for object recognition

    Publication Year: 2011 , Page(s): 011 - 016
    Cited by:  Papers (1)

    We present a Bayesian network-based tunable image segmentation algorithm that can be used to segment a particular object of interest (OOI). In tasks such as object recognition, semantically accurate segmentation of the OOI is a critical step. Because the OOI consists of different-looking fragments, traditional image segmentation algorithms based on the identification of homogeneous regions tend to oversegment it. The algorithm presented in this paper uses Multiple Instance Learning to learn prototypical representations of each fragment of the OOI and a Bayesian network to learn the spatial relationships that exist among those fragments. The Bayesian network, a probabilistic graphical model, in turn provides the evidence used to produce semantically accurate segmentations of future instances of the OOI. The key contribution of this paper is the inclusion of domain-specific information, in the form of spatial relationships, as an input to a conventional Bayesian network structure learning algorithm. Preliminary results indicate that the proposed method improves segmentation performance.

  • Content-based image copy detection using dual signatures

    Publication Year: 2011 , Page(s): 017 - 022
    Cited by:  Papers (1)

    We are interested in content-based copy detection of images as a means of protecting intellectual property. The proposed methodology uses the discrete cosine transform (DCT) of an averaged image to extract two complementary features, namely ordinal measures and sign information, yielding a dual signature, i.e., a compact feature vector that ensures efficient storage in the image database. Moreover, a specific similarity measurement scheme is designed to handle dual-signature comparison during image retrieval. Simulation results show that the proposed method outperforms two known copy detection methods in terms of retrieval accuracy. Many common image manipulations can be handled, such as noise addition, image resizing, gamma and contrast adjustment, slight shifting, image flipping and 180° rotation. The achieved retrieval rates are very high and confirm the superiority of the proposed scheme.

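    As a rough illustration of the dual-signature idea (not the authors' exact feature set), the sketch below averages an image onto an 8 × 8 grid, takes its 2-D DCT, and derives an ordinal (rank) measure plus sign bits from the AC coefficients; the 50/50 weighting in the similarity measure is an assumption.

    ```python
    import numpy as np
    from scipy.fft import dctn

    def dual_signature(img, grid=8):
        # Average the image onto a small grid, then take its 2-D DCT.
        h, w = img.shape
        small = img[:h // grid * grid, :w // grid * grid] \
            .reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
        ac = dctn(small, norm='ortho').ravel()[1:]      # drop the DC term
        ordinal = np.argsort(np.argsort(-np.abs(ac)))   # rank of each coefficient
        signs = (ac >= 0).astype(np.uint8)              # sign information
        return ordinal, signs

    def similarity(sig_a, sig_b):
        # Combine rank agreement and sign agreement into one score.
        (ord_a, sgn_a), (ord_b, sgn_b) = sig_a, sig_b
        rank_term = 1.0 - np.abs(ord_a - ord_b).mean() / len(ord_a)
        return 0.5 * rank_term + 0.5 * (sgn_a == sgn_b).mean()

    rng = np.random.default_rng(0)
    original = rng.uniform(0, 255, (256, 256))
    copy = original + rng.normal(0, 10, original.shape)   # noisy copy
    print(round(similarity(dual_signature(original), dual_signature(copy)), 3))
    ```
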
  • Simulation of the anti-collision process of RFID systems based on multiple access protocol modeling

    Publication Year: 2011 , Page(s): 023 - 028

    In this paper, the authors use multiple access protocols to analyze the problem of collision and collision avoidance in RFID systems. Treating user terminals as tags and the access point as the reader, they model the wireless communication channel by taking the capture effect into account; when the capture effect is not considered, the system reduces to a wired-communication model.

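    A minimal simulation of the wireless/wired distinction the abstract draws, assuming framed slotted ALOHA with a per-collision capture probability; the frame size, tag count, and capture probability below are arbitrary illustration values.

    ```python
    import numpy as np

    def framed_aloha(n_tags, frame_size, p_capture=0.0, rng=None):
        # One inventory round: each unidentified tag picks a slot uniformly at
        # random; a slot with exactly one reply succeeds, and a collided slot
        # may still succeed with probability p_capture (the capture effect on a
        # wireless channel; p_capture = 0 models the wired, no-capture case).
        rng = rng or np.random.default_rng()
        slots = rng.integers(0, frame_size, size=n_tags)
        counts = np.bincount(slots, minlength=frame_size)
        singles = (counts == 1).sum()
        captured = rng.random((counts > 1).sum()) < p_capture
        return singles + captured.sum()

    for p in (0.0, 0.3):
        done = np.mean([framed_aloha(100, 128, p) for _ in range(1000)])
        print(f"p_capture={p}: {done:.1f} tags identified per frame")
    ```
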
  • Performance analysis of basic tree protocols and ALOHA-based anti-collision algorithms used in RFID systems

    Publication Year: 2011 , Page(s): 029 - 034

    Radio Frequency Identification (RFID) is a popular automatic identification and emerging wireless technology with highly promising applications. In an RFID system, the anti-collision algorithm is one of the primary mechanisms for addressing multi-object identification. This paper focuses on a detailed study of the ALOHA anti-collision problem and a comparison with the basic binary tree protocols. Two tag estimation methods are investigated, and a performance analysis is carried out over key design parameters such as the probability of successful transmission, the collision ratio, the offered load, the system efficiency, the system throughput and the estimation accuracy of successful transmission.

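    Two of the quantities the abstract lists are easy to make concrete. For framed slotted ALOHA with n tags and L slots, the expected system efficiency is n/L * (1 - 1/L)^(n-1), maximized when L is close to n; and Schoute's classical backlog estimator infers roughly 2.39 tags behind each collision slot. A quick check:

    ```python
    def efficiency(n, L):
        # Expected fraction of slots carrying exactly one reply when n tags
        # each pick one of L slots uniformly at random.
        return n / L * (1 - 1 / L) ** (n - 1)

    def schoute_estimate(c_collision):
        # Schoute's tag estimate: ~2.39 tags per collision slot on average.
        return 2.39 * c_collision

    for L in (64, 128, 256):
        print(f"n=128 tags, L={L} slots -> efficiency {efficiency(128, L):.3f}")
    print("20 collision slots ->", schoute_estimate(20), "tags estimated in backlog")
    ```
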
  • Mining rare cases in post-operative pain by means of outlier detection

    Publication Year: 2011 , Page(s): 035 - 041

    Rare cases are often interesting to health professionals, physicians, researchers and clinicians as a way to reuse and disseminate experience in healthcare. However, mining, i.e. identifying, rare cases in electronic patient records is a non-trivial task. This paper investigates a number of well-known clustering algorithms and finally applies a second-order clustering approach that combines the Fuzzy C-means algorithm with hierarchical clustering. The approach was used to identify rare cases among 1572 patient cases in the domain of post-operative pain treatment. The results show that the approach enables the identification of rare cases in this domain; 18% of the cases were identified as rare.

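    A small sketch of the second-order idea, assuming a plain NumPy fuzzy C-means followed by Ward-linkage clustering of the FCM centers; the cluster counts, the 5% rarity threshold, and the synthetic data are assumptions for illustration, and whether the small group is isolated depends on initialization.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    def fuzzy_cmeans(X, c, m=2.0, iters=100, rng=None):
        # Plain NumPy fuzzy c-means: alternate membership and center updates.
        rng = rng or np.random.default_rng(0)
        centers = X[rng.choice(len(X), c, replace=False)]   # init from data points
        for _ in range(iters):
            d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
            u = 1.0 / d ** (2 / (m - 1))
            u /= u.sum(axis=1, keepdims=True)
            centers = (u.T ** m @ X) / (u.T ** m).sum(axis=1, keepdims=True)
        return centers, u

    # Second-order clustering: run FCM first, then cluster the FCM centers
    # hierarchically; cases attached to sparsely populated groups are "rare".
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (500, 4)), rng.normal(4, 0.3, (25, 4))])
    centers, u = fuzzy_cmeans(X, c=10)
    labels = fcluster(linkage(centers, method='ward'), t=3, criterion='maxclust')
    case_cluster = labels[u.argmax(axis=1)]
    sizes = np.bincount(case_cluster, minlength=labels.max() + 1)
    rare = np.isin(case_cluster, np.where(sizes < 0.05 * len(X))[0])
    print("cluster sizes:", sizes[1:], "| rare fraction:", f"{rare.mean():.1%}")
    ```
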
  • Improving N-Finder technique for extracting endmembers

    Publication Year: 2011 , Page(s): 042 - 049

    The N-FINDER algorithm is widely used for endmember extraction. One of its disadvantages is that implementations have long run times owing to its relatively high computational complexity. Reducing the size of the input data set (the hyperspectral image the algorithm works on) can reduce the overall run time of the algorithm. This paper provides a method for selecting a proper sample of the data set to work on. Using this reduction technique, a faster and statistically more accurate version of N-FINDER is presented.

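    The core N-FINDER step keeps the pixel set whose simplex has maximal volume, and the paper's speed-up comes from running it on a well-chosen sample. The sketch below uses a plain random sample and assumes the pixels have already been reduced to p - 1 dimensions (e.g., by PCA); the sample size and iteration count are illustrative.

    ```python
    import numpy as np

    def simplex_volume(E):
        # Volume of the simplex spanned by p endmembers in (p-1)-dim data is
        # proportional to |det([[1 ... 1], E^T])|.
        p = E.shape[0]
        return abs(np.linalg.det(np.vstack([np.ones(p), E.T])))

    def nfindr(pixels, p, sample=2000, iters=3, rng=None):
        # Work on a random sample of the (dimension-reduced) pixels instead of
        # the full image -- the run-time reduction discussed in the paper.
        rng = rng or np.random.default_rng(0)
        S = pixels[rng.choice(len(pixels), min(sample, len(pixels)), replace=False)]
        E = S[rng.choice(len(S), p, replace=False)].copy()
        for _ in range(iters):
            for i in range(p):          # try swapping each endmember in turn
                best = simplex_volume(E)
                for x in S:
                    trial = E.copy(); trial[i] = x
                    v = simplex_volume(trial)
                    if v > best:
                        E[i], best = x, v
        return E

    rng = np.random.default_rng(0)
    true_E = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # p=3 endmembers, 2-D
    abund = rng.dirichlet(np.ones(3), size=5000)
    pixels = abund @ true_E + rng.normal(0, 0.01, (5000, 2))
    print(nfindr(pixels, p=3).round(2))
    ```
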
  • A correlation-based algorithm for classifying technical articles

    Publication Year: 2011 , Page(s): 050 - 053

    An enormous amount of information is constantly generated by scientists in various branches of science, especially in the field of Biology. These research outcomes are reported in journal and conference articles; PubMed, for example, currently stores millions of abstracts and is growing at a rapid pace. Given such a large repository, one challenge for any biologist is to search for articles likely to contain the specific information he or she is looking for. A computational tool that can produce a short list of papers likely to contain the information of interest would be of great use to any scientist. In this paper we present generic computational techniques that can be used to build such tools. A typical tool that we envision takes as input a set of keywords (characterizing the information of interest) and develops a learner capable of classifying papers into two types: a Type 1 paper has the information of interest and a Type 2 paper does not. There are tools reported in the literature that are similar to what we study in this paper; an example is the TextMine algorithm of [11]. We show that our algorithms yield better results than TextMine. For each PubMed paper, TextMine computes the likelihood that the paper contains information on minimotifs and assigns it a score; papers scoring above a threshold are output for biologists to read manually. TextMine has proven to be a very valuable tool for enhancing the minimotif database of the MnM system [12], [13].

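    The envisioned tool (keywords in, Type 1/Type 2 out) could be prototyped with a simple keyword-correlation score. This is a generic stand-in, not the authors' algorithm or TextMine, and the threshold is an assumption.

    ```python
    import re
    from collections import Counter

    def score(abstract, keywords):
        # Relevance score: fraction of query keywords present in the abstract,
        # weighted by how frequently they occur.
        tokens = Counter(re.findall(r"[a-z]+", abstract.lower()))
        hits = [tokens[k] for k in keywords if tokens[k] > 0]
        coverage = len(hits) / len(keywords)
        return coverage * (1 + sum(hits) / max(sum(tokens.values()), 1))

    def classify(abstracts, keywords, threshold=0.5):
        # Type 1 = likely contains the information of interest.
        return ["Type 1" if score(a, keywords) >= threshold else "Type 2"
                for a in abstracts]

    papers = ["we survey minimotif discovery in protein sequences",
              "a study of galaxy rotation curves"]
    print(classify(papers, ["minimotif", "protein"]))   # ['Type 1', 'Type 2']
    ```
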
  • CSA-based design of feedforward scalable Montgomery modular multiplier

    Publication Year: 2011 , Page(s): 054 - 059
    Cited by:  Papers (1)

    A scalable Montgomery modular multiplier is composed of a queue of processing elements, and the total computation time is proportional to the latency between these elements. With the feedforward architecture proposed by Huang et al., the latency can be brought down from two clock cycles to one. This paper presents both radix-2 and radix-4 CSA-based designs of the new architecture; with Booth coding and an auxiliary coding, the radix-4 design is superior to the radix-2 design in terms of the Time×Area product.

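    For reference, the bit-serial (radix-2) Montgomery recurrence that such hardware implements, here in software form; the toy modulus is purely for illustration.

    ```python
    def montgomery_radix2(a, b, m, k):
        # Radix-2 Montgomery multiplication: returns a * b * 2^(-k) mod m,
        # for odd m < 2^k and a, b < m.
        s = 0
        for i in range(k):
            s += ((a >> i) & 1) * b   # conditionally add b for this bit of a
            if s & 1:                 # make s even so the halving is exact
                s += m
            s >>= 1                   # exact division by 2
        return s if s < m else s - m  # final conditional subtraction

    # Example: compute 7 * 11 mod 13 via the Montgomery domain.
    m, k = 13, 4
    R = 1 << k                                  # R = 2^k = 16
    a_bar, b_bar = (7 * R) % m, (11 * R) % m    # map into the Montgomery domain
    p_bar = montgomery_radix2(a_bar, b_bar, m, k)   # = 7*11*R mod m
    print(montgomery_radix2(p_bar, 1, m, k))        # map back: 77 mod 13 = 12
    ```
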
  • Statistical analysis of Parkinson disease gait classification using Artificial Neural Network

    Publication Year: 2011 , Page(s): 060 - 065
    Cited by:  Papers (11)

    The aim of this study is to investigate parameters that can identify abnormal gait patterns in Parkinson's disease subjects during normal walking. Three types of gait parameters, namely basic, kinematic and kinetic, are evaluated. Initial findings show that the mean cadence, step length and walking speed of Parkinson's disease patients are lower than those of normal subjects, while their mean stride time is higher. For the kinematic parameters, the overall mean joint angles of the hip, knee and ankle are lower for Parkinson's disease patients than for the normal group. For the kinetic parameters, all mean ground reaction force values are higher for normal subjects, with walking speed as the major determinant. Statistical analysis is conducted to find the significant features for distinguishing between PD and normal subjects. Based on the results and on the accuracy attained with an Artificial Neural Network classifier, step length, walking speed, knee angle and the vertical component of the ground reaction force are the four significant features for classifying subjects with Parkinson's disease.

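    A toy version of the final classification step, assuming the four significant features feed a small multilayer perceptron; the feature distributions below are fabricated stand-ins, not the study's gait data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    # Hypothetical feature matrix: [step length (m), walking speed (m/s),
    # knee angle (deg), vertical GRF (body weights)] -- the four features
    # the study found significant.
    rng = np.random.default_rng(0)
    normal = rng.normal([0.65, 1.3, 60.0, 1.1], [0.05, 0.1, 5.0, 0.05], (40, 4))
    pd_grp = rng.normal([0.45, 0.9, 52.0, 1.0], [0.05, 0.1, 5.0, 0.05], (40, 4))
    X = np.vstack([normal, pd_grp])
    y = np.array([0] * 40 + [1] * 40)   # 0 = control, 1 = Parkinson's

    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```
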
  • A technique of time domain sequential data embedding into real object image using spatially modulated illumination

    Publication Year: 2011 , Page(s): 066 - 070

    We propose a new technique that uses spatially modulated illumination to embed time-domain sequential data into moving picture data. It is based on "optical watermarking" technology, which can protect "analog" objects, such as pictures painted by artists, from being photographed illegally. Another important application of optical watermarking is the unobtrusive embedding of real-time data sequences into images of real objects, which may be moving pictures. We carried out experiments in which sequential watermarking image data at 10 fps (frames per second) were projected onto objects, and moving image data were captured with a video camera at a picture rate of 30 fps. The embedded sequential watermarking information could be detected accurately from the captured moving image data. In the experiments we embedded 256 bits of information into a 128 × 128 pixel watermarking area in each frame; that is, the embedding bit rate was 2.5 kbps. The experimental results indicate that a higher bit rate and improved detection accuracy may be achieved with a larger embedding area and higher-resolution frames.

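    A single-frame sketch of the modulate-and-correlate principle: embed a signed pseudo-random illumination pattern per bit, and recover each bit from the sign of the correlation. The pattern construction, strength, and noise level are assumptions, and the real system additionally spreads the data over time at 10 fps.

    ```python
    import numpy as np

    def pattern(seed, size=128):
        # One fixed pseudo-random +/-1 illumination pattern per bit position.
        return np.random.default_rng(seed).choice([-1.0, 1.0], size=(size, size))

    def embed(frame, bits, strength=4.0):
        # Spatially modulate the illumination: add (or subtract) each bit's pattern.
        marked = frame.astype(float)
        for i, b in enumerate(bits):
            marked += strength * (1 if b else -1) * pattern(i)
        return marked

    def detect(frame, n_bits):
        # The sign of the correlation with each pattern recovers the bit.
        f = frame - frame.mean()
        return [int((f * pattern(i)).sum() > 0) for i in range(n_bits)]

    rng = np.random.default_rng(0)
    scene = rng.uniform(0, 255, (128, 128))              # stand-in for the object image
    bits = [int(b) for b in rng.integers(0, 2, 16)]
    captured = embed(scene, bits) + rng.normal(0, 1.0, scene.shape)  # camera noise
    print("recovered correctly:", detect(captured, 16) == bits)
    ```
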
  • Dynamic transmission of 3D mesh in wireless walkthrough applications

    Publication Year: 2011 , Page(s): 071 - 079

    Transmitting highly detailed geometric models over a wireless medium faces many challenges, such as rendering performance, limited bandwidth, the limited resources of mobile devices, and storage capacity. In this paper, we propose a client-server communication architecture for the wireless medium that uses progressive meshes to build dynamic levels of detail for 3D objects. The dense triangle mesh is decomposed into a base mesh and a series of displacement values. This scheme facilitates mesh transmission, storage, and selective refinement.

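    A minimal sketch of the base-mesh-plus-displacements idea on a vertex array; real progressive meshes operate on edge collapses and vertex splits, so the crude every-other-vertex decimation here is purely illustrative.

    ```python
    import numpy as np

    def encode_progressive(vertices, levels=3):
        # Decompose a dense vertex array into a coarse base mesh plus a series
        # of displacement batches (coarse-to-fine residuals).
        batches, current = [], vertices
        for _ in range(levels):
            coarse = current[::2]                  # crude decimation
            predicted = np.repeat(coarse, 2, axis=0)[:len(current)]
            batches.append(current - predicted)    # displacements to transmit later
            current = coarse
        return current, batches[::-1]

    def decode_progressive(base, batches, upto):
        # Client-side refinement: apply only as many displacement batches as
        # bandwidth allows ("upto"), yielding a dynamic level of detail.
        mesh = base
        for disp in batches[:upto]:
            mesh = np.repeat(mesh, 2, axis=0)[:len(disp)] + disp
        return mesh

    verts = np.cumsum(np.random.default_rng(0).normal(size=(1000, 3)), axis=0)
    base, batches = encode_progressive(verts)
    coarse_view = decode_progressive(base, batches, upto=1)  # low LOD for weak links
    full_view = decode_progressive(base, batches, upto=3)    # all displacements applied
    print(np.allclose(full_view, verts))                     # lossless at full detail
    ```
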
  • Bitwise Hill crypto system

    Publication Year: 2011 , Page(s): 080 - 085

    This paper describes a modification of the conventional Hill Cipher system. Its purpose is to explore the suitability of the Hill Cipher for binary data: the plaintext is any binary data, such as images, audio or video. The plaintext is scrambled by dividing it into 8 bit planes, and each plane is encrypted with a different key. The study conducted on the Bitwise Hill Cipher finds that it offers enough security for commercial applications.

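    A compact sketch of a bit-plane Hill cipher over GF(2). For brevity one shared 8 × 8 key encrypts every plane (the paper uses a different key per plane), and any tail shorter than one block is left unencrypted.

    ```python
    import numpy as np

    def invertible_key_gf2(n, rng):
        # Draw random binary matrices until one is invertible over GF(2);
        # an odd integer determinant means invertible mod 2.
        while True:
            K = rng.integers(0, 2, (n, n))
            if round(np.linalg.det(K)) % 2 == 1:
                return K

    def inverse_gf2(K):
        # adj(K) = det(K) * K^{-1}; over GF(2) with odd det, K^{-1} = adj(K) mod 2.
        adj = np.round(np.linalg.det(K) * np.linalg.inv(K)).astype(int)
        return adj % 2

    def hill_planes(data_bytes, K, decrypt=False):
        # Split the bytes into 8 bit planes and Hill-encrypt each plane in
        # blocks of len(K) bits over GF(2).
        M = inverse_gf2(K) if decrypt else K
        bits = np.unpackbits(np.frombuffer(data_bytes, np.uint8).reshape(-1, 1), axis=1)
        out, n = bits.copy(), len(K)
        for plane in range(8):
            col = bits[:, plane]
            usable = len(col) // n * n
            blocks = col[:usable].reshape(-1, n)
            out[:usable, plane] = (blocks @ M % 2).ravel()
        return np.packbits(out, axis=1).ravel().tobytes()

    rng = np.random.default_rng(0)
    K = invertible_key_gf2(8, rng)
    msg = b"binary plaintext: image / audio / video bytes"
    assert hill_planes(hill_planes(msg, K), K, decrypt=True) == msg
    ```
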
  • Achieving the workload balance of the clusters

    Publication Year: 2011 , Page(s): 086 - 092

    Workload focuses not on time or level of effort but on the quality of attention devoted to individual cases. Load balancing, in turn, is a technique for distributing workload evenly across two or more computers, network links, CPUs, hard drives, or other resources in order to achieve optimal resource utilization, maximize throughput, minimize response time, and avoid overload. Using multiple components with load balancing, instead of a single component, can also increase reliability through redundancy. We focus on load-balancing policies for homogeneous clusters.

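    Of the classical policies for a homogeneous cluster, least-loaded dispatch is among the simplest to state; a sketch follows (job costs are assumed known up front, which real schedulers must estimate).

    ```python
    import heapq

    def dispatch(jobs, n_servers):
        # Least-loaded policy: send each incoming job to the server with the
        # smallest accumulated work so far.
        heap = [(0.0, i) for i in range(n_servers)]   # (load, server id)
        assignment = []
        for cost in jobs:
            load, srv = heapq.heappop(heap)
            assignment.append(srv)
            heapq.heappush(heap, (load + cost, srv))
        return assignment

    print(dispatch([5, 1, 1, 4, 2, 2], 3))   # balances total work across 3 servers
    ```
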
  • On the Automatic Prediction of PM10 with in-situ measurements, satellite AOT retrievals and ancillary data

    Publication Year: 2011 , Page(s): 093 - 098

    Daily monitoring of unhealthy particles suspended in the low troposphere is of major concern around the world, and ground-based measuring stations are a reliable but still inadequate means of achieving full spatial coverage. Advances in satellite sensors have provided new datasets which, though less precise than in-situ observations, can be combined with them to enhance the prediction of particulate matter. In this article we evaluate a methodology for automatic multi-variate estimation of PM10 dry mass concentrations, along with a comparison of three different cokriging estimators that integrate ground measurements of PM10, satellite MODIS-derived retrievals of aerosol optical thickness and further auxiliary data. Results highlight the need for further improvements and studies. The analysis employs the data available for 2007 over the Emilia Romagna region (Padana Plain, Northern Italy), where stagnant meteorological conditions make comprehensive air quality monitoring all the more urgent. Qualitative full-coverage PM10 maps of Emilia Romagna are then automatically produced on-line in a dynamic GIS environment for multi-temporal analysis of air quality.

  • EM algorithm of spherical models for binned data

    Publication Year: 2011 , Page(s): 099 - 105
    Cited by:  Papers (4)

    In cluster analysis, dealing with large quantities of data is computationally expensive, and binning the data can be an efficient way to address this problem. Basing cluster analysis on Gaussian mixture models has become a classical and powerful approach; the EM and CEM algorithms are commonly used in the mixture approach and the classification approach, respectively. Depending on the parametrization of the variance matrices (allowing some features of the clusters, namely orientation, shape and volume, to be the same or different), 14 parsimonious Gaussian models can be generated, and choosing the right parsimonious model is important for obtaining a good result. In existing work, the binned-EM algorithm has been developed for the most general and the diagonal models. In this paper, we apply the binned-EM algorithm to spherical models: two spherical models are studied and their performances on simulated data are compared. The influence of bin size on the binned-EM algorithm is analyzed, and a practical application to the Iris data is shown.

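    To fix ideas, here is a 1-D EM on binned data where each bin enters through its center and count. Note this approximates the within-bin integral by the density at the bin center, whereas true binned-EM integrates the Gaussian over each bin; the spherical constraint (one variance per component) shows up as a single per-component sigma.

    ```python
    import numpy as np
    from scipy.stats import norm   # 1-D for simplicity

    def binned_em_spherical(centers, counts, k, iters=50):
        n = counts.sum()
        mu = np.quantile(centers, np.linspace(0.2, 0.8, k))
        sigma = np.full(k, centers.std())
        pi = np.full(k, 1 / k)
        for _ in range(iters):
            # E-step: responsibilities of each component for each bin,
            # with each bin weighted by its count.
            dens = pi * norm.pdf(centers[:, None], mu, sigma)
            r = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
            w = r * counts[:, None]
            # M-step (spherical: one variance per component).
            Nk = w.sum(axis=0)
            pi = Nk / n
            mu = (w * centers[:, None]).sum(axis=0) / Nk
            sigma = np.sqrt((w * (centers[:, None] - mu) ** 2).sum(axis=0) / Nk)
        return pi, mu, sigma

    x = np.concatenate([np.random.default_rng(0).normal(0, 1, 5000),
                        np.random.default_rng(1).normal(5, 1, 5000)])
    counts, edges = np.histogram(x, bins=60)
    print(binned_em_spherical((edges[:-1] + edges[1:]) / 2, counts, k=2))
    ```
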
  • No-reference Image Semantic Quality Approach using Neural Network

    Publication Year: 2011 , Page(s): 106 - 113

    Image quality assessment traditionally needs the original image as a reference, but most of the time one is not available. No-Reference (NR) Image Quality Assessment (IQA) therefore seeks to assign quality scores that are consistent with human perception without an explicit comparison against a reference image. Unfortunately, the field of NR IQA remains largely unexplored. This paper presents a new NR Image Semantic Quality Approach (NR-ISQA) that employs adaptive Neural Networks (NN) to assess the semantic quality of image color. The NN measures the quality of an image by predicting the mean opinion score (MOS) of human observers, using a set of proposed key features designed specifically to describe color. This challenging task aims at emulating human judgment and replacing very complex and time-consuming subjective quality assessment. Two variants of the approach are proposed: direct and progressive estimation of the overall image quality. The results show the performance of the proposed approach compared with human performance.

  • Criteria for performance improvement in metropolitan area WDM ring networks

    Publication Year: 2011 , Page(s): 114 - 119

    In this paper, we present three efficient, combined algorithms suitable for WDM ring metropolitan area networks. The first algorithm determines the allocation of input traffic to a multiple-buffer architecture at each node. The second defines an effective access scheme that avoids both data-wavelength and receiver collisions. The third introduces an effective buffer-selection-for-transmission technique that combines the priority criteria of receiver collision avoidance and packet age. In this way, we apply a slotted WDMA protocol to improve the limited bandwidth utilization that many WDMA protocols for MANs exhibit, especially at high loads. The behaviour of the proposed protocol is investigated through exhaustive simulations assuming Poisson traffic sources. We show that it reduces dropping probability and delay while improving throughput. Our study concludes by determining the number of buffers per node required to maximize throughput.

  • Time-frequency signal and image processing of non-stationary signals with application to the classification of newborn EEG abnormalities

    Publication Year: 2011 , Page(s): 120 - 129
    Cited by:  Papers (5)

    This paper presents an introduction to time-frequency (T-F) methods in signal processing, and a novel approach to EEG abnormality detection and classification based on a combination of signal-related and image-related features. These features, which characterize the non-stationary and multi-component nature of EEG signals, are extracted from the T-F representation of the signals. The signal-related features are derived from the T-F representation of the EEG signals and include instantaneous frequency, singular value decomposition and energy-based features. The image-related features are extracted from the T-F representation treated as an image, using T-F image processing techniques. Combining signal and image features makes it possible to extract more information from a signal. Results obtained on newborn and adult EEG data show that the image-related features improve the performance of EEG seizure detection in classification systems based on a multi-SVM classifier.

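    The combined-feature idea can be illustrated with SciPy's spectrogram: signal-side features (instantaneous-frequency statistics, energy, singular values of the T-F matrix) alongside one image-side statistic computed on the T-F plane treated as an image. The specific features and the chirp test signal are illustrative stand-ins for the paper's feature set.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def tf_features(x, fs):
        # Time-frequency representation of the signal.
        f, t, S = spectrogram(x, fs=fs, nperseg=256, noverlap=192)

        # Signal-related features: dominant instantaneous frequency per time
        # slice, total energy, and leading singular values of the T-F matrix.
        inst_freq = f[S.argmax(axis=0)]
        energy = S.sum()
        sv = np.linalg.svd(S, compute_uv=False)[:5]

        # Image-related feature: treat S as an image and measure how
        # concentrated its energy is (a simple texture-style statistic).
        img = S / S.max()
        concentration = (img ** 2).sum() / (img.sum() ** 2 + 1e-12)

        return np.concatenate([[inst_freq.mean(), inst_freq.std(), energy,
                                concentration], sv])

    fs = 256
    t = np.arange(0, 8, 1 / fs)
    eeg_like = np.sin(2 * np.pi * (2 + t) * t)   # chirp as a non-stationary stand-in
    print(tf_features(eeg_like, fs).round(3))
    ```
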