
2012 International Conference on Information Technology and e-Services (ICITeS)

Date: 24-26 March 2012


Displaying Results 1 - 25 of 102
  • [Title page]

    Publication Year: 2012 , Page(s): 1
  • [Copyright notice]

    Publication Year: 2012 , Page(s): 1
  • Performance evaluation of encryption algorithm for wireless sensor networks

    Publication Year: 2012 , Page(s): 1 - 8
    Cited by:  Papers (1)

    With the widespread growth in applications for resource-limited Wireless Sensor Networks (WSNs), the need for reliable and efficient security mechanisms has increased manifold, but their implementation is a non-trivial task. Limitations in processing speed, battery power, bandwidth and memory constrain the applicability of existing cryptographic algorithms to WSNs. Several security mechanisms, such as TinySec, have been introduced to address the need for security in WSNs; the cost of security, however, still mostly remains an unknown variable. To provide a better understanding of this cost we have studied three encryption algorithms: AES, RC5 and RC6. We have measured and compared their memory and energy consumption on Mica2 sensor motes. The results of our experiments provide insight into the suitability of different security algorithms for WSN environments and could be used by WSN designers to construct the security architecture of their systems in a way that both satisfies the requirements of the application and makes reasonable use of the constrained sensor resources.

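A rough, hedged illustration of the timing side of such a comparison (run on a desktop, not on the paper's Mica2/TinyOS setup): the Python sketch below benchmarks AES encryption of a sensor-sized payload with pycryptodome. RC5 and RC6 are omitted because that library does not ship them, and the key size, cipher mode and payload length are assumptions, not the authors' configuration.

```python
# Illustrative only: per-packet AES-CTR encryption time on a PC.
# Assumes pycryptodome is installed; this does NOT reproduce the paper's
# Mica2 energy/memory measurements, and RC5/RC6 are not covered.
import time
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)        # 128-bit key (assumed)
payload = get_random_bytes(32)    # 32-byte packet, a typical WSN payload size (assumed)

def mean_encrypt_time(n_iter=10000):
    cipher = AES.new(key, AES.MODE_CTR, nonce=get_random_bytes(8))
    start = time.perf_counter()
    for _ in range(n_iter):
        cipher.encrypt(payload)
    return (time.perf_counter() - start) / n_iter

print(f"mean AES-CTR time per 32-byte packet: {mean_encrypt_time() * 1e6:.2f} us")
```
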
  • Information retrieval model based on neural networks using neighborhood

    Publication Year: 2012 , Page(s): 1 - 5

    Information plays an essential role in today's data-handling systems. The amount of information handled by organizations worldwide is so large that managing it without computers is no longer imaginable. Information Retrieval (IR) is a research area in computer science whose goal is to facilitate access to a set of documents in electronic form (a corpus) and to allow a user to find the documents that are relevant to him, that is, those whose content best matches his information needs. In this paper we propose a system called “Neuro-RI” that puts into practice our model of information retrieval based on neural networks with neighborhood.

  • Syntactico-semantic algorithm for automatic ontology merging

    Publication Year: 2012 , Page(s): 1 - 5

    Ontologies are becoming efficient models for information representation and storage. They facilitate processing and knowledge management through AI techniques by offering the potential to integrate a large quantity of information via what we call “Ontology Merging”. Beforehand, each data source or database may be the object of an ontology construction. Our contribution is to design a syntactico-semantic algorithm for automatic ontology merging. It combines syntactic and semantic measures to identify similar concepts, which are then merged into a single concept in the resulting merged ontology; hence, the synonym and homonym problems of purely syntactic measures can be solved. The syntactic part of the algorithm is based on calculating the distance between the two compared concepts, while the semantic part is based on the extensional models of the source ontologies drawn from WordNet, which constitute the basis of the semantic similarity measure between the synsets of the two concepts in question. After combining the two results, similar concepts are merged into a single one.

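A minimal sketch of the combined syntactic/semantic similarity idea described above, assuming NLTK's WordNet as the lexical resource; the equal weighting is an illustrative assumption, not the authors' parameterisation.

```python
# Hedged sketch: combine a string-level similarity with a WordNet-based one to decide
# whether two ontology concepts should be merged. Requires nltk and its 'wordnet' corpus.
from difflib import SequenceMatcher
from nltk.corpus import wordnet as wn

def syntactic_sim(a: str, b: str) -> float:
    """String similarity in [0, 1]; a stand-in for the paper's syntactic distance."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_sim(a: str, b: str) -> float:
    """Best WordNet path similarity over all synset pairs of the two concept labels."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(a) for s2 in wn.synsets(b)]
    return max(scores, default=0.0)

def combined_sim(a: str, b: str, w_syn: float = 0.5, w_sem: float = 0.5) -> float:
    return w_syn * syntactic_sim(a, b) + w_sem * semantic_sim(a, b)

# Synonyms such as "car"/"automobile" score high on the semantic part even though
# the strings differ, which is the problem the combined measure is meant to address.
print(round(combined_sim("car", "automobile"), 2))
```
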
  • Digital signature forming and keys protection based on person's characteristics

    Publication Year: 2012 , Page(s): 1 - 6

    In today's commercial environment, establishing a framework for authentication comes with several challenges: the need for secure document exchange, secure bank transactions, and other e-commerce requirements. These challenges concern confidentiality, and the digital signature (DS) is the only means of achieving it. This paper presents a method for signing and verifying a document digitally online. A document is first signed using the 160-bit Secure Hash Algorithm (SHA-1) and then protected by the sender's keys. The receiver verifies the signature using keys stored in a smart card (SMC) that are derived from his fingerprint. The paper investigates DS techniques based on the use of SMCs and shows how the true identity of a user can be verified when the keys used are derived from human characteristics. The results are expressed in terms of speed and security enhancement, which are in high demand in the e-commerce community.

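A minimal sign/verify round trip with SHA-1 and RSA using the `cryptography` package, as a hedged illustration of the signing step named in the abstract. The fingerprint-derived keys and smart-card storage central to the paper are not modelled, and SHA-1 appears only because the abstract names it (it is no longer recommended for new designs).

```python
# Hedged sketch: hash-and-sign with SHA-1 and RSA, then verify with the public key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"contract text to be signed"

# Sender: sign the document (the library hashes it with SHA-1 internally).
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA1())

# Receiver: verify with the sender's public key; raises InvalidSignature on tampering.
private_key.public_key().verify(signature, document, padding.PKCS1v15(), hashes.SHA1())
print("signature verified")
```
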
  • QoS-aware service selection based on swarm particle optimization

    Publication Year: 2012 , Page(s): 1 - 6
    Cited by:  Papers (2)

    The growing number of web services over the Internet urges us to conceive an efficient selection approach, especially for composite requests. In general, we can find a set of services that provide the same functionality (inputs/outputs) but differ in their QoS criteria; in this situation we must select the best ones by applying an optimization algorithm. In this paper, we propose a reactive multi-agent solution based on swarm particle optimization. The proposed system adopts a set of particle groups that explore the search space in order to maximize a single objective function. The obtained results show a high rate of optimality and merit further investigation.

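A bare-bones particle swarm loop maximising a single objective, sketching the optimisation machinery the abstract relies on. The toy objective, swarm size and coefficients are assumptions; the paper's encoding of composite service selection and its QoS aggregation are not reproduced.

```python
# Hedged sketch: canonical PSO maximising a toy stand-in for an aggregated QoS utility.
import random

def utility(x):
    return -sum((xi - 0.7) ** 2 for xi in x)   # placeholder objective, peak at 0.7

DIM, N_PARTICLES, ITERS = 4, 20, 100
W, C1, C2 = 0.7, 1.5, 1.5                      # inertia and acceleration coefficients

pos = [[random.random() for _ in range(DIM)] for _ in range(N_PARTICLES)]
vel = [[0.0] * DIM for _ in range(N_PARTICLES)]
pbest = [p[:] for p in pos]                    # per-particle best positions
gbest = max(pbest, key=utility)                # global best position

for _ in range(ITERS):
    for i in range(N_PARTICLES):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if utility(pos[i]) > utility(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = max(pbest, key=utility)

print("best position:", [round(x, 3) for x in gbest])
```
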
  • Formal verification of mobile agent based workflows

    Publication Year: 2012 , Page(s): 1 - 6

    Workflows are naturally built upon complex distributed information systems and resources. Modern workflow systems require a great deal of flexibility, i.e. the capability to react to changes during execution, and a high degree of adaptivity. Agent technology seems to be a good candidate for meeting these requirements; one solution is to have the workflow enactment performed by a mobile agent. These systems are so complex that the use of formal tools for verification, simulation and prototyping, to facilitate their design and validation, is essential and of great interest. The aim of this paper is twofold: first we discuss the importance of applying agent technology to workflows, and then we propose a new formal method based on rewriting logic to verify mobile-agent-based workflows and to enable their rapid prototyping.

  • A Distributed Health Navigation System based on opportunistic mobile WSN

    Publication Year: 2012 , Page(s): 1 - 6

    Accurate pollution monitoring in urban environments requires an extremely large number of measurement stations. The very complex 3D structure of urban areas and their fluid-dynamic behavior make a very dense sampling grid necessary to evaluate quantitative pollution indexes that can be correlated with real observed health effects, or to obtain accurate pollution trend analyses. In order to reduce the high construction cost and the management complexity of these high-density monitoring grids, some papers have proposed mobile monitoring stations that piggyback on public buses, obtaining a denser sampling grid with fewer stations. In this paper we propose a Health Navigation System application for smartphones, based on a network of low-cost, high-precision, miniaturized wireless mobile monitoring systems that can easily be mounted on bike frames. The mobile network makes an accurate urban pollution map available to our Navigation System, which bikers can use to determine the healthiest route.

  • Similarity measure for semi-structured information retrieval based on the path and neighborhood

    Publication Year: 2012 , Page(s): 1 - 5

    With the appearance of semi-structured documents, such as XML documents, information retrieval has become more challenging due to the introduction of structural information, which is known for its complex presentation. An information retrieval system must organize and store information and then return the documents that correspond to the user's information needs. Such systems are based on information retrieval models and use similarity measures that take into account both structural and textual information. This paper presents a new similarity measure, inspired by that of the CASIT model (in French: CAlcul de SImilarité Textuelle) and adapted to semi-structured documents, specifically XML documents. The measure is used to compute a degree of resemblance between a query XML document and each document of an XML database, by generating an interference wave representing the existence and importance of the vocabulary of the query document in each database document. Two important notions are used: the neighborhood, which allows the weighting of terms, and the path of tags followed to reach lexical units. This similarity measure has been exploited by a semi-structured information retrieval system which we have implemented. We used an experimental XML database and took execution time as an evaluation criterion; the running time is linear, which makes the use of a huge database possible. The system was then evaluated in terms of quality and answer relevance using recall/precision measures.

  • Formal analysis of PKM using the Scyther tool

    Publication Year: 2012 , Page(s): 1 - 6

    Owing to the natural characteristics of wireless communication, anyone can intercept or inject frames, making wireless communication much more vulnerable to attack than its wired equivalent. In this paper we focus on the PKM protocol, which provides the authorization process and the secure distribution of keying data from the base station to the mobile station. Concentrating on PKMv2, we give a formal analysis of this version and find that it is vulnerable to replay, DoS and man-in-the-middle attacks. We propose a new methodology to protect the authorization protocol from such attacks by using a nonce and a timestamp together.

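A toy illustration of the nonce-plus-timestamp countermeasure the abstract proposes, using an HMAC over the message fields. The message layout, shared key and 30-second freshness window are assumptions for demonstration; this is not the PKMv2 protocol itself.

```python
# Hedged sketch: reject replayed or stale authorisation requests using a nonce,
# a timestamp and an HMAC over both (shared key assumed pre-established).
import hashlib, hmac, os, time

KEY = os.urandom(32)
seen_nonces = set()
FRESHNESS_WINDOW = 30  # seconds (assumed)

def make_request(payload: bytes):
    nonce, ts = os.urandom(16), int(time.time())
    mac = hmac.new(KEY, nonce + ts.to_bytes(8, "big") + payload, hashlib.sha256).digest()
    return nonce, ts, payload, mac

def accept(nonce, ts, payload, mac) -> bool:
    expected = hmac.new(KEY, nonce + ts.to_bytes(8, "big") + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(mac, expected):
        return False                      # forged or tampered message
    if abs(time.time() - ts) > FRESHNESS_WINDOW:
        return False                      # stale timestamp
    if nonce in seen_nonces:
        return False                      # replayed nonce
    seen_nonces.add(nonce)
    return True

msg = make_request(b"authorization request")
print(accept(*msg))   # True on first delivery
print(accept(*msg))   # False when the same message is replayed
```
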
  • Voting multiple classifiers decisions for spam detection

    Publication Year: 2012 , Page(s): 1 - 6
    Cited by:  Papers (1)

    A considerable amount of research and technology development has emerged to address the problem of spam detection. Starting from a Boolean cellular approach and a naïve Bayes technique built as individual classifiers, we evaluate a novel method that combines these two classifiers to determine whether spam can be detected more accurately. Experimental results show that the proposed combination increases classification performance as measured on the LingSpam dataset.

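A hedged sketch of soft-voting two simple text classifiers. A Bernoulli naive Bayes over Boolean presence/absence features stands in for the paper's Boolean cellular classifier, which is not a standard library component, and the toy corpus and equal-weight average are illustrative assumptions.

```python
# Hedged sketch: average the spam probabilities of two scikit-learn text classifiers.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import BernoulliNB, MultinomialNB
from sklearn.pipeline import make_pipeline

emails = ["win a free prize now", "meeting agenda attached",
          "cheap pills free offer", "project report draft"]
labels = [1, 0, 1, 0]                              # 1 = spam, 0 = ham (toy data)

nb_counts = make_pipeline(CountVectorizer(), MultinomialNB()).fit(emails, labels)
nb_bool = make_pipeline(CountVectorizer(binary=True), BernoulliNB()).fit(emails, labels)

test = ["free prize offer", "agenda for the project meeting"]
avg_proba = (nb_counts.predict_proba(test) + nb_bool.predict_proba(test)) / 2.0
print((avg_proba[:, 1] > 0.5).astype(int))         # combined spam decision per message
```
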
  • A simple approach for reducing timed automata

    Publication Year: 2012 , Page(s): 1 - 6

    Today model checking is the most widely used verification method for real-time systems, so there is a serious need to improve its efficiency with respect to both time and resources. In this paper we present a new approach for reducing timed automata: regions of a region automaton are aggregated according to a coarse equivalence-class partitioning based on traces. We show that the proposed algorithm terminates and preserves the behavior of the original timed automaton.

  • The comparison of Adaptive Neuro-Fuzzy Inference System (ANFIS) with nonlinear regression for estimation and prediction

    Publication Year: 2012 , Page(s): 1 - 7

    The main purpose of most research, especially in economics, is to obtain a good estimate as well as a prediction of the future. The ultimate objective is to explore the future, on the basis of which economic plans are adopted and strategic policy is developed; the success or failure of these plans and strategies depends on the credibility of the prediction. Although the Adaptive Neuro-Fuzzy Inference System (ANFIS) is simple and flexible and can reach a very good estimate in most cases, its potential for producing good predictions is questionable. In this paper, ANFIS as an intelligent method and nonlinear regression (NL) as a classical method are compared in order to assess their ability for estimation and prediction. For this purpose, we chose ten nonlinear data sets that differ in length and shape. Using the Anderson-Darling test, we conclude that six of them follow a normal distribution while the remaining ones do not. The analysis shows clearly that NL is best for prediction, while ANFIS is better for estimation.

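A hedged sketch of the comparison protocol: fit a nonlinear regression on a training window (estimation) and score it on a held-out window (prediction). The ANFIS side is omitted because no standard library implementation is assumed here; the exponential model and synthetic series are illustrative, not the paper's ten data sets.

```python
# Hedged sketch: nonlinear least-squares fit with separate estimation and prediction errors.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 80)
y = 2.5 * np.exp(0.18 * x) + rng.normal(0, 0.5, x.size)   # synthetic nonlinear series

def model(x, a, b):
    return a * np.exp(b * x)

x_tr, y_tr, x_te, y_te = x[:60], y[:60], x[60:], y[60:]
params, _ = curve_fit(model, x_tr, y_tr, p0=(1.0, 0.1))

mse_fit = np.mean((model(x_tr, *params) - y_tr) ** 2)     # estimation quality
mse_pred = np.mean((model(x_te, *params) - y_te) ** 2)    # prediction quality
print(f"estimation MSE: {mse_fit:.3f}, prediction MSE: {mse_pred:.3f}")
```
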
  • Reasoning about design patterns with an Aspect-Oriented approach

    Publication Year: 2012 , Page(s): 1 - 7

    The formal specification of design patterns clarifies the concepts underlying patterns, eliminates ambiguity, and complements informal text-based descriptions. Building on previous work on the formal specification of design patterns, this paper focuses on enriching those specifications using the Aspect-Oriented approach. The paper presents some properties of patterns in order to show how predicate logic can be used to reason within the Aspect-Oriented approach.

  • Dynamic horizontal fragmentation, replication and allocation model in DDBSs

    Publication Year: 2012 , Page(s): 1 - 7
    Cited by:  Papers (2)

    Distributed processing is an effective way to improve the performance of database systems. Fragmentation and proper allocation of fragments across the various sites of a network are therefore considered a key research area in distributed database environments. However, allocating fragments to the most appropriate sites is not an easy task. In this paper, a synchronized horizontal fragmentation, replication and allocation model for relational databases is proposed. A heuristic technique performs horizontal fragmentation and allocation using a cost model that minimizes the total cost of distribution. Experimental results are consistent with the hypothesis, confirming that the proposed model can efficiently solve the dynamic fragmentation and allocation problem in distributed relational database systems.

  • Locating candidate web service in legacy software: A search based approach

    Publication Year: 2012 , Page(s): 1 - 6

    Locating candidate web services in legacy software is the most challenging task in the process of migrating (i.e. reengineering) legacy software towards service-oriented architectures and web service technologies. In this paper, and for the first time, we formulate the problem of locating services as a search problem and justify the adoption of functional cohesion measures and genetic algorithms (GA) to find an approximate solution: a set of modules (i.e. procedures or functions) contributing to the computation of the searched-for service. The approach was tested on two medium-sized legacy software systems and shown to outperform feature-location approaches based on information retrieval tools when used to locate services in legacy software.

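A toy genetic-algorithm search for a cohesive subset of modules, illustrating the search formulation described above. The random pairwise "cohesion" matrix, the fitness definition and the GA parameters are all assumptions; the paper computes functional cohesion measures on real legacy code.

```python
# Hedged sketch: GA over bit-strings selecting a module subset with high average cohesion.
import random

N_MODULES, POP, GENS = 12, 40, 60
random.seed(1)
cohesion = [[0.0] * N_MODULES for _ in range(N_MODULES)]   # placeholder pairwise scores
for i in range(N_MODULES):
    for j in range(i + 1, N_MODULES):
        cohesion[i][j] = cohesion[j][i] = random.random()

def fitness(bits):
    chosen = [i for i, b in enumerate(bits) if b]
    if len(chosen) < 2:
        return 0.0
    pairs = [(i, j) for i in chosen for j in chosen if i < j]
    return sum(cohesion[i][j] for i, j in pairs) / len(pairs)   # mean internal cohesion

def crossover(a, b):
    cut = random.randrange(1, N_MODULES)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_MODULES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print("candidate service modules:", [i for i, b in enumerate(best) if b])
```
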
  • Process network modeling and system level performance evaluation for H.264/AVC encoder design

    Publication Year: 2012 , Page(s): 1 - 6

    Given the substantially increasing complexity of embedded systems, the use of relatively detailed, clock-cycle-accurate simulators for design-space exploration is impractical in the early design stages. Raising the abstraction level is nowadays widely seen as a solution to bridge the gap between increasing system complexity and low design productivity. To this end, several system-level design tools and methodologies have been introduced to efficiently explore the design space of heterogeneous signal-processing systems. In this paper, we demonstrate the effectiveness of the SystemC methodology for efficient system modeling and rapid performance evaluation, at a high abstraction level, of an embedded media system of increasing complexity. For this purpose, we selected a system-level design of a very complex media application: an H.264/AVC (Advanced Video Coding) video encoder.

  • Modelisation of bio inspired systems using MDA

    Publication Year: 2012 , Page(s): 1 - 6

    Because of the abstractions presented by biologically inspired systems, system architects are required to include these abstractions in their architecture in order to communicate the design to system implementers. This paper describes a new formalism, based on biology and Model Driven Architecture (MDA), intended to offer a new and easy way to design and understand (reverse engineer) complex bio-inspired systems. The paper then describes a set of bio-inspired views that are used when describing such systems. Finally, we use the proposed approach to model a XOR artificial neural network.

  • BEMD for three-dimensional medical images decomposition

    Publication Year: 2012 , Page(s): 1 - 5

    This paper presents an approach to decomposing three-dimensional medical images with the Bidimensional Empirical Mode Decomposition (BEMD) technique. This decomposition, obtained through a procedure known as the sifting process, extracts structures at different scales and spatial frequencies with modulation in amplitude and frequency.

  • Evaluation of real-time object-oriented data models for real-time databases

    Publication Year: 2012 , Page(s): 1 - 6

    The design of real-time databases needs to incorporate methods specifically developed to represent the timing constraints of the system under consideration. Real-time databases have timing constraints on data and on the transactions over that data. During the last few years, several research approaches have been directed towards using the relational model as a data model for real-time databases; however, none of these approaches adds a temporal dimension to the data model. As the complexity of real-time applications grows, the real-time database community has migrated towards object-oriented technology, both because its rich data semantics can be used in timing transaction processing and because complex real-time applications may benefit from its capabilities for modeling, storing and manipulating complex objects. In this paper we explore real-time object-oriented data models for real-time databases, giving an overview of existing models and presenting a set of features suitable for evaluating the quality of each model.

  • e-Services using any phone & user's voice: Bridging Digital Divide & help global development

    Publication Year: 2012 , Page(s): 1 - 6

    With the rapid growth of e-Services (e-Learning, e-Health, e-Gov, e-Farming and the like), it is important that everyone can access such valuable services and obtain the associated benefits. While these services are accessible to, and widely used by, most educated people with access to the Internet via computers or high-end phones, they are not accessible to many people on the other side of the Digital Divide, namely illiterate people, people with no access to a computer or a high-end phone, elderly people and people with disabilities. In this paper, we propose Voice Internet based e-Services that use a state-of-the-art “rendering” technology to convert existing e-Service application content into short, precise, easily navigable, meaningful and pleasant-to-listen-to content in real time. This “rendering” technology also supports voice-based, easy, natural interaction with the e-Service applications by allowing users to fill in various online forms.

  • Semantic similarity measure based on multiple resources

    Publication Year: 2012 , Page(s): 1 - 6
    Cited by:  Papers (1)

    The ability to accurately judge the semantic similarity between words is critical to the performance of several applications such as Information Retrieval and Natural Language Processing. In this paper we therefore propose a semantic similarity measure that uses, on the one hand, an online English dictionary provided by the Semantic Atlas project of the French National Centre for Scientific Research (CNRS) and, on the other hand, a page-count-based metric returned by a social website whose content is generated by its users.

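An illustrative page-count component in the spirit of the abstract: a normalised co-occurrence distance (in the form popularised as the normalised Google distance) computed from hit counts. The counts below are hard-coded placeholders rather than live queries, and the dictionary-based component from the Semantic Atlas is not shown.

```python
# Hedged sketch: similarity from hit counts for x, y and the pair "x AND y".
from math import log

N = 1e10  # assumed total number of indexed pages

def ngd(fx: float, fy: float, fxy: float, n: float = N) -> float:
    """Normalised co-occurrence distance computed from hit counts."""
    return (max(log(fx), log(fy)) - log(fxy)) / (log(n) - min(log(fx), log(fy)))

# Placeholder counts for ("car", "automobile"): individual hits and joint hits.
distance = ngd(3.2e8, 1.1e8, 6.0e7)
similarity = max(0.0, 1.0 - distance)   # simple conversion to a similarity in [0, 1]
print(round(similarity, 3))
```
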
  • Using vector quantization in Automatic Speaker Verification

    Publication Year: 2012 , Page(s): 1 - 5

    This article investigates several techniques based on vector quantization (VQ) and maximum a posteriori (MAP) adaptation for Automatic Speaker Verification (ASV). We propose to create multiple codebooks of the Universal Background Model (UBM) by vector quantization and compare them with the traditional VQ approach, MAP adaptation and Gaussian Mixture Models.

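A minimal vector-quantisation sketch for speaker verification: build a per-speaker codebook with k-means and score a test utterance by its average quantisation distortion. Random vectors stand in for real MFCC features, and the codebook size and scoring rule are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: VQ codebook via k-means; lower mean distortion = better speaker match.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
enrol_features = rng.normal(0.0, 1.0, size=(500, 13))   # stand-in for enrolment MFCCs
codebook = KMeans(n_clusters=32, n_init=5, random_state=0).fit(enrol_features)

def distortion(features: np.ndarray) -> float:
    """Mean distance from each frame to its nearest codeword."""
    nearest = codebook.cluster_centers_[codebook.predict(features)]
    return float(np.linalg.norm(features - nearest, axis=1).mean())

genuine = rng.normal(0.0, 1.0, size=(200, 13))    # same distribution as enrolment
impostor = rng.normal(2.0, 1.0, size=(200, 13))   # shifted distribution
print("genuine distortion :", round(distortion(genuine), 3))
print("impostor distortion:", round(distortion(impostor), 3))
```
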
  • Information challenges for E-Mental Health: Case NFB database user requirements

    Publication Year: 2012 , Page(s): 1 - 5

    Interest in E-Mental Health keeps growing, and the potential benefits to individual clients and to psychiatry at large will be far greater still, provided we solve challenges such as privacy. While practicing neurofeedback (NFB) therapy, the authors have faced information-sharing challenges and have seen the great potential of systematic and standardized data recording and anonymous sharing for clinic-internal, client-centric, research and better-practice purposes. Advances in image processing, cryptology and secure data handling need to be introduced into the E-Mental Health domain to ensure client privacy in information collection and transfer. A dialogue with future-network roadmapping on e-Health requirements will yield mutually substantial results. A common voice is needed to enable the best use of medical research data, to achieve the potential of psychiatry at large and to serve our clients best. This demands an active position in the policy discussion shaping harmonious legislation.
