
Proceedings of the Sixth International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks (SNPD/SAWN 2005)

Date: 23-25 May 2005


Displaying Results 1 - 25 of 80
  • Sixth International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks — SNPD/SAWN 2005 - Cover

    Publication Year: 2005 , Page(s): c1
    PDF (153 KB) | Freely Available from IEEE
  • Proceedings. 6th International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks

    Publication Year: 2005
    PDF (45 KB) | Freely Available from IEEE
  • Sixth International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks — SNPD/SAWN 2005 - Copyright Page

    Publication Year: 2005 , Page(s): iv
    PDF (35 KB) | Freely Available from IEEE
  • Sixth International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks — SNPD/SAWN 2005 - Table of contents

    Publication Year: 2005 , Page(s): v - x
    PDF (47 KB) | Freely Available from IEEE
  • Message from the Conference Chairs

    Publication Year: 2005 , Page(s): xi
    PDF (18 KB) | HTML | Freely Available from IEEE
  • Message from the Program Chairs

    Publication Year: 2005 , Page(s): xii
    PDF (19 KB) | HTML | Freely Available from IEEE
  • Message from the Workshop Chairs

    Publication Year: 2005 , Page(s): xiii
    PDF (21 KB) | HTML | Freely Available from IEEE
  • Automatic Target Classification - Experiments on the MSTAR SAR Images

    Publication Year: 2005 , Page(s): 2 - 7
    Cited by:  Papers (3)
    PDF (158 KB) | HTML

    SAR (Synthetic Aperture Radar) can produce target images in range and cross-range with sufficient resolution for recognition. In this paper, we experimentally evaluate three feature extraction techniques (Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Hu moments) using target SAR images taken from the MSTAR database, and we analyze the performance of these techniques. A number of classification techniques, such as linear (LDC) and quadratic (QDC) discriminant classifiers, K-nearest neighbor (K-NN), and support vector machines (SVM), are tested and compared on the target classification task. Our experimental results provide a guideline for selecting feature extraction techniques and classifiers in automatic target recognition using SAR image data.

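The pipeline this abstract describes maps naturally onto standard tooling. Below is a minimal sketch (not the authors' code) using scikit-learn, with random arrays standing in for MSTAR chips since the real imagery must be obtained separately; Hu moments are omitted for brevity.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Placeholder for MSTAR image chips flattened to vectors (real data requires the MSTAR release).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64 * 64))   # 300 chips, 64x64 pixels each
y = rng.integers(0, 3, size=300)      # 3 target classes

for feat_name, extractor in [("PCA", PCA(n_components=20)), ("ICA", FastICA(n_components=20))]:
    feats = extractor.fit_transform(X)
    for clf_name, clf in [("LDC", LinearDiscriminantAnalysis()),
                          ("QDC", QuadraticDiscriminantAnalysis()),
                          ("1-NN", KNeighborsClassifier(1)),
                          ("SVM", SVC())]:
        score = cross_val_score(clf, feats, y, cv=5).mean()
        print(f"{feat_name} + {clf_name}: {score:.2f}")
```
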
  • Soft computing algorithms applied to the segmentation of nerve cell images

    Publication Year: 2005 , Page(s): 8 - 13
    PDF (344 KB) | HTML

    Microscopic images of stained nerve cells are routinely analyzed during neuropathological research. Manual analysis relies heavily on operator knowledge and can therefore be highly subjective; it is also time consuming. This paper investigates the use of fuzzy C-means clustering to automate the analysis of nerve cell images: nerve cells are detected in an image and then classified into degrees of health based on their physical characteristics. A fuzzy approach is taken to account for vagueness in the data, an ambiguity that stems from the nature of both digital images and biological systems.

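For readers unfamiliar with the technique, here is a minimal hand-rolled fuzzy C-means on pixel intensities; this sketches the clustering step only, not the paper's preprocessing or health classification.

```python
import numpy as np

def fuzzy_cmeans(data, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy C-means on feature vectors (e.g., pixel intensities)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(data), c))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ data) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(data[:, None, :] - centers[None], axis=2) + 1e-12
        u = 1.0 / d ** (2 / (m - 1))               # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Toy "image": bright cell bodies on a dark background.
pixels = np.concatenate([np.random.normal(0.2, 0.05, 500),
                         np.random.normal(0.8, 0.05, 100)])[:, None]
centers, u = fuzzy_cmeans(pixels, c=2)
labels = u.argmax(axis=1)   # hard assignment yields a segmentation mask
```
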
  • Pattern recognition based on time-frequency distributions of radar micro-Doppler dynamics

    Publication Year: 2005 , Page(s): 14 - 18
    Cited by:  Papers (6)
    PDF (152 KB) | HTML

    Radar micro-Doppler signatures have great potential for identifying properties of unknown targets. An effective tool for extracting information from these signatures is time-frequency analysis, on which target identification and object recognition can be built. In this paper, we propose a method for feature extraction and selection from simulated time-frequency distributions of micro-Doppler dynamics. Experimental results show that a highly discriminative feature set can be established using this method; with this feature set, high classification performance is achieved in both training and testing stages for different classifiers.

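As an illustration of the kind of time-frequency processing involved (not the authors' method), the sketch below builds a spectrogram of a simulated micro-Doppler-like return with SciPy and reduces it to a small feature vector; all signal parameters are invented for the demo.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
# Simulated return: a carrier plus sinusoidal frequency modulation from a
# rotating part (illustrative parameters, not from the paper).
sig = np.cos(2 * np.pi * 100 * t + 5 * np.sin(2 * np.pi * 4 * t))

f, frames, Sxx = spectrogram(sig, fs=fs, nperseg=128, noverlap=96)

# Simple features from the time-frequency distribution: per-frame spectral
# centroid and bandwidth, summarized by mean and spread.
centroid = (f[:, None] * Sxx).sum(0) / Sxx.sum(0)
bandwidth = np.sqrt(((f[:, None] - centroid) ** 2 * Sxx).sum(0) / Sxx.sum(0))
features = np.array([centroid.mean(), centroid.std(), bandwidth.mean(), bandwidth.std()])
print(features)
```
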
  • A quantitative software quality evaluation model for the artifacts of component based development

    Publication Year: 2005 , Page(s): 20 - 25
    Cited by:  Papers (4)
    PDF (160 KB) | HTML

    Software quality evaluation based on ISO/IEC 9126 and ISO/IEC 14598 has recently come into wide use. However, these standards do not provide practical guidelines for applying the quality model and the evaluation process to real projects. This paper therefore presents a quantitative software quality evaluation model for the artifacts of the component based development (CBD) methodology developed by the Ministry of National Defense of the Republic of Korea. In particular, our model derives the weights of quality characteristics from carefully selected stakeholder questionnaires and the analytic hierarchy process (AHP) technique. We also present an evaluation process using checklists and the results of a trial evaluation that validates the model. We believe the proposed model helps in acquiring high-quality software.

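The AHP step mentioned in the abstract has a standard form: weights come from the principal eigenvector of a pairwise-comparison matrix. A minimal sketch with a hypothetical 3×3 comparison matrix (the characteristics and judgments are invented):

```python
import numpy as np

# Hypothetical pairwise comparisons over three quality characteristics
# (say functionality, reliability, usability) on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                            # AHP weights
ci = (eigvals.real[k] - A.shape[0]) / (A.shape[0] - 1)  # consistency index
print("weights:", w.round(3), "CI:", round(ci, 3))
```
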
  • Matching effectiveness and OTS model richness

    Publication Year: 2005 , Page(s): 26 - 31
    PDF (280 KB) | HTML

    Component aware techniques (CAT) aim to make effective use of components that meet stakeholders' needs for a component based application (CBA). Matching off-the-shelf (OTS) components, using a representation of each component as an aggregate of its functional and non-functional requirements and architecture, is an important activity. This paper explores the relationship between matching effectiveness and the richness of the OTS component structure, illustrated with a home appliance control system (HACS) example. Intuition suggests that the richer the structure of an OTS component, the more effective the matching; this paper demonstrates a positive relationship between OTS model richness and matching effectiveness through an experimental study.

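As a toy illustration of matching (the paper's OTS model is far richer, covering non-functional attributes and architecture), one can score candidate components by overlap between offered and required features; all names below are hypothetical.

```python
# Hypothetical sketch: scoring off-the-shelf candidates against required features.
required = {"remote_control", "scheduling", "logging", "encryption"}

candidates = {
    "CompA": {"remote_control", "scheduling"},
    "CompB": {"remote_control", "scheduling", "logging", "diagnostics"},
}

def match_score(offered, needed):
    """Jaccard-style overlap; a richer model could weight non-functional attributes."""
    return len(offered & needed) / len(offered | needed)

ranked = sorted(candidates, key=lambda c: match_score(candidates[c], required), reverse=True)
print(ranked)
```
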
  • A new approach for software requirements elicitation

    Publication Year: 2005 , Page(s): 32 - 42
    Cited by:  Papers (6)
    PDF (264 KB) | HTML

    Requirements elicitation is both the hardest and the most critical part of software development, since errors at this early stage propagate through the development process and are the hardest to repair later. This paper proposes an improved process for requirements elicitation. The key improvements are: (1) train the non-technical stakeholders (primarily the users) in the capabilities and limitations of computer hardware, software, and software developers; (2) identify keywords while interviewing the stakeholders, visually as well as in text form; (3) use keyword mapping to generate candidate system requirements; and (4) apply the techniques of quality function deployment (QFD) and the Capability Maturity Model (CMM) during the elicitation process.

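Steps (2) and (3) can be pictured with a toy keyword extractor that turns frequent interview terms into candidate requirement stubs; this is only a sketch of the idea, with invented interview notes.

```python
from collections import Counter
import re

# Invented interview notes standing in for stakeholder transcripts.
notes = """The operator must export the monthly report. Export should finish
in under a minute. Reports need an audit trail."""

words = re.findall(r"[a-z]+", notes.lower())
stop = {"the", "should", "must", "need", "under", "monthly"}
freq = Counter(w for w in words if w not in stop and len(w) > 3)

# Map frequent keywords to candidate requirement stubs for analyst review.
for word, n in freq.most_common(3):
    print(f"Candidate requirement: system shall support '{word}' (mentioned {n}x)")
```
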
  • Hardware support: a cache lock mechanism without retry

    Publication Year: 2005 , Page(s): 44 - 49
    Cited by:  Papers (1)
    PDF (128 KB) | HTML

    A lock mechanism is essential for synchronization on multiprocessor systems. The conventional queuing lock generates two bus transactions: the initial lock-read and its retry. This paper proposes a new locking protocol, called the WPV (waiting processor variable) lock mechanism, which requires only one lock-read bus transaction. The WPV mechanism accesses the shared data in the initial lock-read phase, which is held in the pipelined protocol until the shared data is transferred. It also uses a cache-state lock mechanism to reduce locking overhead, and it guarantees FIFO lock operations under multiple lock contentions. We also derive an analytical model of the WPV lock mechanism as well as of conventional memory and cache queuing lock mechanisms. Simulation results show that the WPV mechanism reduces access time by about 50% compared with the conventional queuing lock mechanism.

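The FIFO guarantee is the easiest property to illustrate in software. The ticket lock below is a software analogue only, not the WPV hardware protocol: requesters are granted the lock strictly in arrival order.

```python
import itertools
import threading

class TicketLock:
    """FIFO lock: each acquirer takes a ticket and waits for its turn."""
    def __init__(self):
        self._ticket = itertools.count()
        self._serving = 0
        self._cond = threading.Condition()

    def acquire(self):
        with self._cond:
            my = next(self._ticket)          # arrival order fixes service order
            while self._serving != my:
                self._cond.wait()

    def release(self):
        with self._cond:
            self._serving += 1               # hand the lock to the next ticket
            self._cond.notify_all()

lock = TicketLock()
lock.acquire(); lock.release()   # contending threads would be served FIFO
```
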
  • How to run C++ applications on a bare PC?

    Publication Year: 2005 , Page(s): 50 - 55
    Cited by:  Papers (9)
    PDF (76 KB) | HTML

    Most computer applications today run within a given operating system environment. Application programs written in a language such as C++ are intertwined with the operating system and environment of a given machine; thus, a C++ program requires a processor such as an Intel Pentium and an operating system such as Microsoft Windows. Why do we have to run applications in such a constrained environment? Perhaps because that is how computing has evolved since the inception of personal computers in the 1980s. In this paper, we describe in detail how to run C++ applications on a bare machine, and we discuss some benefits of running applications without any operating system. We present some sample applications built to demonstrate this capability. Finally, we describe a future research direction that may potentially offer a revolution in computing architecture and application development.

  • Compiling C++ programs to Java bytecode

    Publication Year: 2005 , Page(s): 56 - 61
    PDF (136 KB) | HTML

    It is very desirable to run programs on a variety of platforms. Today, essentially the only programs that run across platforms are those written in Java. Although methods have been developed to allow cross-language applications, these applications are still mostly hardware and/or operating system dependent. In this paper, we describe a platform-neutral compiler for a C++-like language that generates Java bytecode to run on any platform where a Java runtime is available.

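To picture the target of such a compiler: JVM bytecode is stack code, so compiling an expression tree amounts to post-order emission. A toy sketch with an invented instruction set mirroring the shape of iadd/imul (not the paper's compiler):

```python
# Compile an expression tree to stack-machine instructions: emit code for the
# operands first, then the operator, exactly as JVM bytecode is laid out.
def compile_expr(node):
    if isinstance(node, int):
        return [("ipush", node)]             # push a constant onto the stack
    op, lhs, rhs = node
    code = compile_expr(lhs) + compile_expr(rhs)
    code.append({"+": ("iadd",), "*": ("imul",)}[op])
    return code

# (5 + 2) * 3 as a nested tuple tree
print(compile_expr(("*", ("+", 5, 2), 3)))
# [('ipush', 5), ('ipush', 2), ('iadd',), ('ipush', 3), ('imul',)]
```
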
  • Using the KDSM methodology for knowledge discovery from a labor domain

    Publication Year: 2005 , Page(s): 64 - 69
    PDF (152 KB) | HTML

    The paper presents the knowledge discovery in serial measures (KDSM) methodology as an easy and effective way to analyze repeated, very short serial measures with a blocking factor. An application of KDSM to the labor domain is described; applying KDSM to this specific domain yielded novel knowledge about the domain's behavior. KDSM is a hybrid (statistical and artificial intelligence) methodology that offers a possible solution to knowledge discovery problems, especially when there appear to be no relevant attributes.

  • Using data mining technology to design an intelligent CIM system for IC manufacturing

    Publication Year: 2005 , Page(s): 70 - 75
    PDF (248 KB) | HTML

    This paper describes an intelligent computer integrated manufacturing (CIM) system that integrates five major domains: computer integrated manufacturing, data warehousing, online analytical processing, data mining, and artificial intelligence. The data mining component uses a decision tree algorithm and a classification model to uncover meaningful information that is useful in decision making. The rules discovered by the data mining system are then expressed through the rule-based knowledge representation of an expert system. The intelligent CIM system is applied to semiconductor packaging factories and addresses the large fluctuations in dynamic random access memory (DRAM) prices. The contribution can increase business competitiveness, reduce production cost, and improve the available-to-promise rate for orders.

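The decision-tree step can be sketched with scikit-learn on synthetic lot data; the feature names and the pass/fail rule below are invented, and export_text shows the kind of rules an expert system could then encode.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                   # e.g., temperature, pressure, cycle time
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic pass/fail rule

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
# Human-readable rules, the raw material for a rule-based expert system.
print(export_text(tree, feature_names=["temp", "pressure", "cycle_time"]))
```
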
  • Data mining for imprecise temporal associations

    Publication Year: 2005 , Page(s): 76 - 81
    Cited by:  Papers (1)
    PDF (152 KB) | HTML

    The field of data mining is dedicated to analyzing data to find underlying connections and discover new patterns. Since the volume of data to be analyzed can be quite large, efficient data mining algorithms are needed. The market-basket algorithm was a breakthrough in data mining techniques, but as the associations to be analyzed grow more abstract, the market-basket approach cannot deal with imprecise temporal associations, leaving a large area uncharted. This research analyzes imprecise temporal associations by modifying the standard a-priori approach with fuzzy set relations to classify the associations relating different sources of data. The results show that such relations can be investigated with fuzzy set classification for temporal associations, and the results of this exploration are as easily understandable as those of the standard a-priori algorithm.

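One way to picture the idea (a sketch, not the paper's algorithm) is to replace apriori's crisp same-basket test with a fuzzy "temporal closeness" membership when accumulating support; the event log and scale below are invented.

```python
def closeness(dt, scale=10.0):
    """Membership in 'roughly simultaneous': 1 at dt=0, fading linearly to 0."""
    return max(0.0, 1.0 - abs(dt) / scale)

# (item, timestamp) event log from two hypothetical sources.
events = [("alarm", 0), ("reboot", 4), ("alarm", 30), ("reboot", 33)]

support = {}
for i, (a, ta) in enumerate(events):
    for b, tb in events[i + 1:]:
        if a != b:
            key = tuple(sorted((a, b)))
            # Fuzzy support: each co-occurrence counts by how close in time it is.
            support[key] = support.get(key, 0.0) + closeness(tb - ta)

print(support)   # e.g., {('alarm', 'reboot'): 1.3}
```
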
  • Analysis of breast cancer using data mining & statistical techniques

    Publication Year: 2005 , Page(s): 82 - 87
    Cited by:  Papers (4)
    PDF (192 KB) | HTML

    Data mining and statistical analysis is the search for valuable information in large volumes of data, and it is now widely used in the health care industry. Breast cancer in particular is the second most common cancer and the second most dangerous. The best way to improve a breast cancer patient's chance of long-term survival is to detect the cancer as early as possible. Currently there are three methods of diagnosing breast cancer: mammography, FNA (fine needle aspiration), and surgical biopsy. The diagnostic accuracy of mammography ranges from 68% to 79%; the accuracy of FNA is inconsistent, varying from 65% to 98%; and the accuracy of a surgical biopsy is nearly 100%. The surgical biopsy procedure, however, is both unpleasant and costly. In this paper, we combine FNA with data mining and statistical methods to obtain an accessible way of achieving the best result. We combine statistical methods such as PCA and PLS linear regression with data mining methods such as attribute selection, decision trees, and association rules to find unsuspected relationships. The experimental results are shown and discussed.

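A compressed version of such a pipeline runs on the Wisconsin FNA dataset that ships with scikit-learn. This sketch uses only PCA and a decision tree, a small subset of the methods the paper combines.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Wisconsin FNA features (30 cytological measurements per sample).
X, y = load_breast_cancer(return_X_y=True)

# PCA for feature reduction feeding a shallow decision tree, scored by 5-fold CV.
model = make_pipeline(PCA(n_components=10), DecisionTreeClassifier(max_depth=4))
print("mean accuracy:", cross_val_score(model, X, y, cv=5).mean())
```
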
  • Spring framework for rapid open source J2EE Web application development: a case study

    Publication Year: 2005 , Page(s): 90 - 95
    Cited by:  Papers (3)  |  Patents (1)
    PDF (128 KB) | HTML

    In the highly competitive arena of Web application development, it is important to develop applications as accurately, economically, and efficiently as possible. One way to increase productivity is to decrease complexity. This has been an underlying theme in a movement to change the way programmers approach developing Java 2 Platform, Enterprise Edition (J2EE) Web applications. The focus of the change is how to create J2EE-compliant software without using Enterprise JavaBeans (EJB). The foremost alternative is the Spring framework, which provides fewer services but is much less intrusive than EJB. The driving force behind this shift is the need for greater productivity and reduced complexity in Web application development and implementation. In this paper, we briefly describe Spring's underlying architecture and present a case study using Spring.

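Spring's core idea, inversion of control, is language-neutral. The sketch below shows it in Python rather than Java: classes declare their dependencies and a trivial "container" wires them together, standing in for Spring's XML or annotation configuration. All names are illustrative.

```python
class OrderRepository:
    def save(self, order):
        print("saved", order)

class OrderService:
    def __init__(self, repository):      # dependency injected, not constructed here
        self.repository = repository

    def place(self, order):
        self.repository.save(order)

# One place that knows how the object graph is assembled; in Spring this
# wiring lives in configuration rather than in application code.
container = {"repo": OrderRepository()}
service = OrderService(container["repo"])
service.place("book #42")
```

The payoff is the same as in Spring: OrderService never names a concrete repository, so it can be tested with a stub and reconfigured without code changes.
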
  • Analyzing the conditions of coupling existence based on program slicing and some abstract information-flow

    Publication Year: 2005 , Page(s): 96 - 101
    PDF (112 KB) | HTML

    In this article, we discuss the conditions that can produce software coupling between basic components in object-oriented programs. Six possible conditions are examined, with illustrations.

  • A method level based approach for OO integration testing: an experimental study

    Publication Year: 2005 , Page(s): 102 - 109
    Cited by:  Papers (1)
    PDF (152 KB) | HTML

    Objects interact in order to implement behavior. One important problem when integrating and testing object-oriented software is to reduce the number of required test stubs and to determine an effective class integration order; strong connectivity between classes complicates this task. We present a new class integration testing strategy based on a new class dependency model (CDM) that takes the interactions between classes into account. To validate our approach and compare it to some existing object-oriented integration strategies, we conducted an experimental study on several real-world Java programs. The results show that the proposed strategy considerably reduces the number of required test stubs.

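The underlying graph problem can be sketched directly: topologically sort the class dependency graph, stubbing one class whenever a cycle blocks the sort. This naive strategy is for illustration only, not the paper's CDM-based one, and the graph is invented.

```python
from graphlib import TopologicalSorter, CycleError

# class -> classes it depends on; A -> B -> C -> A forms a cycle.
deps = {"A": {"B"}, "B": {"C"}, "C": {"A"}, "D": {"A"}}

stubs = []
while True:
    try:
        order = list(TopologicalSorter(deps).static_order())
        break
    except CycleError as e:
        victim = e.args[1][0]        # pick a class on the reported cycle
        stubs.append(victim)
        deps[victim] = set()         # "stub" it: drop its outgoing dependencies

print("integration order:", order, "| stubbed:", stubs)
```
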
  • A study of model layers and reflection

    Publication Year: 2005 , Page(s): 110 - 113
    Cited by:  Papers (1)
    PDF (58 KB) | HTML

    Systems with reflective ability can change their structure and behavior during execution to adapt to a changed environment. This paper studies an approach to realizing reflection from the point of view of model layers and language, and presents a prototype framework embodying these reflection ideas. Reflection can be obtained from modeling languages that can change the structure and behavior of model elements as conditions change. Existing meta models stress the specification of a language's structure and say little about its behavior; this paper explicitly introduces operations into meta models to control the structure and behavior of model elements.

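Behavioral reflection is easy to demonstrate in a dynamic language. The toy below (not the paper's framework) rebinds a method on a live class, the kind of runtime structure/behavior change the proposed meta-model operations are meant to control.

```python
class Sensor:
    def read(self):
        return 21.0                       # reports in Celsius

def read_fahrenheit(self):
    return 21.0 * 9 / 5 + 32              # adapted behavior for a new environment

s = Sensor()
print(s.read())                           # 21.0

# The "meta level" alters the class while the system is running;
# existing instances immediately pick up the new behavior.
Sensor.read = read_fahrenheit
print(s.read())                           # 69.8
```
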
  • A general scalable implementation of fast matrix multiplication algorithms on distributed memory computers

    Publication Year: 2005 , Page(s): 116 - 122
    PDF (576 KB) | HTML

    Fast matrix multiplication (FMM) algorithms for multiplying two n × n matrices reduce the asymptotic operation count from the O(n^3) of the traditional algorithm to O(n^2.38); thus, on distributed memory computers, combining FMM algorithms with parallel matrix multiplication algorithms gives remarkable results. Within this combination, applying FMM algorithms at the inter-processor level poses harder design problems but yields the most effective algorithms. In this paper, a general model of these algorithms is presented, and we introduce a scalable method for implementing this model on distributed memory computers.

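The classic FMM example is Strassen's scheme: one level trades the 8 half-size block multiplications of the traditional algorithm for 7, and recursing drives the exponent below 3. A single-level sketch follows (sequential, for illustration only; the paper's contribution is the distributed-memory parallelization).

```python
import numpy as np

def strassen_step(A, B):
    """One level of Strassen: 7 block products instead of 8 (square, even n)."""
    n = A.shape[0] // 2
    A11, A12, A21, A22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
    B11, B12, B21, B22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]
    M1 = (A11 + A22) @ (B11 + B22)
    M2 = (A21 + A22) @ B11
    M3 = A11 @ (B12 - B22)
    M4 = A22 @ (B21 - B11)
    M5 = (A11 + A12) @ B22
    M6 = (A21 - A11) @ (B11 + B12)
    M7 = (A12 - A22) @ (B21 + B22)
    C = np.empty_like(A)
    C[:n, :n] = M1 + M4 - M5 + M7
    C[:n, n:] = M3 + M5
    C[n:, :n] = M2 + M4
    C[n:, n:] = M1 - M2 + M3 + M6
    return C

A, B = np.random.rand(4, 4), np.random.rand(4, 4)
assert np.allclose(strassen_step(A, B), A @ B)   # matches the ordinary product
```
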