
2010 Second International Conference on Computer Research and Development

Date: 7-10 May 2010


Results 1-25 of 188
  • [Front cover]

    Page(s): C1
    PDF (221 KB) | Freely available from IEEE
  • [Title page i]

    Page(s): i
    PDF (25 KB) | Freely available from IEEE
  • [Title page iii]

    Page(s): iii
    PDF (89 KB) | Freely available from IEEE
  • [Copyright notice]

    Page(s): iv
    PDF (147 KB) | Freely available from IEEE
  • Table of contents

    Page(s): v - xvii
    PDF (182 KB) | Freely available from IEEE
  • Preface

    Page(s): xviii - xix
    PDF (67 KB) | Freely available from IEEE
  • Organizing Committee

    Page(s): xx
    PDF (85 KB) | Freely available from IEEE
  • Reviewers

    Page(s): xxi - xxii
    PDF (92 KB) | Freely available from IEEE
  • A Guideline to Enforce Data Protection and Privacy Digital Laws in Malaysia

    Page(s): 3 - 6
    PDF (220 KB) | HTML

    Under Malaysian cyber law, no statute has been enacted to protect people's personal data when a theft of personal data occurs. This paper suggests a guideline for applying and enforcing data protection and privacy digital laws in Malaysia. Enforcing the suggested guideline provides a means to ensure that data is processed strictly by commercial sectors for commercial activities in tourism, finance, insurance, telecommunications, and similar commercial transaction sectors.
  • Fast Traversal Algorithm for Detecting Object Interference Using Hierarchical Representation between Rigid Bodies

    Page(s): 7 - 11
    PDF (492 KB) | HTML

    Searching for a fast and efficient algorithm to perform collision detection between static and moving objects is a fundamental problem in virtual environments. Most previous methods attempt to tackle the problem for specific geometric models and colliding pairs, under restricted rules and guidelines. For example, convex-hull bounding volumes tend to make collision detection more accurate, but at the cost of fast detection. In this paper, we introduce a new traversal scheme using a depth-first search algorithm, called the earlier node detection algorithm. The algorithm performs child-node intersection tests simultaneously with the root intersection test. Results show that our algorithm works correctly for rigid models when performing collision detection. In practice, the new traversal algorithm helps determine two or more collisions between rigid models, especially in urban simulation.
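    The depth-first traversal over bounding-volume hierarchies that the abstract describes can be illustrated generically. The sketch below uses simple axis-aligned bounding boxes and a hypothetical `Node` structure of our own; it is the standard BVH descent, not the authors' earlier-node-detection algorithm:

```python
# Generic depth-first collision test over two bounding-volume hierarchies.
# A Node holds an axis-aligned bounding box (AABB) and child nodes; a leaf
# has no children. Illustrative sketch only.

def boxes_overlap(a, b):
    """a, b are ((xmin, ymin), (xmax, ymax)) AABBs."""
    (ax0, ay0), (ax1, ay1) = a
    (bx0, by0), (bx1, by1) = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

class Node:
    def __init__(self, box, children=()):
        self.box = box
        self.children = list(children)

def collide(n1, n2):
    """Depth-first traversal: descend only into overlapping subtrees."""
    if not boxes_overlap(n1.box, n2.box):
        return False
    if not n1.children and not n2.children:  # both leaves: report a hit
        return True
    # Descend into whichever node still has children (descent order is
    # arbitrary here; real implementations pick the larger volume first).
    if n1.children:
        return any(collide(c, n2) for c in n1.children)
    return any(collide(n1, c) for c in n2.children)

# Two single-leaf hierarchies whose boxes overlap:
root1 = Node(((0, 0), (2, 2)), [Node(((0, 0), (2, 2)))])
root2 = Node(((1, 1), (3, 3)), [Node(((1, 1), (3, 3)))])
print(collide(root1, root2))  # True
```

    Pruning non-overlapping subtrees early is what makes such traversals fast in practice; the paper's contribution concerns the order in which child and root tests are interleaved.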
  • A New Algorithm for Calculating Adaptive Character Wordlength via Estimating Compressed File Size

    Page(s): 12 - 16
    PDF (232 KB) | HTML

    Statistical compression algorithms typically use a character word length of 8 bits. Using an optimum character word length b, where b > 8, can improve the compression ratio. There are different ways to calculate the optimum character word length. In this paper we introduce a new algorithm, named Estimated Compressed File Size (ECF), that finds the optimum character word length by estimating the compressed file size. In this algorithm, metrics such as header size and entropy determine the optimum character word length. The ECF estimate is a good lower bound on the compressed file size. ECF is fast on large files and does not depend on the underlying statistical algorithm.
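    The interplay of entropy and header size that the abstract mentions can be sketched generically: for a candidate word length b, split the input into b-bit symbols and estimate the compressed size as symbol count times per-symbol entropy, plus a symbol-table header. The formula below is our own illustrative assumption, not the paper's ECF estimate:

```python
import math
from collections import Counter

def estimated_size_bits(bits, b, header_bits_per_symbol=8):
    """Rough compressed-size estimate (in bits) for word length b.

    bits: a str of '0'/'1'; symbols are consecutive b-bit chunks.
    Estimate = n_symbols * H(symbols) + header for the symbol table.
    Illustrative only -- not the ECF formula from the paper.
    """
    symbols = [bits[i:i + b] for i in range(0, len(bits) - b + 1, b)]
    counts = Counter(symbols)
    n = len(symbols)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    header = len(counts) * (b + header_bits_per_symbol)
    return n * entropy + header

# A bit string with an 8-bit repeating pattern: b = 8 captures the
# repetition (entropy drops to zero), so its estimate is far smaller.
data = "01101100" * 32
print(estimated_size_bits(data, 2) > estimated_size_bits(data, 8))  # True
```

    The trade-off the estimate exposes is the one the abstract alludes to: a larger b can lower the entropy term, but it enlarges the symbol table in the header.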
  • Discovery of Patterns Using DataStream Engine

    Page(s): 17 - 19
    PDF (457 KB) | HTML

    Data stream mining is the process of extracting knowledge structures from continuous, rapid data records. A data stream is an ordered sequence of instances that, in many data stream mining applications, can be read only once or a small number of times using limited computing and storage capabilities. The proposed system allows the user to select any type of mining algorithm desired. This makes the system flexible and widely useful in different sectors; the stream engine does not tie it to a specific one. The results can be compared and viewed as graphs.
  • WordNet Based Sindhi Text to Speech Synthesis System

    Page(s): 20 - 24
    PDF (269 KB) | HTML

    Text-to-speech (TTS) synthesis technology enables machines to convert text into audible speech and is used throughout the world to enhance the accessibility of information. An important component of any TTS synthesis system is its database of sounds. In this study, three types of sound units, i.e., phonemes, diphones, and syllables, are concatenated to produce natural-sounding speech for a good-quality Sindhi text-to-speech (STTS) system. The object of this paper is to treat phonemes, diphones, and syllables from the perspective of the lexicon. The methodology used in STTS exploits acoustic representations of speech for synthesis, together with linguistic analysis of the text. Sindhi is a highly homographic language; in real-life applications the text is written without diacritics, which creates lexical and morphological ambiguity. The problem of understanding non-diacritic words can be solved using semantic knowledge. This paper describes a Sindhi TTS synthesis system that relies on a WordNet to identify analogical relations between words in the text. The proposed approach focuses on the use of WordNet structures for the task of synthesis. The architecture and a novel algorithm for STTS are proposed. Experiments using WordNet show promising results, and the accuracy of our proposed approach is acceptable.
  • Parallel Bipartite Graph Algorithm for Protein Structure Matching Using OpenMP

    Page(s): 25 - 29
    PDF (290 KB) | HTML

    Current work on a bipartite graph-based algorithm for protein tertiary structure matching shows that the algorithm demands heavy computation and extensive processing time during graph preparation and matching. In this work, we deployed a multithreading approach using OpenMP to enhance the performance of the algorithm. Experiments on a dual quad-core machine show a speedup of 4.9 on 8 threads with a minimum execution time of 121.39 seconds, after which execution time gradually increases as more threads are added. The parallel code successfully utilizes the available multi-core resources and extends the capability of the algorithm to match more structures in a given time.
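    The reported speedup of 4.9 on 8 threads can be related to the parallelizable fraction of the code via Amdahl's law. This is our own back-of-the-envelope check, not a figure from the paper:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n workers when fraction p is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(speedup, n):
    """Invert Amdahl's law to recover the parallel fraction."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n)

p = parallel_fraction(4.9, 8)
print(round(p, 2))                     # ~0.91: about 91% of the work parallelizes
print(round(amdahl_speedup(p, 8), 1))  # 4.9, consistency check
```

    A parallel fraction near 0.91 is consistent with the abstract's observation that adding threads beyond 8 yields diminishing or negative returns, since the serial portion (and thread overhead) starts to dominate.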
  • Watermark Embedder and Detector for Medical Images: The Requirements and Design

    Page(s): 30 - 33
    PDF (491 KB) | HTML

    Watermarking has been proven able to protect medical images from unauthorized tampering and modification. Numerous watermarking schemes are available for different purposes, depending on the security needs. Watermark embedding and detection need to be performed efficiently so that the operation of the health institution is not affected. This paper focuses on the design of a watermark embedding and detection application, built using existing tools, that operates in a PACS (Picture Archiving and Communication Systems) environment.
  • MPLS Assisted Handover in IP-Based Mobility Management Schemes: A Survey

    Page(s): 34 - 38
    PDF (329 KB) | HTML

    Mobility in IP-based wireless networks is basically handled by Mobile IP protocols, which suffer from long handover latency and high signaling overhead; this latency is partly caused by the tunneling process between the mobile node and the mobility agents. MPLS (Multi-Protocol Label Switching) has been integrated into Mobile IP schemes to lessen the burden of IP packet tunneling in the mobility domain. In this paper we present a review of the more prominent and most-cited proposals combining MPLS and Mobile IP. To enable comparison, a general taxonomy based on three important aspects of this combination is suggested: integration level, hierarchical or non-hierarchical structure, and length of MPLS tunnels. A functional comparison of MPLS-assisted IP-based mobility management is then provided.
  • A User's Preference Based Method for Web Service Selection

    Page(s): 39 - 45
    PDF (371 KB) | HTML

    Both criterion values and the user's preferences over criteria should be taken into account when selecting among web services with the same functionality. Usually, a user describes a preference for a criterion using a qualitative concept, but the quantitative value corresponding to that concept is uncertain; thus the subjective weight acquired from the user's preference is also uncertain. This paper provides a method to select web services using uncertain subjective weights. First, we present two ways of calculating subjective weights, normalization and the least-squares method; the subjective weight is described using a cloud model. Second, the criterion values of each web service are synthesized using the marginal substitution method with the indifference curves of Cobb-Douglas preferences. Third, an algorithm for service selection is presented. Using this scheme, an experiment is conducted on selecting a weather-forecast web service; the result is validated by sensitivity analysis of the weights.
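    The Cobb-Douglas aggregation the abstract mentions can be sketched generically: a service's score is the weighted geometric mean of its normalized criterion values. The service names, criteria, and weights below are invented for illustration, and the paper's cloud-model treatment of uncertain weights is not reproduced here:

```python
def cobb_douglas_score(values, weights):
    """Weighted geometric mean: U = prod(x_i ** w_i), weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    score = 1.0
    for x, w in zip(values, weights):
        score *= x ** w
    return score

# Hypothetical normalized criteria: (availability, speed, accuracy) in (0, 1].
services = {
    "forecast-a": (0.9, 0.6, 0.8),
    "forecast-b": (0.7, 0.9, 0.7),
}
weights = (0.5, 0.2, 0.3)  # made-up subjective weights

best = max(services, key=lambda s: cobb_douglas_score(services[s], weights))
print(best)  # forecast-a
```

    A geometric mean penalizes a service that is very weak on any single criterion more strongly than an arithmetic weighted sum would, which is one reason Cobb-Douglas preferences are used for trade-off modeling.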
  • Building a Believable Agent for a 3D Boxing Simulation Game

    Page(s): 46 - 50
    PDF (276 KB) | HTML

    This paper describes an approach used to build a practical AI solution for a 3D boxing simulation game. The features of the designed AI agent are based on our deliberate focus on believability, i.e., the human-likeness of the agent's behavior. We show how learning-by-observation and case-based reasoning techniques can be used to create an AI decision-making system for an industrial-level computer game. The chosen AI design principles support high usability and maintainability, which is important for game developers. We show experimentally that our AI system provides both believable and effective behavior.
  • Grammars Controlled by Petri Nets with Place Capacities

    Page(s): 51 - 55
    PDF (208 KB) | HTML

    A Petri net controlled grammar is a grammar equipped with a Petri net whose transitions are labeled with production rules and nonterminals of the grammar. The associated language consists of all terminal strings that can be derived in the grammar such that the sequence of rules in every terminal derivation corresponds to an occurrence sequence of transitions of the Petri net that is enabled at the initial marking and finishes at a final marking of the net. In this paper we investigate the generative power of grammars controlled by Petri nets with place capacities.
  • Aligning Data Records Using WordNet

    Page(s): 56 - 60
    PDF (585 KB) | HTML

    Visual wrappers use visual information in addition to DOM tree properties in the extraction of data records. However, a closer look indicates that visual information can also be used to align data records into tabular form. In this paper, we propose a data alignment algorithm that aligns data records using DOM tree properties and the visual cues of data records. Our algorithm uses a regular-expression rule and incorporates visual cues such as the relative position and size of data items to provide options for the alignment of iterative and disjunctive data items. Results show that our wrapper performs better than existing state-of-the-art wrappers.
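    The regular-expression side of such alignment can be sketched generically: a pattern with optional groups maps each record onto the same set of columns, padding disjunctive (sometimes-missing) items. The records and pattern below are invented; the paper's visual cues (position, size) are not modeled:

```python
import re

# Hypothetical product records scraped from a result page; the optional
# "was ..." part is a disjunctive data item that may be absent.
records = [
    "Widget A $9.99 was $12.99",
    "Widget B $4.50",
    "Widget C $7.25 was $8.00",
]

pattern = re.compile(
    r"^(?P<name>.+?) \$(?P<price>[\d.]+)(?: was \$(?P<old>[\d.]+))?$"
)

# Align every record into the same (name, price, old) columns;
# groupdict() fills unmatched optional groups with None.
table = [pattern.match(r).groupdict() for r in records]
for row in table:
    print(row["name"], row["price"], row["old"])
```

    Optional groups give each record the same column schema regardless of which disjunctive items it contains, which is the property tabular alignment needs.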
  • CGLT: An Effective Computer-Based Learning Tool

    Page(s): 61 - 64
    PDF (602 KB) | HTML

    Behavioral high-level hardware design tools are currently considered powerful and can largely facilitate the hardware development cycle as a whole. Modern hardware design tools can target high-density programmable logic devices such as Field Programmable Gate Arrays. Currently, hardware/software co-design is witnessing a growing focus on finding alternative methods that could further improve the design process. In this paper, we explore the effectiveness of, and extend, a formal methodology for hardware design. The method adopts a step-wise refinement approach that starts development from formal specifications. A functional programming notation is used for specifying algorithms and for reasoning about them. The method is aided by off-the-shelf refinements based on the operators of Communicating Sequential Processes that map easily to programs written in Handel-C. Handel-C descriptions are directly compiled into reconfigurable hardware. The practical realization of this methodology is evidenced by a case study of data-parallel implementations of a matrix multiplication algorithm. The developed designs are compiled and tested on Agility's RC-1000 reconfigurable computer with its 2-million-gate Virtex-E FPGA. Performance analysis and evaluation of the presented implementations are included.
  • Identifying Quality Factors within Data Warehouse

    Page(s): 65 - 72
    PDF (332 KB) | HTML

    The popularity of data warehouses for data analysis has grown tremendously, and data warehouse systems have emerged as the core of management information systems. A data warehouse is part of a larger infrastructure that includes legacy data sources, external data sources, data acquisition software, a repository, analytical tools, and a user interface. The difficulties of data warehouse implementations have been widely cited in the literature, but research on the factors for initial and ongoing data warehouse implementation success is rare and fragmented. Through a comprehensive review of the literature, 10 factors were found to be critical to DW implementation success: Organizational, Technical, Project, Environmental, Infrastructure, Information Quality, System Quality, Service Quality, Relationship Quality, and Net Benefits. A research model is developed in this paper to determine the impact of quality and success factors on data warehouse implementation by adapting the updated (2003) DeLone and McLean Information System Success Model; new dimensions are proposed for the model.
  • Integrating DICOM Information Model with Risk Management Process Area of CMMI for Radiotherapy Applications

    Page(s): 73 - 76
    PDF (218 KB) | HTML

    Digital Imaging and Communications in Medicine (DICOM) provides a standard to facilitate interoperability among medical devices from the perspective of hardware and software. Though DICOM is widely adopted by medical device manufacturers, it does not specify risk management details for developing medical device software. This paper hypothesizes that this gap in DICOM can be complemented by the risk management process area of CMMI® at maturity level 3. The objective of this research is to integrate the DICOM information model with the risk management process area of CMMI® for radiotherapy applications. This research will enable the developers of radiotherapy applications to define and mitigate risks effectively.
  • Malaysian Address Semantic: The Process of Standardization

    Page(s): 77 - 80
    PDF (309 KB) | HTML

    Mail delivery has always been problematic for the Malaysian postal services due to incomplete or incorrect postal addresses. To this end, the Malaysian standards body, SIRIM, has developed the Malaysian Standard Address with the purpose of standardizing Malaysian postal addresses and hence reducing the problem of undelivered mail. This article presents the findings of a study investigating various postal-related problems in the Malaysian context. 275 domestic mail items, randomly picked from the Pos Malaysia Mail Processing Centre, were examined and compared against the Malaysian Standard Address. The findings suggest that 58% of the addresses did not conform to the SIRIM address standard. Based on this finding, an automated system based on a semantic approach is proposed to address the issue of incomplete and inaccurate addresses in the Malaysian context.
  • Per-Priority Flow Control (PPFC) for Enhancing Metro Ethernet QoS

    Page(s): 83 - 87
    PDF (305 KB) | HTML

    This paper presents a novel scheme called per-priority flow control (PPFC) as an extension to the IEEE 802.3x standard, using a simple and efficient node buffer management strategy in which a physical channel is divided into several virtual channels based on the bandwidth value (a QoS parameter) embedded in the IPv6 flow label field of each flow. It also investigates the performance of the newly proposed priority PAUSE, which includes two new flexible programmable fields (predetermined pause time and priority) in order to provide granular flow control. The proposed PPFC is fair in the sense that it distributes resources among flows based on their priorities during congestion. Moreover, it reduces buffer overflow and packet loss, and it provides efficient use of link bandwidth with high throughput. In addition, priority PAUSE enables optimal bandwidth management of virtual channels for different IPv6 traffic priorities and enhances QoS in Metro Ethernet.