2009 First International Conference on Advances in System Simulation (SIMUL '09)

Date: 20-25 Sept. 2009

Displaying results 1-25 of 36
  • [Front cover]

    Page(s): C1
  • [Title page i]

    Page(s): i
  • [Title page iii]

    Page(s): iii
  • [Copyright notice]

    Page(s): iv
  • Table of contents

    Page(s): v - vii
  • Preface

    Page(s): viii
  • Organizing Committee

    Page(s): ix - x
  • List of Reviewers

    Page(s): xi - xii
  • An Index Based Threat Modeling Method for Path Planning

    Page(s): 1 - 5

    Path planning is an effective means of improving the operational effectiveness of aircraft. A reasonable estimation and description of the threats existing in the operation space is essential before a specific path is generated for an aircraft. To avoid depending excessively on experts' experience when evaluating threat situations, an index-based threat modeling method named IBTMM is proposed. IBTMM uses effectiveness indexes of defense systems to measure the threats the aircraft faces and yields a dynamic threat degree that changes with the aircraft's flight time and location. IBTMM does not give an absolute threat degree but a relative estimation of threats: when an aircraft is flying over a certain area, IBTMM can tell whether that is more or less dangerous than flying over other areas. IBTMM is helpful in the absence of simulation data in the early stages of path planning, as it reduces the planning space and at the same time provides a good way to establish the path planning cost function. An example is presented to show the threat situation generated by IBTMM and to demonstrate its effectiveness.

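    The abstract above derives a relative, location-dependent threat degree from effectiveness indexes of defense systems. Below is a minimal Python sketch of that idea; the system list, the linear decay inside each engagement range, and the normalization are illustrative assumptions, not the paper's actual index definitions.

        import numpy as np

        # Hypothetical defense systems: (x, y, engagement range, effectiveness index).
        SYSTEMS = [(20.0, 30.0, 15.0, 0.9), (60.0, 70.0, 25.0, 0.6), (80.0, 20.0, 10.0, 0.4)]

        def relative_threat_map(size=100):
            """Relative (not absolute) threat degree per grid cell, scaled to [0, 1]."""
            ys, xs = np.mgrid[0:size, 0:size].astype(float)
            threat = np.zeros((size, size))
            for sx, sy, rng, eff in SYSTEMS:
                dist = np.hypot(xs - sx, ys - sy)
                # Assumed model: effectiveness decays linearly to zero at the range boundary.
                threat += eff * np.clip(1.0 - dist / rng, 0.0, None)
            return threat / threat.max()  # only the relative ordering of areas is meaningful

        threat = relative_threat_map()
        print("over (20, 30) vs over (5, 5):", threat[30, 20], threat[5, 5])

    Comparing two cells then answers exactly the question the method targets: whether flying over one area is more or less dangerous than flying over another.
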
  • A Monte Carlo Based Procedure for Analyzing Discrete-Time, Nonstationary Simulation Responses Using Classical Time Series Models

    Page(s): 6 - 11

    In this paper, we propose a procedure for analyzing discrete-time, nonstationary discrete-event simulation responses based on Monte Carlo integration and the use of classical ARIMA (or SARIMA) time-series models. The procedure is illustrated with an exploding single-server queue and a bounded cyclical traffic situation. Some conclusions and recommendations for further work are given.

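    As a rough illustration of the two ingredients named in the abstract, the Python sketch below averages a nonstationary response (waiting times of an unstable single-server queue) over independent replications, the Monte Carlo step, and then fits a classical ARIMA model to the averaged series. The queue parameters and the (1, 1, 1) model order are illustrative assumptions, not the authors' procedure.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA  # pip install statsmodels

        rng = np.random.default_rng(0)
        REPS, T = 200, 150

        def waiting_times(lam=1.1, mu=1.0, n=T):
            """Lindley recursion for an unstable (rho > 1, 'exploding') single-server queue."""
            w = np.zeros(n)
            for i in range(1, n):
                w[i] = max(0.0, w[i - 1] + rng.exponential(1 / mu) - rng.exponential(1 / lam))
            return w

        # Monte Carlo step: average the response over replications at each customer index.
        mean_response = np.mean([waiting_times() for _ in range(REPS)], axis=0)

        # Classical time-series step: a nonstationary ARIMA model (d = 1).
        fit = ARIMA(mean_response, order=(1, 1, 1)).fit()
        print(fit.forecast(5))
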
  • Comparison of Monte Carlo and Quasi Monte Carlo Sampling Methods in High Dimensional Model Representation

    Page(s): 12 - 17

    A number of new techniques that improve the efficiency of random sampling high dimensional model representation (RS-HDMR) are presented. A comparison shows that quasi Monte Carlo based HDMR (QRS-HDMR) significantly outperforms RS-HDMR. RS/QRS-HDMR based methods also show faster convergence than the Sobol method for the calculation of sensitivity indices. Numerical tests show that the developed methods for choosing optimal polynomial orders and numbers of sample points are robust and efficient.

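    The contrast drawn in the abstract can be reproduced on a small scale with the Python sketch below, which compares plain Monte Carlo sampling against a scrambled Sobol' (quasi Monte Carlo) sequence on a standard test integrand whose exact value is 1. The g-function and its coefficients are assumed test choices, not the paper's RS-HDMR implementation.

        import numpy as np
        from scipy.stats import qmc

        D, N = 8, 2**12
        A = np.arange(1.0, D + 1)  # g-function coefficients (assumed)

        def g_function(x):
            """Sobol' g-function; its integral over [0, 1]^D is exactly 1."""
            return np.prod((np.abs(4 * x - 2) + A) / (1 + A), axis=1)

        mc_pts = np.random.default_rng(0).random((N, D))           # plain Monte Carlo
        qmc_pts = qmc.Sobol(d=D, scramble=True, seed=0).random(N)  # quasi Monte Carlo
        print("MC error: ", abs(g_function(mc_pts).mean() - 1.0))
        print("QMC error:", abs(g_function(qmc_pts).mean() - 1.0))

    With the same budget N, the quasi Monte Carlo estimate is typically much closer to the true value, which is the effect QRS-HDMR exploits.
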
  • Two Level Approach for Validation of Microscopic Simulation Models

    Page(s): 18 - 22

    New driver assistance systems can have consequential effects on traffic flow. Investigative simulations are necessary for the design, optimization and evaluation of these assistance systems. Previous calibration and validation methods utilized either microscopic or macroscopic measurement data. This paper argues that the formerly held calibration and validation perspectives on traffic simulations are incomplete. Moreover, these assistance systems have their own particular requirements and demand the simultaneous consideration of microscopic and macroscopic system behavior. Therefore, this paper recommends a measurement concept for gaining the data necessary for proper calibration and validation. The concept presented here advocates simultaneous measurements from both a vehicle (microscopic) and an overall-traffic (macroscopic) perspective.

  • Simulation Model of a Single-Server Order Picking Workstation Using Aggregate Process Times

    Page(s): 23 - 31

    In this paper we propose a simulation modeling approach based on aggregate process times for the performance analysis of order picking workstations in automated warehouses with first-in-first-out processing of orders. The aggregate process time distribution is calculated from tote arrival and departure times; we refer to the aggregate process time as the effective process time. We distinguish between the effective process time distribution for the first tote of an order and that for the remaining totes of an order. These two distributions are used in an aggregate model to predict tote and order flow times. Results from a test case show that the aggregate model accurately predicts the mean and variability of tote and order flow times. The effect of the order size distribution on the flow time prediction accuracy is also investigated.

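    The core aggregation step, reconstructing an effective process time per tote from arrival and departure times at a FIFO single-server workstation, can be sketched in Python as follows. The event log is hypothetical, and the paper's further split into first-tote and remaining-tote distributions per order is omitted.

        import numpy as np

        def effective_process_times(arrivals, departures):
            """Effective process time per tote: the time from when the server could
            first start on the tote (its arrival, or the previous departure,
            whichever is later) until the tote departs."""
            ept, prev_depart = [], 0.0
            for a, d in zip(arrivals, departures):
                ept.append(d - max(a, prev_depart))
                prev_depart = d
            return np.array(ept)

        arrivals = [0.0, 1.0, 2.5, 8.0, 8.2]    # hypothetical tote event log
        departures = [2.0, 4.5, 6.0, 9.5, 12.0]
        ept = effective_process_times(arrivals, departures)
        print("EPT per tote:", ept, "mean:", ept.mean())
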
  • Modeling for Web Services Composition System with Restricted Resources

    Page(s): 32 - 37

    In many practical applications, Web services composition systems have restricted resources, and system performance is closely related to those resources. First, a basic generalized stochastic Petri net model, constructed by mapping the service composition plan or language, is presented for performance analysis. Nevertheless, the basic model cannot exactly reflect the characteristics of system resource restriction. Therefore, by analyzing the various resource restrictions in Web services composition systems, a series of extended modeling rules and methods is presented in order to obtain exact performance analysis results. The experiments show the difference in performance analysis results between the basic model under ideal conditions and the extended model with restricted resources.

  • Priority Cycle Time Behavior Modeling for Semiconductor Fabs

    Page(s): 38 - 43

    Semiconductor wafer fabrication factories (fabs) often offer wafer manufacturing services of multiple priorities, in terms of differentiated cycle time-based targets, and fab production must be planned accordingly. This paper develops modeling methods and fab behavior models that capture the cycle times of differentiated manufacturing services for semiconductor supply chain management. A novel, hybrid decomposition approximation-based priority queueing network model is designed to characterize how the cycle times of individual priorities (PCTs) are affected by wafer release rates and fab capacity utilization. The approach integrates a queueing network analyzer that approximates production flow behaviors among tool groups, sequential decomposition approximation among priorities, fixed-point iteration to handle the re-entrant flows, and an empirical data-based model tuning technique. Model tuning and validation against simulation results demonstrate that the decomposition approximation-based models yield quick, good-quality estimates of PCTs and related performance indices. The fab models thus constructed enable efficient what-if analysis for supply chain management of differentiated manufacturing services.

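    A single-station analogue of the per-priority cycle-time behavior modeled in the paper is the nonpreemptive M/G/1 priority queue, whose class-dependent mean waits are given by Cobham's classical formula. The Python sketch below implements only that textbook formula; the paper's decomposition across re-entrant tool groups and its empirical tuning are not reproduced.

        def priority_waits(lams, means, second_moments):
            """Mean waiting time per class in a nonpreemptive M/G/1 priority queue
            (class 0 = highest priority), via Cobham's formula."""
            w0 = 0.5 * sum(l * s2 for l, s2 in zip(lams, second_moments))  # mean residual work
            rhos = [l * m for l, m in zip(lams, means)]
            waits, sigma = [], 0.0
            for rho in rhos:
                waits.append(w0 / ((1.0 - sigma) * (1.0 - sigma - rho)))
                sigma += rho
            return waits

        # Two classes, exponential service with mean 1.0 (so E[S^2] = 2.0).
        print(priority_waits([0.3, 0.5], [1.0, 1.0], [2.0, 2.0]))
        # Per-class cycle time = waiting time + mean service time.

    Even at identical total utilization, the high-priority class sees a much shorter wait, which is the differentiated-cycle-time effect the fab models capture at network scale.
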
  • A Simulation of the Pharmaceutical Supply Chain to Provide Realistic Test Data

    Page(s): 44 - 49

    The amount of counterfeit pharmaceuticals in the European pharmaceutical supply chain has increased enormously in the past years. Thus, the European Commission introduced an amendment that will lead to new information systems. No realistic test data for these information systems is available yet, which hinders progress in designing and implementing them. We take a first step toward closing this gap by providing realistic test data that respects the upcoming legislative changes. For this purpose, we provide four different scenarios that differ in supply chain size and in an aspect of the legal requirements that is still under consideration. The test data is available on the web, so researchers and software engineers can use it to evaluate their information systems.

  • Modelling of the Uncertainty of Nuclear Fuel Thermal Behaviour Using the URANIE Framework

    Page(s): 50 - 55

    In the global framework of nuclear fuel behaviour simulation, the response of the models describing the physical phenomena occurring during irradiation in a reactor is mainly conditioned by the confidence in the calculated fuel temperature. Amongst all the parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), three sources of uncertainty have been identified as the most sensitive: the thermal conductivity of UO2, the radial distribution of power in the fuel pellet, and the local linear heat rate in the fuel rod. Because validation of the models is based on measurement-simulation comparisons, a fourth parameter needs to be modelled: the inner diameter of the hole in the pellet that contains the thermocouple used to measure the temperature. Expert judgement and inverse methods have been used to model the uncertainty of these four parameters, either by distribution laws or by defining a representative sample of the variables. Propagation of these uncertainties through the METEOR V2 code, using the URANIE framework and a Monte Carlo technique, has been performed for an irradiation experiment of 5000 hours. At each moment of the experiment we obtain a statistical temperature distribution, resulting from the initial distributions of the uncertain parameters, and this distribution covers the measured values. This final result validates the whole modelling process and allows an accurate quantification of the uncertainty in the fuel temperature calculation.

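    The propagation scheme itself (sample the uncertain inputs from their distributions, run the code for each sample, and read a temperature distribution off the outputs) can be sketched in a few lines of Python. The closed-form temperature function below is a simple stand-in for METEOR V2, and the input distributions are placeholders, not the expert-judgement models of the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 10_000  # Monte Carlo samples

        def centerline_temp(k, f, q, t_surface=600.0):
            """Stand-in model: pellet centerline temperature (K) from conductivity k,
            radial peaking factor f, and linear heat rate q, via q*f / (4*pi*k)."""
            return t_surface + f * q / (4.0 * np.pi * k)

        k = rng.normal(4.0, 0.3, N)           # UO2 thermal conductivity, W/(m.K)
        f = rng.normal(1.1, 0.05, N)          # radial power distribution factor
        q = rng.normal(25_000.0, 1_500.0, N)  # local linear heat rate, W/m

        temps = centerline_temp(k, f, q)
        lo, hi = np.percentile(temps, [2.5, 97.5])
        print(f"mean {temps.mean():.0f} K, 95% interval [{lo:.0f}, {hi:.0f}] K")
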
  • Toward a Collection of Principles, Techniques, and Elements of Modeling and Simulation Software

    Page(s): 56 - 61

    Numerous modeling and simulation software products have been developed over the last decades. Most of them have been created from scratch, often dedicated to single formalisms, single simulation algorithms, hardware platforms, or applications. Nevertheless, each of these software products has to follow certain principles, and it has to contain certain techniques and elements to be usable for modeling and simulation. We identify these principles and techniques, as well as a list of essential elements accompanied by a list of additional elements. These lists can serve as guidance for creating an M&S software product and as a basis for comparing products.

  • Experiments with Single Core, Multi-core, and GPU Based Computation of Cellular Automata

    Page(s): 62 - 67

    Cellular automata are a well-known modeling formalism exploited in a wide range of application areas. In many of these, the complexity of the models hampers a thorough analysis of the system under study; therefore, efficient simulation algorithms are required. We present a comparison of seven different simulation algorithms for cellular automata: the classical "full" simulator, the classical "discrete event" simulator, a threaded (multi-core) variant of each of these, an adaptable threaded variant, and a GPU-based algorithm with and without readback of calculated states. The comparison is based on the M&S framework JAMES II, using a set of well-known models.

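    For reference, the classical "full" simulator named in the abstract re-evaluates every cell at every generation, as in the minimal Python sketch below, using Conway's Game of Life as the well-known model; a "discrete event" simulator would instead revisit only cells whose neighborhoods changed. The sketch is illustrative and unrelated to the JAMES II implementations compared in the paper.

        import numpy as np

        def full_step(grid):
            """One generation of the 'full' CA simulator: every cell is updated."""
            # Count the eight neighbors with periodic boundaries via array rolls.
            n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
            return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

        rng = np.random.default_rng(1)
        grid = (rng.random((256, 256)) < 0.25).astype(np.uint8)
        for _ in range(100):
            grid = full_step(grid)
        print("live cells after 100 generations:", int(grid.sum()))
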
  • An Ontology for Aircraft Route Planning

    Page(s): 68 - 72

    A number of methods for aircraft route planning (RP) have emerged during the last few years. Because of the lack of domain knowledge representation and reuse in RP, it is difficult to master these methods and to choose quickly among them. To achieve domain knowledge reuse and an efficient planning process, an ontology for RP defined in the Web Ontology Language (OWL), dubbed ONTRP, is developed to specify the domain knowledge and to bridge the gap between the conceptual modeling of RP and its software implementation. ONTRP can help people who lack RP experience to analyze and compare RP algorithms by means of automatic reasoning and to establish RP models efficiently. Despite a number of simplifications, ONTRP is the first conceptual model of the RP problem established as an ontology, as well as the first attempt to mine common knowledge in the RP problem for the sake of reuse.

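    To make the OWL approach concrete, the Python sketch below builds a tiny route-planning ontology with rdflib. The class and property names are invented for illustration; they are not taken from ONTRP, which is not reproduced in this listing.

        from rdflib import Graph, Namespace, RDF, RDFS
        from rdflib.namespace import OWL

        RP = Namespace("http://example.org/rp#")  # hypothetical namespace
        g = Graph()
        g.bind("rp", RP)

        # Declare OWL classes for the core route-planning concepts.
        for cls in (RP.RoutePlanningAlgorithm, RP.Threat, RP.Aircraft):
            g.add((cls, RDF.type, OWL.Class))
        g.add((RP.AStarPlanner, RDF.type, OWL.Class))
        g.add((RP.AStarPlanner, RDFS.subClassOf, RP.RoutePlanningAlgorithm))

        # An object property relating planners to the threats they avoid.
        g.add((RP.avoids, RDF.type, OWL.ObjectProperty))
        g.add((RP.avoids, RDFS.domain, RP.RoutePlanningAlgorithm))
        g.add((RP.avoids, RDFS.range, RP.Threat))

        print(g.serialize(format="turtle"))
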
  • A Design of Network Simulation Environment Using SSFNet

    Page(s): 73 - 78

    A network simulation tool is needed to verify the functionality and performance of networks. SSFNet (scalable simulation framework network models) is an open source network simulation tool with various network simulation applications. It has been designed to scale to expanding networks, including topology, protocols and traffic, and is able to support simulation of large-scale networks like the Internet. However, it is not easy for general users to perform network simulation with SSFNet, because SSFNet provides no supplementary tools for designing network elements and topology or for analyzing simulation results; the network modeling and analysis process must be done manually by the users themselves. This makes it difficult to perform reliable network simulations. In this paper, we design a network simulation environment for SSFNet. Using this environment, users can build network simulation models effectively and analyze the simulation results without difficulty.

  • A Review of Available Software for the Creation of Testbeds for Internet Security Research

    Page(s): 79 - 87

    The increasing use of experimental platforms for networking research is due to their ability to support experimentation with complex systems, like the Internet, that simplistic simulators and small-scale testbeds fail to reproduce. Many projects and research initiatives have therefore been spawned, mainly in the field of Future Internet architectures. Although numerous publications can be found, most of them refer to prototypes and work in progress rather than to publicly available software that is ready to be widely used for the creation of testbeds. The first contribution of this paper is a framework for comparing the available software based on its features. The second contribution is a literature review of state-of-the-art tools and their comparison under common criteria. This systematic analysis allows other researchers to make informed decisions about the usability of already available tools and decreases the initial cost of developing a new testbed, leading to an even wider use of such platforms. Our work provides the reader with a useful reference list of readily available software to choose from when designing or upgrading a research infrastructure, laboratory or experimentation facility.

  • G-Sense - A Graphical Interface for SENSE Simulator

    Page(s): 88 - 93

    Wireless sensor networks greatly benefit from simulation before deployment, since some of these networks may contain thousands of nodes. The challenges they pose compared to traditional computer networks have led to several approaches to network simulation, among them SENSE, the Sensor Network Emulator and Simulator. However, SENSE presents a limited, text-based user interface, forcing users to know the C++ programming language. This paper presents a tool, called G-Sense, that greatly improves SENSE's user friendliness, with graphical input of simulation parameters, save and load features for simulations, and management of simulation results with plot views. G-Sense uses the SENSE simulation engine in a transparent way, so the user may focus on the simulation itself, not on the underlying simulation tool. We present the G-Sense architecture and usability, along with extensive experiments for its validation. We believe this tool will contribute to the adoption of SENSE for wireless sensor network simulation by clearly improving its ease of use.

  • Uncertainty Assessments in Severe Nuclear Accident Scenarios

    Page(s): 94 - 99

    Managing uncertainties in industrial systems is a daily challenge to ensure improved design, robust operation, accountable performance and responsive risk control. This paper illustrates the analyses of different depths that LEONAR, an uncertainty software tool devoted to a specific application, can offer. The physical model of LEONAR describes some of the phenomena related to molten core behavior which may arise in severe accidents in pressurized water reactors, starting from core degradation and ending either with stabilization or with the complete ablation of the concrete of the pit. LEONAR computes several statistical quantities (failure probabilities of the reactor vessel and pit, probability distributions of output variables) and performs sensitivity analyses. This paper presents several examples of LEONAR's use: precise, punctual needs of non-specialist engineers, detailed analyses for experienced users, and help for physical model developers.

  • Numerical Study of the Metamodel Validation Process

    Page(s): 100 - 105

    Complex computer codes are often too time-expensive to be used directly for uncertainty, sensitivity, optimization and robustness analyses. A widely accepted way to circumvent this problem is to replace CPU-expensive computer models with CPU-inexpensive mathematical functions, called metamodels. In this paper, we focus on the essential step of the metamodel validation phase, which consists in evaluating the metamodel's predictivity. This makes it possible to attach confidence levels to the results obtained by using the metamodel instead of the initial numerical model. We propose and test an algorithm which optimizes the distance between the validation points and the metamodel training points in order to estimate the true metamodel predictivity with a minimum number of additional calculations. Comparisons are made with classical validation algorithms, and an application to a nuclear safety computer code is shown. These tests show the relevance of this new validation design, called the Feuillard design.

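    A generic version of the distance-optimizing idea is the greedy maximin selection sketched below in Python: each validation point is chosen from a candidate pool to be as far as possible from the training design and from the points already chosen. This is a common space-filling heuristic given for illustration, not the paper's Feuillard design.

        import numpy as np

        def maximin_validation_points(train, candidates, n_valid):
            """Greedily pick validation points maximizing the minimum distance
            to the training design and to previously chosen validation points."""
            chosen, ref = [], train.copy()
            for _ in range(n_valid):
                # Distance from each candidate to its nearest reference point.
                d = np.min(np.linalg.norm(candidates[:, None, :] - ref[None, :, :], axis=2), axis=1)
                best = int(np.argmax(d))
                chosen.append(candidates[best])
                ref = np.vstack([ref, candidates[best]])
                candidates = np.delete(candidates, best, axis=0)
            return np.array(chosen)

        rng = np.random.default_rng(0)
        train = rng.random((20, 2))   # existing metamodel training design
        cands = rng.random((500, 2))  # candidate pool in the input space
        print(maximin_validation_points(train, cands, n_valid=5))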