IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans

Issue 2 • Date March 2006

Displaying Results 1 - 19 of 19
  • Table of contents

    Page(s): c1
    PDF (39 KB)
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans publication information

    Page(s): c2
    PDF (35 KB)
    Freely Available from IEEE
  • Scheduling web banner advertisements with multiple display frequencies

    Page(s): 245 - 251
    PDF (184 KB) | HTML

    Online advertising continues to be a significant source of income for many Internet-based organizations. Recent indications of improved economic growth are having an impact on advertisement revenue, with estimated online advertising revenue in the United States for the fourth quarter of 2003 totaling a record $2.2 billion. A substantial portion of this income comes from banner advertisements, and efficient scheduling of these advertisements could result in a considerable increase in profits. The problem of scheduling banner advertisements has been observed to be intractable via traditional optimization techniques and has received only limited attention in the literature. In addition, all past attempts to address this problem have been based on an "all-or-nothing" framework, where a customer specifies the exact number of copies of the ad to be displayed over the planning horizon if the ad is selected for display by the provider of the advertisement space. This paper extends the problem to a more realistic setting, where the customer is allowed to specify a set of acceptable display frequencies. The Lagrangian decomposition-based solution approaches presented in this paper are observed to provide good schedules in a reasonable amount of time.
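
    The selection subproblem this setting induces can be pictured as a multiple-choice knapsack: each customer offers several acceptable display frequencies with associated revenues, and the provider picks at most one frequency per customer within a slot budget. The sketch below illustrates that subproblem only, with invented numbers; it does not reproduce the paper's Lagrangian decomposition or the slot-geometry constraints of the full schedule.

        # Hypothetical toy model: pick at most one acceptable frequency per
        # customer so that revenue is maximal and the total number of displayed
        # copies fits the slot budget. All numbers below are invented.
        def select_frequencies(customers, total_slots):
            """customers: one list of (frequency, revenue) options per customer."""
            NEG = float("-inf")
            best = [0.0] + [NEG] * total_slots   # best[c]: max revenue using exactly c slots
            for options in customers:
                new = best[:]                    # skipping this customer is allowed
                for freq, revenue in options:
                    for used in range(total_slots - freq, -1, -1):
                        if best[used] > NEG and best[used] + revenue > new[used + freq]:
                            new[used + freq] = best[used] + revenue
                best = new
            return max(best)

        demo = [[(2, 5.0), (4, 9.0)], [(1, 2.0), (3, 7.0)], [(2, 4.0)]]
        print(select_frequencies(demo, 6))       # -> 13.0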

  • BioWar: scalable agent-based model of bioattacks

    Page(s): 252 - 265
    PDF (448 KB)

    While structured by social and institutional networks, disease outbreaks are modulated by physical, economic, technological, communication, health, and governmental infrastructures. To reason systematically about the nature of outbreaks, the potential outcomes of media, prophylaxis, and vaccination campaigns, and the relative value of various early warning devices, both social context and infrastructure must be considered. Numerical models provide a cost-effective and ethical system for reasoning about such events. BioWar, a scalable citywide multiagent network numerical model, is described in this paper. BioWar simulates individuals as agents who are embedded in social, health, and professional networks and tracks the incidence of background and maliciously introduced diseases. In addition to epidemiology, BioWar simulates health-care-seeking behaviors, absenteeism patterns, and pharmaceutical purchases, information useful for syndromic and behavioral surveillance algorithms.
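
    A bare-bones agent-based outbreak model in the same spirit: agents on a random contact network, one introduced infection, and a daily count of infectious agents standing in for an absenteeism-style behavioral signal. Population size, contact degree, and disease parameters below are invented, not BioWar's.

        # Bare-bones SIR-style outbreak on a random contact network.
        import random

        random.seed(1)
        N, DAYS, P_TRANSMIT, RECOVER = 200, 30, 0.05, 7
        contacts = {i: random.sample(range(N), 8) for i in range(N)}  # social network
        state = {i: "S" for i in range(N)}                            # S / I / R
        sick_days = {i: 0 for i in range(N)}
        state[0] = "I"                                                # index case

        for day in range(DAYS):
            newly_infected = []
            for i in range(N):
                if state[i] != "I":
                    continue
                sick_days[i] += 1
                if sick_days[i] >= RECOVER:
                    state[i] = "R"                                    # recovered, immune
                    continue
                for j in contacts[i]:
                    if state[j] == "S" and random.random() < P_TRANSMIT:
                        newly_infected.append(j)
            for j in newly_infected:
                state[j] = "I"
            print(f"day {day:2d}: infectious = {sum(s == 'I' for s in state.values())}")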

  • Belief rule-base inference methodology using the evidential reasoning approach - RIMER

    Page(s): 266 - 285
    PDF (488 KB) | HTML

    In this paper, a generic belief rule-base inference methodology using the evidential reasoning approach (RIMER) is proposed. Existing knowledge-base structures are first examined, and knowledge representation schemes under uncertainty are then briefly analyzed. Based on this analysis, a new knowledge representation scheme in a rule base is proposed using a belief structure. In this scheme, a rule base is designed with belief degrees embedded in all possible consequents of a rule. Such a rule base is capable of capturing vagueness, incompleteness, and nonlinear causal relationships, while traditional if-then rules can be represented as a special case. Other knowledge representation parameters, such as the weights of both attributes and rules, are also investigated in the scheme. In an established rule base, an input to an antecedent attribute is transformed into a belief distribution. Subsequently, inference in such a rule base is implemented using the evidential reasoning (ER) approach. The scheme is further extended to inference in hierarchical rule bases. A numerical study is provided to illustrate the potential applications of the proposed methodology.
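
    The flavor of a belief rule base can be shown in a few lines: rules carry belief degrees over their consequents, an input activates each rule to a degree, and the activated beliefs are merged. The sketch below merges them with a simple activation-weighted average rather than the paper's recursive ER combination algorithm, and all rules, weights, and matching degrees are invented.

        # Toy belief-rule base with belief degrees embedded in the consequents.
        def infer(rules, matching, consequents):
            """rules: list of (rule_weight, {consequent: belief}) entries;
            matching: matching degree of the input to each rule's antecedent."""
            acts = [w * m for (w, _), m in zip(rules, matching)]
            total = sum(acts) or 1.0
            acts = [a / total for a in acts]     # normalized activation weights
            return {c: sum(a * beliefs.get(c, 0.0) for a, (_, beliefs) in zip(acts, rules))
                    for c in consequents}

        rules = [
            (1.0, {"good": 0.8, "average": 0.2}),    # IF the antecedent is high ...
            (0.9, {"average": 0.5, "poor": 0.5}),    # IF the antecedent is medium ...
        ]
        print(infer(rules, matching=[0.7, 0.3], consequents=["good", "average", "poor"]))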

  • Assembling off-the-shelf components: "Learn as you Go" systems engineering

    Page(s): 286 - 297
    PDF (288 KB) | HTML

    The process of developing new information systems has evolved from custom software development to assembly of off-the-shelf components. The change has significantly reduced both the cost and time to develop new capabilities, and as a notable result, e-business systems have been implemented at a very rapid pace. An assembly sequence (components to be assembled, with corresponding dates and costs) carries several risks: 1) technical risk: whether assembled components function successfully by planned schedule milestones; 2) operational risk: whether the desired business value is achieved by using the new system of assembled components; and 3) programmatic (schedule and cost) risk: whether the assembly is accomplished within time and budget constraints. As assembly proceeds, estimates of technical performance and operational value at the time of system completion can be adjusted, and one should consider what early milestones of component assembly suggest about later milestones. The technical community can be hesitant both to reveal and to ascertain the results of combining off-the-shelf products into a working system, and significant cost and schedule overruns due to technical problems discovered late in system assembly are typical. The operational community can be surprised by the results achieved in applying new capabilities, causing significant changes to what was originally desired from a new system. This paper presents a framework for planning and adjusting milestone sequences in assembling off-the-shelf software components. The framework balances technical and operational risks within established cost and time constraints.

  • Effectiveness of visual interactive modeling in the context of multiple-criteria group decisions

    Page(s): 298 - 318
    PDF (712 KB) | HTML

    Significant research on the impact of information presentation on decision processes, group-decision support systems, and multicriteria decision making has occurred over the past 10 to 15 years. Advances in hardware and software technologies have significantly reduced costs and enhanced information-presentation capability, specifically in the areas of virtual reality and visual interactive modeling (VIM). These visualization technologies can aid the decision maker in assimilating complex qualitative and quantitative information and allow the abstraction of a vast information space. Thus, in group-decision situations, visualization has the potential to enhance the decision makers' ability to make appropriate tradeoffs and to improve communication between group members, resulting in quicker and better consensus decisions. This paper studies the effectiveness of advanced information-presentation technologies such as VIM in complex decision situations involving multiple criteria and groups of decision makers. The effectiveness of VIM is evaluated through a controlled experimental study. The study finds that VIM leads to greater efficiency in decision making and improved group-member attitude and satisfaction with the decision-making process and the group-decision solution. However, somewhat contrary to our a priori expectations, the quality of the decisions made by groups using VIM is no better than that of decisions made without this support.

  • An exploratory study of chaos in human-machine system dynamics

    Page(s): 319 - 326
    PDF (224 KB)

    Human-machine system behavior and performance are dynamic, nonlinear, and possibly chaotic. Various techniques have been used to describe such dynamic and nonlinear system characteristics, but these techniques have rarely been able to accommodate the chaotic behavior of such a nonlinear system. This study therefore proposes nonlinear dynamic system theory as one possible technique to account for the dynamic, nonlinear, and possibly chaotic characteristics of human-machine systems. It briefly describes some of the available nonlinear dynamic system techniques and illustrates how their application can explain various properties of the human-machine system; a pilot's heart interbeat interval (IBI) and altitude tracking error time series data are used in the illustration. Further, possible applications of the theory in various domains of human factors for on-line assessment, short-term prediction, and control of human-machine system behavior and performance are discussed.
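
    The usual entry point for such analyses is time-delay embedding, which unfolds a scalar record such as an IBI series into state-space vectors on which nonlinear invariants (correlation dimension, Lyapunov exponents) can then be estimated. A minimal sketch, with an invented series and illustrative embedding parameters:

        # Time-delay (Takens) embedding of a scalar time series.
        def delay_embed(series, dim=3, tau=2):
            """Return vectors x_t = (s[t], s[t+tau], ..., s[t+(dim-1)*tau])."""
            span = (dim - 1) * tau
            return [tuple(series[t + k * tau] for k in range(dim))
                    for t in range(len(series) - span)]

        ibi = [0.81, 0.79, 0.84, 0.80, 0.77, 0.83, 0.85, 0.78, 0.82, 0.80]
        for v in delay_embed(ibi, dim=3, tau=2)[:3]:
            print(v)    # first three reconstructed state vectors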

  • Deadlock-free scheduling and control of flexible manufacturing cells using automata theory

    Page(s): 327 - 337
    PDF (672 KB)

    This paper presents a novel method for the scheduling and control of flexible manufacturing cells (FMCs). The approach employs automata, augmented by time labels proposed herein, for the modeling of machines, transportation devices, buffers, precedence constraints, and part routes. Ramadge and Wonham's supervisory control theory is then used to synthesize a deadlock-free controller that is also capable of keeping track of time. For a given set of parts to be processed by the cell, an A* search algorithm is subsequently employed using a proposed heuristic function. Three different production configurations are considered: Case 1) each part has a unique route; Case 2) parts may have multiple routes, but the same devices in each route; and Case 3) parts may have multiple routes with different devices. The proposed approach yields optimal deadlock-free schedules for the first two cases. For Case 3, our simulations have yielded effective solutions, but in practice, optimal deadlock-free schedules may not be obtainable without sacrificing computational time efficiency. One such non-time-efficient method is included in this paper. The proposed approach is illustrated through three typical manufacturing-cell simulation examples: the first adapted from a Petri-net-based scheduling paper, the second adapted from a mathematical-programming-based scheduling paper, and the third a new example that deals with a more complex FMC scenario in which parts have multiple routes for their production. These and other simulations clearly demonstrate the effectiveness of the proposed automata-based scheduling methodology.
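
    The search component of such an approach is ordinary A*: expand states in order of cost-so-far plus an admissible heuristic that never overestimates the remaining cost. The sketch below runs A* on a small weighted graph standing in for the supervisor's state space; the graph and the zero heuristic are placeholders, not the paper's heuristic function.

        # Generic A* search over a weighted graph.
        import heapq

        def a_star(start, goal, neighbors, h):
            frontier = [(h(start), 0, start, [start])]
            best_g = {start: 0}
            while frontier:
                f, g, node, path = heapq.heappop(frontier)
                if node == goal:
                    return g, path
                for nxt, cost in neighbors(node):
                    g2 = g + cost
                    if g2 < best_g.get(nxt, float("inf")):
                        best_g[nxt] = g2
                        heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
            return None

        graph = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)],
                 "C": [("D", 1)], "D": []}
        print(a_star("A", "D", lambda n: graph[n], h=lambda n: 0))
        # -> (4, ['A', 'B', 'C', 'D'])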

  • Optimal design of link structure for e-supermarket website

    Page(s): 338 - 355
    PDF (435 KB) | HTML

    The objective of this research is to optimize the link structure of webpages for an e-supermarket. Customers look for greater convenience in shopping when they visit the website of an e-supermarket, while e-supermarket managers prefer that webpages containing information about profitable products be visited more frequently. To balance the interests of both parties and to aid the webmaster in updating the website regularly, we present a mathematical model whose objective is to minimize the overall weighted distance between webpages. An updating algorithm is used to determine the distance between pages and is shown to be more efficient under certain special circumstances. We propose the statistical Hopfield neural-network and the strategic-oscillation-based tabu-search algorithms as solution methods; the former is appropriate for optimizing small-scale problems, while the latter is good at solving large-scale problems approximately. The preliminary validity of the model and the performance of the algorithms are demonstrated by experiments on a small website and several large websites, using randomly generated data. The destination pages preferred by customers and website managers are shown to be more accessible after optimization.
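
    Whatever search method is used, its inner loop must evaluate the objective: the sum of importance-weighted shortest-path distances between pages under a candidate link structure. A minimal evaluator using breadth-first search, with an invented four-page site and invented weights:

        # Evaluate the weighted-distance objective for one candidate link structure.
        from collections import deque

        def weighted_distance(links, weights):
            """links: page -> list of linked pages; weights: (src, dst) -> importance."""
            total = 0.0
            for src in links:
                dist = {src: 0}
                queue = deque([src])
                while queue:                          # BFS from src
                    page = queue.popleft()
                    for nxt in links[page]:
                        if nxt not in dist:
                            dist[nxt] = dist[page] + 1
                            queue.append(nxt)
                for (s, d), w in weights.items():
                    if s == src and d in dist:
                        total += w * dist[d]
            return total

        links = {"home": ["dairy", "deals"], "dairy": ["milk"],
                 "deals": ["milk"], "milk": []}
        weights = {("home", "milk"): 5.0, ("home", "deals"): 2.0}
        print(weighted_distance(links, weights))      # -> 5*2 + 2*1 = 12.0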

  • A usability-evaluation metric based on a soft-computing approach

    Page(s): 356 - 372
    PDF (552 KB)

    Usability of software should measure both user preference and user performance. The notion of usability involves several dimensions, including system feedback, consistency, error prevention, performance/efficiency, user like/dislike, and error recovery. Each of these dimensions is characterized by fuzzy aspects and linguistic terms. This paper develops a model for each of the dimensions using fuzzy-set theory and then uses the Takagi-Sugeno fuzzy-inference approach to develop an overall measure of usability. Results are presented for several different user interfaces.
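
    A two-rule Takagi-Sugeno sketch conveys the mechanics: fuzzy memberships over two of the dimensions yield rule firing strengths, and the overall score is the firing-strength-weighted average of linear consequents. The membership functions and consequent coefficients below are invented for illustration, not the paper's model:

        # Minimal Takagi-Sugeno inference over two usability dimensions in [0, 1].
        def low(x):  return max(0.0, 1.0 - x)      # membership: "poor" rating
        def high(x): return max(0.0, x)            # membership: "good" rating

        def usability(feedback, consistency):
            rules = [
                # (firing strength, consequent z = a*feedback + b*consistency + c)
                (min(high(feedback), high(consistency)),
                 0.5 * feedback + 0.4 * consistency + 0.1),
                (max(low(feedback), low(consistency)),
                 0.3 * feedback + 0.3 * consistency),
            ]
            num = sum(w * z for w, z in rules)
            den = sum(w for w, _ in rules) or 1.0
            return num / den                       # weighted average of consequents

        print(round(usability(feedback=0.8, consistency=0.6), 3))  # -> 0.612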

  • Constructing a model hierarchy with background knowledge for structural risk minimization: application to biological treatment of wastewater

    Page(s): 373 - 383
    PDF (216 KB)

    This article introduces a novel approach to learning from empirical data coming from complex systems that are continuous, dynamic, highly nonlinear, and stochastic. The main feature of this approach is that it attempts to integrate powerful statistical learning-theoretic methods with the valuable background knowledge that one possesses about the system under study. The learning machines that have been used, up to now, for the implementation of Vapnik's inductive principle of structural risk minimization (IPSRM) are of the "black-box" type, such as artificial neural networks, ARMA models, or polynomial functions. These are generic models that contain no knowledge about the problem at hand. They are used to approximate the behavior of any system and are prodigal in their requirements of training data. In addition, the conditions that underlie the theory of statistical learning do not hold true when these "black-box" models are used to describe highly complex systems. In this paper, it is argued that using a learning machine whose structure is developed on the basis of the physical mechanisms of the system under study is more advantageous. Such a machine is specific to the problem at hand and requires far fewer data points for training than its black-box counterparts. Furthermore, because this machine contains background knowledge about the system, it provides better approximations of the various dynamic modes of the system and therefore satisfies some of the prerequisites for meeting the conditions of statistical learning theory (SLT). This paper shows how to develop such a mechanistically based learning machine (i.e., a machine that contains background knowledge) for the case of biological wastewater treatment systems. Fuzzy-logic concepts, combined with results of research in the area of wastewater engineering, are utilized to construct such a machine. The machine has a hierarchical property and can therefore be used to implement the IPSRM.

  • Decentralized failure diagnosis of discrete event systems

    Page(s): 384 - 395
    PDF (312 KB) | HTML

    By decentralized diagnosis we mean diagnosis using multiple diagnosers, each possessing its own set of sensors, without any communication among diagnosers or with any coordinator. The notion of decentralized diagnosis is formalized by introducing the notion of codiagnosability, which requires that a failure be detected by one of the diagnosers within a bounded delay. Algorithms of complexity polynomial in the sizes of the system and the nonfault specification are provided for: 1) testing codiagnosability; 2) computing the bound on the delay of diagnosis; 3) offline synthesis of individual diagnosers; and 4) online diagnosis using them. The notion of codiagnosability and the above algorithms are initially presented in the setting of a specification language (whose violation represents a fault) and are later specialized to the case where faults are modeled as occurrences of certain events. The notion of strong codiagnosability is also introduced to capture the ability to be certain about both the failure and the nonfailure conditions in a system within a bounded delay.
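
    The online-diagnosis step can be sketched as state estimation: track every state consistent with the observations, remember for each whether the unobservable fault must have occurred, and announce a failure once every consistent explanation contains it. The automaton below is invented, and the sketch deliberately ignores the decentralized, multiple-diagnoser aspect:

        # State-estimate diagnoser for a small partially observed automaton.
        def diagnose(transitions, start, observable, observations):
            """transitions: (src, event, dst) triples; returns a verdict per observation."""
            def closure(est):            # follow unobservable events, tracking the fault
                est, stack = set(est), list(est)
                while stack:
                    state, faulty = stack.pop()
                    for s, e, d in transitions:
                        if s == state and e not in observable:
                            nxt = (d, faulty or e == "fault")
                            if nxt not in est:
                                est.add(nxt)
                                stack.append(nxt)
                return est

            est, verdicts = closure({(start, False)}), []
            for obs in observations:
                est = closure({(d, f) for (s, f) in est
                               for (s2, e, d) in transitions if s2 == s and e == obs})
                verdicts.append("fault" if est and all(f for _, f in est) else "uncertain")
            return verdicts

        trans = [("q0", "a", "q1"), ("q0", "fault", "q2"), ("q2", "a", "q3"),
                 ("q1", "b", "q4"), ("q3", "c", "q4")]
        print(diagnose(trans, "q0", {"a", "b", "c"}, ["a", "c"]))
        # -> ['uncertain', 'fault']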

  • A supervised clustering and classification algorithm for mining data with mixed variables

    Page(s): 396 - 406
    PDF (224 KB) | HTML

    This paper presents a data mining algorithm based on supervised clustering that learns data patterns and uses these patterns for data classification. The algorithm enables scalable incremental learning of patterns from data with both numeric and nominal variables. Two different methods of combining numeric and nominal variables in calculating the distance between clusters are investigated. In one method, separate distance measures are calculated for the numeric and nominal variables and then combined into an overall distance measure. In the other method, nominal variables are converted into numeric variables, and a distance measure is calculated using all variables. We analyze the computational complexity, and thus the scalability, of the algorithm and test its performance on a number of data sets from various application domains. The prediction accuracy and reliability of the algorithm are analyzed, tested, and compared with those of several other data mining algorithms.
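
    The first of the two distance constructions is easy to state concretely: a Euclidean distance over the numeric variables, a mismatch proportion over the nominal ones, and a mixing weight to combine them. The records and the weight below are invented for illustration:

        # Combined numeric/nominal distance between two records.
        import math

        def mixed_distance(a, b, numeric, nominal, alpha=0.5):
            """a, b: dict records; alpha weights the numeric part vs. the nominal part."""
            d_num = math.sqrt(sum((a[k] - b[k]) ** 2 for k in numeric))
            d_nom = sum(a[k] != b[k] for k in nominal) / len(nominal)
            return alpha * d_num + (1 - alpha) * d_nom

        x = {"age": 0.42, "income": 0.30, "color": "red",  "region": "east"}
        y = {"age": 0.50, "income": 0.10, "color": "blue", "region": "east"}
        print(mixed_distance(x, y, numeric=["age", "income"], nominal=["color", "region"]))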

  • Optimization of lift-gas allocation using dynamic programming

    Page(s): 407 - 414
    PDF (232 KB)

    The continuous gas-lift method is one of the most widely used artificial lift techniques, and the allocation of gas injection rates within it is a very important optimization problem. In this work, we develop a dynamic programming (DP) algorithm that solves the profit maximization problem for a cluster of oil wells producing via gas lift, with multiple well performance curves (WPCs) and constrained by the amount of lift gas available for injection. The algorithm is a low-cost, high-efficiency decision-support tool that outperforms alternative methods found in the literature.
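
    With the WPCs discretized into profit-per-injection-level tables, the allocation becomes a classic resource-allocation DP over wells. A minimal sketch with invented curves, not the paper's data or its exact formulation:

        # Allocate discrete units of lift gas across wells to maximize total profit.
        def allocate_gas(wpcs, total_units):
            """wpcs: one profit list per well, indexed by gas units injected."""
            best = [0.0] * (total_units + 1)       # best[g]: max profit with g units
            for curve in wpcs:
                best = [max(best[g - q] + curve[q]
                            for q in range(min(g, len(curve) - 1) + 1))
                        for g in range(total_units + 1)]
            return best[total_units]

        wpcs = [
            [0.0, 4.0, 6.0, 7.0],    # well 1: profit by units injected, diminishing returns
            [0.0, 3.0, 5.5, 6.5],    # well 2
            [0.0, 5.0, 7.0, 7.5],    # well 3
        ]
        print(allocate_gas(wpcs, 5))  # 5 units across three wells -> 16.5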

  • IEEE order form for reprints

    Page(s): 415
    PDF (354 KB)
    Freely Available from IEEE
  • Quality without compromise [advertisement]

    Page(s): 416
    PDF (319 KB)
    Freely Available from IEEE
  • IEEE Systems, Man, and Cybernetics Society Information

    Page(s): c3
    PDF (26 KB)
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans Information for authors

    Page(s): c4
    PDF (35 KB)
    Freely Available from IEEE

Aims & Scope

The fields of systems engineering and human-machine systems: systems engineering includes efforts that involve issue formulation, issue analysis and modeling, and decision making and issue interpretation at any of the life-cycle phases associated with the definition, development, and implementation of large systems.

This Transactions ceased production in 2012. The current retitled publication is IEEE Transactions on Systems, Man, and Cybernetics: Systems.

Full Aims & Scope

Meet Our Editors

Editor-in-Chief
Dr. Witold Pedrycz
University of Alberta