
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews

Volume 38, Issue 6 • November 2008

Displaying Results 1 - 24 of 24
  • Table of contents

    Publication Year: 2008 , Page(s): C1
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews publication information

    Publication Year: 2008 , Page(s): C2
    Freely Available from IEEE
  • Beyond Traditional Kernels: Classification in Two Dissimilarity-Based Representation Spaces

    Publication Year: 2008 , Page(s): 729 - 744
    Cited by:  Papers (6)

    Proximity captures the degree of similarity between examples and is thereby fundamental in learning. Learning from pairwise proximity data usually relies either on kernel methods with specifically designed kernels or on the nearest neighbor (NN) rule. Kernel methods are powerful but often cannot handle arbitrary proximities without corrections. The NN rule can work well in such cases but suffers from local decisions. The aim of this paper is to provide explanation and insight concerning two simple yet powerful alternatives for cases in which neither conventional kernel methods nor the NN rule performs best. These strategies use two proximity-based representation spaces (RSs) in which accurate classifiers are trained on all training objects while requiring comparisons to only a small set of prototypes. They can handle all meaningful dissimilarity measures, including non-Euclidean and nonmetric ones. Practical examples illustrate that these RSs can be highly advantageous in supervised learning: simple classifiers built there tend to outperform the NN rule, and computational complexity can be controlled. Consequently, these approaches offer an appealing way to learn from proximity data when kernel methods cannot be applied directly, are too costly, or are impractical, and when the NN rule leads to noisy results.

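    A minimal sketch of the general idea above (not the authors' algorithms or data): each object is represented by its vector of dissimilarities to a small set of prototypes, and an ordinary classifier is trained on that representation. Euclidean distance and the Iris data stand in for the arbitrary, possibly nonmetric, dissimilarity measures discussed in the paper; the function and prototype choices are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def dissimilarity_representation(X, prototypes):
    """Map each row of X to its dissimilarities (here, Euclidean) to the prototypes."""
    return np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A small set of prototypes drawn at random from the training objects.
rng = np.random.default_rng(0)
prototypes = X_tr[rng.choice(len(X_tr), size=10, replace=False)]

# Train a simple linear classifier in the dissimilarity-based representation space.
clf = LogisticRegression(max_iter=1000)
clf.fit(dissimilarity_representation(X_tr, prototypes), y_tr)
print("test accuracy:", clf.score(dissimilarity_representation(X_te, prototypes), y_te))
```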
  • A Business Process Intelligence System for Enterprise Process Performance Management

    Publication Year: 2008 , Page(s): 745 - 756
    Cited by:  Papers (23)

    Business process management systems have traditionally focused on supporting the modeling and automation of business processes, with the objective of enabling fast and cost-effective process execution. As more and more processes become automated, customers become increasingly interested in managing process execution. This paper presents a set of concepts and a methodology for business process intelligence (BPI) based on dynamic process performance evaluation, including measurement models based on activity-based management (ABM) and a dynamic enterprise process performance evaluation methodology. The proposed measurement models support the analysis of six process flows within a manufacturing enterprise, namely activity flow, information flow, resource flow, cost flow, cash flow, and profit flow, which are crucial for enterprise managers to control process execution quality and to detect problems and areas for improvement. The proposed process performance evaluation methodology uses time, quality, service, cost, speed, efficiency, and importance as seven evaluation criteria. A prototype system supporting dynamic enterprise process modeling, analysis of the six process flows, and process performance prediction has been implemented to validate the proposed methodology.

  • Reliability-Constrained Optimum Placement of Reclosers and Distributed Generators in Distribution Networks Using an Ant Colony System Algorithm

    Publication Year: 2008 , Page(s): 757 - 764
    Cited by:  Papers (25)

    Optimal placement of protection devices and distributed generators (DGs) in radial feeders is important to ensure power system reliability, and distributed generation is being adopted in distribution networks partly with the objective of enhancing system reliability. In this paper, an ant colony system algorithm is used to derive the optimal recloser and DG placement scheme for radial distribution networks. A composite reliability index is used as the objective function in the optimization procedure. Simulations are carried out on two practical distribution systems to validate the effectiveness of the proposed method, and comparative studies with a genetic algorithm are also conducted.

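    A toy ant-colony-style sketch of the optimization loop described above, not the paper's algorithm or its composite reliability index: ants choose a fixed number of candidate nodes for device placement according to pheromone levels, the best placement reinforces the pheromone, and the rest evaporates. The network size, the node_benefit values, and the stand-in objective are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, n_devices, n_ants, n_iter = 20, 3, 15, 50
evaporation, deposit = 0.1, 1.0

# Hypothetical per-node "benefit" that gives the toy objective some shape.
node_benefit = rng.random(n_nodes)

def reliability_index(placement):
    # Stand-in for the paper's composite reliability index.
    return node_benefit[placement].sum()

pheromone = np.ones(n_nodes)
best_placement, best_score = None, -np.inf

for _ in range(n_iter):
    for _ in range(n_ants):
        probs = pheromone / pheromone.sum()
        placement = rng.choice(n_nodes, size=n_devices, replace=False, p=probs)
        score = reliability_index(placement)
        if score > best_score:
            best_placement, best_score = placement, score
    pheromone *= 1.0 - evaporation                     # evaporation
    pheromone[best_placement] += deposit * best_score  # reinforce the best placement

print("best placement:", sorted(best_placement.tolist()), "score:", round(best_score, 3))
```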
  • Short-Term Schedulability Analysis of Crude Oil Operations in Refinery With Oil Residency Time Constraint Using Petri Nets

    Publication Year: 2008 , Page(s): 765 - 778
    Cited by:  Papers (14)

    A short-term schedule for an oil refinery should arrange all activities in detail over the whole scheduling horizon, which leads to a complex problem, and efficient techniques and software tools applicable to industrial oil refineries are lacking. Because the feasibility of a schedule is essential, this paper studies the feasibility problem from a control perspective. A short-term schedule is composed of a series of operation decisions (ODs), each of which can be seen as a control; when an OD is executed, it transfers the system from one state to another. To guarantee schedule feasibility, the system should always be kept in safe states. The system is modeled by a Petri net that is under the control of the ODs. With this model, schedulability conditions for a system with one distiller are presented. These conditions reveal the relationship among the number of charging tanks, the oil transportation flow rate of the pipeline, and the production rate, and they are given in a constructive way. Based on these conditions, once a refining schedule is verified as realizable, a detailed short-term schedule is created for practical use.

  • Why Did the California Electricity Crisis Occur?: A Numerical Analysis Using a Multiagent Intelligent Simulator

    Publication Year: 2008 , Page(s): 779 - 790
    Cited by:  Papers (7)

    During the summer of 2000, wholesale electricity prices in California were approximately 500% higher than those during the same months in 1998-1999. The price hike was unexpected by many policy makers and individuals involved in the electric utility industry, who had long wondered whether the electricity deregulation policy of 1996 produced the benefits of competition promised to consumers. This study proposes the use of a multiagent intelligent simulator (MAIS) to numerically examine why the crisis occurred between May 2000 and January 2001. The MAIS explains the fluctuation of wholesale electricity prices during the crisis with an estimation accuracy of 91.15%. We also find that 40.46% of the price increase was due to an increase in marginal production cost, 17.85% to traders' greediness, 5.27% to a real demand change, and 3.56% to market power; the remaining 32.86% came from other, unknown components. This result indicates that the price hike occurred mainly because of an increase in fuel prices and real demand: the two market fundamentals explained 45.73% (=40.46% + 5.27%) of the price increase, while the responsibility of energy firms was 21.41% (=17.85% + 3.56%). The numerical evidence differs from the well-known research of Joskow and Kahn, which attributed the crisis to the exercise of market power by large energy firms.

  • Machine Vision/GPS Integration Using EKF for the UAV Aerial Refueling Problem

    Publication Year: 2008 , Page(s): 791 - 801
    Cited by:  Papers (7)

    The purpose of this paper is to propose the application of an extended Kalman filter (EKF) to the sensor fusion task within the problem of aerial refueling for unmanned aerial vehicles (UAVs). Specifically, the EKF is used to combine position data from a global positioning system (GPS) and a machine vision (MV)-based system to provide a reliable estimate of the tanker-UAV relative position throughout the docking and refueling phases. The performance of the scheme has been evaluated using a virtual environment developed specifically for the study of the UAV aerial refueling problem. In particular, the EKF-based sensor fusion scheme integrates GPS data with MV-based estimates of the tanker-UAV position derived through a combination of feature extraction, feature classification, and pose estimation algorithms. The results indicate that the accuracy of the relative position obtained from GPS or MV estimates can be improved by at least one order of magnitude when the EKF is used in lieu of other sensor fusion techniques.

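    A minimal Kalman-filter sketch of the fusion step described above, assuming linear measurement models for brevity (a full EKF, as in the paper, would linearize a nonlinear MV measurement model). The constant-velocity state model, the noise levels, and the simulated measurements are assumptions made for illustration.

```python
import numpy as np

dt = 0.1
F = np.block([[np.eye(3), dt * np.eye(3)], [np.zeros((3, 3)), np.eye(3)]])  # state transition
H = np.hstack([np.eye(3), np.zeros((3, 3))])                                # position measurement
Q = 1e-3 * np.eye(6)                                # process noise
R_gps, R_mv = 4.0 * np.eye(3), 0.25 * np.eye(3)     # assumed sensor covariances

x, P = np.zeros(6), np.eye(6)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
true_pos = np.array([10.0, -5.0, 2.0])              # notional relative position
for _ in range(100):
    x, P = predict(x, P)
    x, P = update(x, P, true_pos + rng.normal(0, 2.0, 3), R_gps)  # GPS-like fix
    x, P = update(x, P, true_pos + rng.normal(0, 0.5, 3), R_mv)   # vision-based fix
print("fused relative position estimate:", np.round(x[:3], 2))
```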
  • A Hybrid System Integrating a Wavelet and TSK Fuzzy Rules for Stock Price Forecasting

    Publication Year: 2008 , Page(s): 802 - 815
    Cited by:  Papers (21)

    The prediction of future time series values based on past and present information is useful and necessary for many industrial and financial applications. In this study, a novel approach that integrates wavelet transforms and Takagi-Sugeno-Kang (TSK) fuzzy-rule-based systems for stock price prediction is developed. A wavelet transform using the Haar wavelet is applied to decompose the time series in the Haar basis, and from the hierarchical scalewise decomposition a number of informative representations of the series are selected for further analysis. A TSK fuzzy-rule-based system is then employed to predict the stock price based on a set of selected technical indices. To avoid rule explosion, the k-means algorithm is applied to cluster the data, and a fuzzy rule is generated in each cluster. Finally, a k-nearest-neighbor (KNN) method is applied over a sliding window to further fine-tune the forecast from the TSK model. Simulation results show that the model successfully forecasts price variation with accuracy up to 99.1% on the Taiwan Stock Exchange index. Comparative studies with existing prediction models indicate that the proposed model is very promising and can be implemented in a real-time trading system for stock price prediction.

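    An illustrative sketch of the wavelet preprocessing step only (the TSK fuzzy rules, k-means clustering, and KNN refinement are not shown): a multilevel Haar decomposition computed by pairwise averaging and differencing. The price values are invented and the series length is assumed to be a power of two.

```python
import numpy as np

def haar_decompose(series, levels):
    """Return (approximation, [details per level]) for a length-2**k series."""
    approx = np.asarray(series, dtype=float)
    details = []
    for _ in range(levels):
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))  # detail coefficients
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)         # smooth approximation
    return approx, details

prices = np.array([100, 101, 103, 102, 105, 107, 106, 108], dtype=float)
approx, details = haar_decompose(prices, levels=2)
print("approximation:", np.round(approx, 3))
print("finest details:", np.round(details[0], 3))
```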
  • Gaussian Mixture Modeling of Keystroke Patterns for Biometric Applications

    Publication Year: 2008 , Page(s): 816 - 826
    Cited by:  Papers (13)

    The keystroke patterns produced during typing have been shown to be unique biometric signatures. Therefore, these patterns can be used as digital signatures to verify the identity of computer users remotely over the Internet or locally at a specific workstation. In particular, keystroke recognition can enhance the username and password security model by monitoring the way that these strings are typed. To this end, this paper proposes a novel up-up keystroke latency (UUKL) feature and compares its performance with existing features using a Gaussian mixture model (GMM)-based verification system that utilizes an adaptive and user-specific threshold based on the leave-one-out method (LOOM). The results show that the UUKL feature significantly outperforms the commonly used key hold-down time (KD) and down-down keystroke latency (DDKL) features. Overall, the inclusion of the UUKL feature led to an equal error rate (EER) of 4.4% based on a database of 41 users, which is a 2.1% improvement as compared to the existing features. Comprehensive results are also presented for a two-stage authentication system that has shown significant benefits. Lastly, due to many inconsistencies in previous works, a formal keystroke protocol is recommended that consolidates a number of parameters concerning how to improve performance, reliability, and accuracy of keystroke-recognition systems.

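    A minimal sketch of one ingredient described above, assuming made-up key-release timestamps and a placeholder decision threshold: up-up latencies are extracted from consecutive key releases, a per-user Gaussian mixture model is fitted on enrollment samples, and a login attempt is scored by its average log-likelihood. The paper's adaptive leave-one-out thresholding and two-stage authentication are not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def uukl_features(key_up_times):
    """Latencies between consecutive key releases, as a column vector."""
    t = np.asarray(key_up_times, dtype=float)
    return np.diff(t).reshape(-1, 1)

# Enrollment: several repetitions of the user's password (release times in seconds, invented).
enrollment = [
    [0.00, 0.18, 0.35, 0.55, 0.71, 0.90],
    [0.00, 0.17, 0.36, 0.52, 0.70, 0.88],
    [0.00, 0.20, 0.34, 0.56, 0.73, 0.93],
]
X = np.vstack([uukl_features(rep) for rep in enrollment])
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0).fit(X)

# Verification: accept if the average log-likelihood clears a user-specific threshold.
attempt = uukl_features([0.00, 0.19, 0.35, 0.54, 0.72, 0.91])
score = gmm.score(attempt)   # mean log-likelihood per latency
threshold = -2.0             # placeholder; the paper derives it via leave-one-out
print("score:", round(score, 3), "accepted:", score > threshold)
```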
  • Predicting the Parts Weight in Plastic Injection Molding Using Least Squares Support Vector Regression

    Publication Year: 2008 , Page(s): 827 - 833
    Cited by:  Papers (9)  |  Patents (2)

    To achieve the desired quality in plastic injection molding, advanced monitoring techniques are often recommended in the workshop. Unfortunately, signals in the plastic injection molding process that are relevant to part quality, such as nozzle pressure, are not easy to obtain because of the cost of sensors, so a sensor-based modeling approach is adopted instead. In this paper, a new method for predicting part weight in plastic injection molding using least squares support vector regression (LS-SVR) is proposed, which is composed of two steps. The first step is to estimate the nozzle pressure from the hydraulic system pressure using an LS-SVR model. The second step is to predict product weight from the estimated nozzle pressure using another LS-SVR model. The experimental results show that the new method is very effective.

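    A sketch of the two-step prediction idea above, with scikit-learn's kernel ridge regression standing in for LS-SVR (the two are closely related least-squares kernel methods; this is not the authors' model). The synthetic pressures, weights, and hyperparameters are placeholders.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
hydraulic = rng.uniform(40, 80, size=(60, 1))                    # hydraulic system pressure
nozzle = 1.8 * hydraulic + rng.normal(0, 1.0, hydraulic.shape)   # "measured" nozzle pressure
weight = 0.05 * nozzle.ravel() + 2.0 + rng.normal(0, 0.05, 60)   # part weight

# Step 1: estimate nozzle pressure from hydraulic pressure.
step1 = KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3).fit(hydraulic, nozzle.ravel())
nozzle_hat = step1.predict(hydraulic).reshape(-1, 1)

# Step 2: predict part weight from the estimated nozzle pressure.
step2 = KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3).fit(nozzle_hat, weight)

new_hydraulic = np.array([[65.0]])
predicted_weight = step2.predict(step1.predict(new_hydraulic).reshape(-1, 1))
print("predicted part weight:", round(float(predicted_weight[0]), 3))
```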
  • Two-Layered Confabulation Architecture for an Artificial Creature's Behavior Selection

    Publication Year: 2008 , Page(s): 834 - 840
    Cited by:  Papers (2)

    This paper proposes a novel two-layered confabulation architecture that allows an artificial creature to select a proper behavior by considering, in turn, its internally generated will and the context of the external environment. The architecture is composed of seven main modules for perception, internal state, context, memory, learning, behavior selection, and actuation. The two-layered confabulation in the behavior module proceeds as a will-based confabulation followed by a context-based confabulation, each referring to confabulation probabilities stored in the memory module. An arbiter in the behavior module then chooses a proper behavior among those suggested by the two confabulations, and this behavior is put into action. To demonstrate the effectiveness of the proposed architecture, experiments are carried out with an artificial creature implemented in a 3-D virtual environment, which behaves according to its will while considering the context of the environment.

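    A heavily simplified sketch in the spirit of the two-layered selection described above (not the authors' confabulation architecture): candidate behaviors are first ranked by how well they match the internal will, the shortlist is then re-scored against the external context, and an arbiter picks the winner. The behaviors and probabilities are invented placeholders.

```python
# Invented conditional probabilities; a real system would read these from a memory module.
p_behavior_given_will = {"play": 0.5, "eat": 0.3, "sleep": 0.2}
p_behavior_given_context = {"play": 0.1, "eat": 0.7, "sleep": 0.2}  # e.g., food is nearby

# Stage 1: will-based confabulation keeps the most cogent candidates.
shortlist = sorted(p_behavior_given_will, key=p_behavior_given_will.get, reverse=True)[:2]

# Stage 2: context-based confabulation re-scores the shortlist; the arbiter
# multiplies the two conditional scores and selects the maximum.
arbiter_scores = {b: p_behavior_given_will[b] * p_behavior_given_context[b] for b in shortlist}
chosen = max(arbiter_scores, key=arbiter_scores.get)
print("selected behavior:", chosen, arbiter_scores)
```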
  • Comments on "An Adaptive Multimodal Biometric Management Algorithm"

    Publication Year: 2008 , Page(s): 841 - 843
    Cited by:  Papers (2)

    We note that there are some discrepancies in the results reported in the above-titled paper. Our experiments indicate that the authors considered only a subset of all possible fusion rules, contradicting the statement that all possible rules were considered. Moreover, the authors state that only monotonic rules can be optimal and that all other rules can therefore be ignored. However, our experimental results, which examine all possible rules, demonstrate that a nonmonotonic rule can also be an optimal fusion rule.

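    A small sketch of the combinatorial point being made above, under the generic setup of fusing two binary classifier decisions: there are 2^(2^2) = 16 possible fusion rules, and some of them (e.g., XOR) are not monotonic, so restricting the search to monotonic rules risks missing the optimum. The enumeration is generic, not the commenters' experiment.

```python
from itertools import product

inputs = list(product([0, 1], repeat=2))  # decision pairs (0,0), (0,1), (1,0), (1,1)

def is_monotonic(rule):
    """rule maps each input tuple to 0/1; check a <= b componentwise implies rule[a] <= rule[b]."""
    return all(rule[a] <= rule[b]
               for a in inputs for b in inputs
               if all(x <= y for x, y in zip(a, b)))

# Every possible fusion rule assigns one output bit to each of the four input pairs.
all_rules = [dict(zip(inputs, bits)) for bits in product([0, 1], repeat=4)]
monotonic = [r for r in all_rules if is_monotonic(r)]
print(f"{len(all_rules)} possible fusion rules, {len(monotonic)} of them monotonic")

# Example of a nonmonotonic rule: XOR of the two classifier decisions.
xor_rule = {a: a[0] ^ a[1] for a in inputs}
print("XOR rule monotonic?", is_monotonic(xor_rule))
```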
  • Vision-Based Grasp Tracking for Planar Objects

    Publication Year: 2008 , Page(s): 844 - 849
    Cited by:  Papers (2)

    In robotics, the manipulation of a priori unknown objects involves several steps and problems that must be carefully considered and solved by proper planning and control algorithms. For example, once suitable contact points have been computed, the control system should be able to track them in the approach phase, i.e., while the relative position and orientation of the object and the gripper change due to the approaching movement of the robot toward the object. This correspondence paper proposes a practical method for tracking grasp points in image space that is based on transferring previously computed grasp points from an initial image to subsequent ones and on analyzing the new grasp configuration. Three different options are proposed for this transfer. Experimental results demonstrate the practical performance of the general procedure.

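    One possible way to transfer grasp points between images of a planar object, offered as a generic illustration rather than as one of the paper's three options: matched feature correspondences define a homography, and the previously computed grasp points are mapped through it with OpenCV. All coordinates are invented.

```python
import numpy as np
import cv2

# Matched feature correspondences between the initial and the current image (invented).
pts_initial = np.float32([[100, 120], [220, 115], [230, 240], [95, 250]])
pts_current = np.float32([[130, 140], [245, 150], [240, 270], [110, 265]])

# Planar object: a homography relates the two views.
H, _ = cv2.findHomography(pts_initial, pts_current)

# Grasp points computed in the initial image, transferred to the current one.
grasp_initial = np.float32([[150, 180], [200, 200]]).reshape(-1, 1, 2)
grasp_current = cv2.perspectiveTransform(grasp_initial, H)
print("transferred grasp points:\n", grasp_current.reshape(-1, 2))
```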
  • IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews information for authors

    Publication Year: 2008 , Page(s): C4
    Freely Available from IEEE
  • A Tool for the Accumulation and Evaluation of Multimodal Research

    Publication Year: 2008 , Page(s): 850 - 855
    Cited by:  Papers (1)

    A surge of interest exists in multimodal research and interfaces. This is due, at least in part, to an exponential increase in the amount and type of information that can be presented to a user. When a great deal of information is presented via a single sensory modality, it can exceed the operator's capacity to manage the information efficiently, generating cognitive overload. As a consequence, the user's performance becomes susceptible to slower response times, loss of situational awareness, faulty decision making, and execution errors. Researchers and designers have responded to these issues with the development and application of multimodal information displays. The cross-disciplinary flavor of multimodal applications presents a challenge to the accumulation, evaluation, and dissemination of relevant research. We describe the development of a taxonomy for the evaluation and comparison of multimodal display research studies, and the implementation of the taxonomy into a database: the Multimodal Query System (MQueS).

  • Designing Effective Alarms for Radiation Detection in Homeland Security Screening

    Publication Year: 2008 , Page(s): 856 - 860
    Cited by:  Papers (1)

    In this correspondence, the human factors involved in the design of effective homeland security threat detection systems are described and illustrated for radiation portal monitor (RPM) systems deployed at U.S. ports of entry. Due to the occurrence of nuisance alarms based on naturally occurring radioactive material and the low base rate of nuclear smuggling incidents, it is shown that the probability of a true threat alarm for these systems is extremely low. Receiver operating characteristic analysis of RPM systems illustrates good simple detection capability, but threat classification performance only at the chance level. Application of the human factors concept of the threat likelihood display, based on energy spectrum and cargo commodity data fusion for signal classification, reduces nuisance alarms and increases the probability of a true threat alarm to potentially effective levels. Thus, threat likelihood displays offer an approach for enhancing the effectiveness of homeland security detection and warning systems by raising the credibility of the alerts that are provided.

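    A worked Bayes-rule example of the base-rate argument above, using assumed numbers rather than figures from the correspondence: even a sensitive detector yields a very low probability that any particular alarm corresponds to a true threat when the prior probability of a threat is extremely small.

```python
p_threat = 1e-6                 # assumed prior probability that a given vehicle carries a threat
p_alarm_given_threat = 0.95     # assumed detector sensitivity
p_alarm_given_benign = 0.01     # assumed nuisance-alarm rate (e.g., naturally occurring radioactive material)

# Bayes' rule: P(threat | alarm) = P(alarm | threat) P(threat) / P(alarm)
p_alarm = p_alarm_given_threat * p_threat + p_alarm_given_benign * (1 - p_threat)
p_threat_given_alarm = p_alarm_given_threat * p_threat / p_alarm
print(f"P(true threat | alarm) = {p_threat_given_alarm:.2e}")  # on the order of 1e-4
```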
  • Reducing the Uncertainty on Location Estimation of Mobile Users to Support Hospital Work

    Publication Year: 2008 , Page(s): 861 - 866
    Cited by:  Papers (4)

    The nature of context-aware applications in hospital work demands a reliable and accurate location system. The activity for which the location information is needed determines to a great extent the relevance of this contextual variable, since a minor error in delivering patient-based information can be critical. In this correspondence, we present an enhanced technique for inferring the location of users in a hospital setting based on the strength of the radio-frequency signals received by their mobile devices; these signal strengths are used to train a neural network. The approach uses the neighbors surrounding the location to be estimated to track users continuously. This neighborhood eases training and is used, together with guesses from previous time instants, to reduce the location estimation error and to smooth out the hopping trajectories of users. The average distance error obtained with this approach is on the order of 1.3 m during continuous motion.

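    A minimal sketch of the core idea above, with a synthetic site survey and a toy path-loss model standing in for real measurements (the paper's neighborhood-based smoothing over previous time instants is not reproduced): a small neural network maps received-signal-strength vectors from several access points to 2-D coordinates.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
access_points = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 20.0], [30.0, 20.0]])  # assumed layout (m)

def rssi(position):
    """Toy log-distance path-loss model with noise: one reading per access point."""
    d = np.linalg.norm(access_points - position, axis=1) + 1.0
    return -40.0 - 20.0 * np.log10(d) + rng.normal(0, 2.0, len(access_points))

# Synthetic "site survey": known positions and their simulated signal strengths.
positions = rng.uniform([0, 0], [30, 20], size=(500, 2))
readings = np.array([rssi(p) for p in positions])

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(readings, positions)

test_point = np.array([12.0, 7.5])
estimate = model.predict(rssi(test_point).reshape(1, -1))[0]
print("true:", test_point, "estimated:", np.round(estimate, 2))
```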
  • Hierarchical Pattern Discovery in Graphs

    Publication Year: 2008 , Page(s): 867 - 872
    Cited by:  Papers (1)

    In this correspondence, an efficient technique for detecting repeating patterns in a graph is described. For this purpose, the searching capability of evolutionary programming is utilized to discover patterns that repeat often in such structural data. The approach adopted in this correspondence is hierarchical in nature: once a pattern is discovered at a particular level of the hierarchy, the graph is compressed using it, and the substructure discovery algorithm is repeated on the compressed graph. The proposed technique is useful for mining knowledge from databases that can be conveniently represented as graphs. The importance of such an endeavor can hardly be overemphasized, given that a substantial portion of the data being generated and collected is either structural in nature or composed of parts and relations between parts, which can be naturally represented as graphs. Typical examples are protein structures and computer-aided design circuits, both of which have a natural graph representation.

  • Reviewers 2008

    Publication Year: 2008 , Page(s): 873 - 876
    Freely Available from IEEE
  • 2008 Index IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) Vol. 38

    Publication Year: 2008 , Page(s): 877 - 887
    Freely Available from IEEE
  • 2009 ICMLC ICWAPR

    Publication Year: 2008 , Page(s): 1
    Freely Available from IEEE
  • IEEE Systems, Man, and Cybernetics Society Information

    Publication Year: 2008 , Page(s): C3
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews Information for authors

    Publication Year: 2008 , Page(s): C4
    Freely Available from IEEE

Aims & Scope

Overview, tutorial, and application papers concerning all areas of interest to the SMC Society: systems engineering, human factors and human-machine systems, and cybernetics and computational intelligence.

Authors should submit human-machine systems papers to the IEEE Transactions on Human-Machine Systems.

Authors should submit systems engineering papers to the IEEE Transactions on Systems, Man and Cybernetics: Systems.

Authors should submit cybernetics papers to the IEEE Transactions on Cybernetics.

Authors should submit social system papers to the IEEE Transactions on Computational Social Systems.

Full Aims & Scope

Meet Our Editors

Editor-in-Chief
Dr. Vladimir Marik
(until 31 December 2012)