
IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans

Issue 4 • July 2008

  • Table of contents

    Publication Year: 2008, Page(s): C1
    PDF (53 KB)
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans publication information

    Publication Year: 2008, Page(s): C2
    PDF (39 KB)
    Freely Available from IEEE
  • Multiple-Agent Perspectives in Reasoning About Situations for Context-Aware Pervasive Computing Systems

    Publication Year: 2008, Page(s): 729 - 742
    Cited by: Papers (32)
    PDF (943 KB) | HTML

    In open heterogeneous context-aware pervasive computing systems, suitable context models and reasoning approaches are necessary to enable collaboration and distributed reasoning among agents. This paper proposes, develops, and demonstrates the following: 1) a novel context model and reasoning approach developed with concepts from the state-space model, which describes context and situations as geometrical structures in a multidimensional space; and 2) a context algebra based on the model, which enables distributed reasoning by merging and partitioning context models that represent different perspectives of computing entities over the object of reasoning. We show how merging and reconciling different points of view over context enhances the outcomes of reasoning about the context. We develop and evaluate our proposed algebraic operators and reasoning approaches with cases using real sensors and with simulations. We embed agents and mobile agents with these modeling and reasoning capabilities, thus facilitating context-aware and adaptive mobile agents operating in open pervasive environments.
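
    The geometric view of situations can be pictured with a toy sketch. The code below is hypothetical (the dimension names, region shapes, and merge rule are invented for illustration; the paper defines a richer algebra): it models a situation as an axis-aligned region in context space and merges two agents' perspectives by intersection.

        # Hypothetical sketch: situations as axis-aligned regions in context space.
        # Merging two perspectives = intersecting their ranges per dimension; an
        # empty range on any shared dimension means the perspectives conflict.
        def merge(persp_a, persp_b):
            """Each perspective is a dict: dimension -> (low, high)."""
            merged = {}
            for dim in sorted(set(persp_a) & set(persp_b)):
                lo = max(persp_a[dim][0], persp_b[dim][0])
                hi = min(persp_a[dim][1], persp_b[dim][1])
                if lo > hi:
                    return None  # irreconcilable views on this dimension
                merged[dim] = (lo, hi)
            return merged

        # Two agents' views of an "in a meeting" situation:
        a = {"noise_dB": (30, 55), "occupancy": (2, 10)}
        b = {"noise_dB": (40, 70), "occupancy": (1, 6)}
        print(merge(a, b))  # {'noise_dB': (40, 55), 'occupancy': (2, 6)}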

  • A New Measurement of Systematic Similarity

    Publication Year: 2008, Page(s): 743 - 758
    Cited by: Papers (5)
    PDF (707 KB) | HTML

    The relationship of similarity may be the most universal relationship between any two objects in either the material or the mental world. Although similarity modeling has been a focus of cognitive science for decades, many theoretical and practical issues remain controversial. In this paper, a new theoretical framework that conforms to the nature of similarity and incorporates current similarity models into a universal model is presented. The new model, the systematic similarity model, is inspired by the contrast model of similarity and by structure mapping theory in cognitive psychology; it is a universal similarity measurement with many potential applications in text, image, or video retrieval. The text relevance ranking experiments undertaken in this research tentatively show the validity of the new model.
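
    As background, the contrast model that inspired the systematic similarity model scores similarity from shared and distinctive features; a minimal sketch of that classic model follows (the weights and feature sets are invented, and the paper's systematic model generalizes this form rather than using it directly).

        # Tversky's contrast model: sim(A, B) = t|A&B| - alpha|A-B| - beta|B-A|.
        def contrast_similarity(a, b, t=1.0, alpha=0.5, beta=0.5):
            a, b = set(a), set(b)
            return t * len(a & b) - alpha * len(a - b) - beta * len(b - a)

        doc1 = {"petri", "net", "workflow", "time"}
        doc2 = {"petri", "net", "synthesis"}
        print(contrast_similarity(doc1, doc2))  # 2.0 - 1.0 - 0.5 = 0.5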

  • Human-Automated Judge Learning: A Methodology for Examining Human Interaction With Information Analysis Automation

    Publication Year: 2008, Page(s): 759 - 776
    Cited by: Papers (9)
    PDF (1075 KB) | HTML

    Human-automated judge learning (HAJL) is a methodology providing a three-phase process, quantitative measures, and analytical methods to support the design of information analysis automation. HAJL's measures capture the human's and the automation's judgment processes, relevant features of the environment, and the relationships between them. Specific measures include the achievement of the human and of the automation, conflict between them, compromise and adaptation by the human toward the automation, and the human's ability to predict the automation. HAJL's utility is demonstrated herein using a simplified air traffic conflict prediction task. HAJL was able to capture patterns of behavior within and across the three phases with measures of individual judgments and human-automation interaction. Its measures were also used for statistical tests of aggregate effects across human judges. Two between-subject manipulations were crossed to investigate HAJL's sensitivity to interventions in the human's training (sensor noise during training) and in display design (information from the automation about its judgment strategy). HAJL identified that the design intervention affected conflict and compromise with the automation, that participants learned from the automation over time, and that those with higher individual judgment achievement were better able to predict the automation.
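
    HAJL's exact formulas are not reproduced in the abstract; as a rough, assumed illustration in the judgment-analysis tradition, measures such as achievement and conflict can be computed as correlations over paired judgments (all data below are simulated and the variable names are invented).

        import numpy as np

        # Simulated judgments: 'criterion' is the true environmental value that
        # both the human and the automation try to predict.
        rng = np.random.default_rng(0)
        criterion = rng.normal(size=200)
        human = criterion + rng.normal(scale=0.8, size=200)
        automation = criterion + rng.normal(scale=0.4, size=200)

        achievement_human = np.corrcoef(human, criterion)[0, 1]
        achievement_auto = np.corrcoef(automation, criterion)[0, 1]
        agreement = np.corrcoef(human, automation)[0, 1]  # low = conflict
        print(achievement_human, achievement_auto, agreement)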

  • Expanding the Criteria for Evaluating Socio-Technical Software

    Publication Year: 2008, Page(s): 777 - 790
    Cited by: Papers (1)
    PDF (450 KB) | HTML

    This paper compares two evaluation criterion frameworks for sociotechnical software. Research on the technology acceptance model (TAM) confirms that perceived usefulness and perceived ease of use are relevant criteria for users evaluating organizational software. However, information technology has changed considerably since TAM's 1989 inception, so an upgraded evaluation framework may apply. The web of system performance (WOSP) model suggests eight evaluation criteria, based on a systems-theory definition of performance. This paper compares the WOSP and TAM criterion frameworks in a performance evaluation experiment using the analytic hierarchy process method. Subjects who used both TAM and WOSP criteria preferred the WOSP criteria, were more satisfied with the resulting decision outcomes, and found the WOSP evaluation more accurate and complete. As sociotechnical software becomes more complex, users may need (or prefer) more comprehensive evaluation criterion frameworks.
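
    The analytic hierarchy process step used in the experiment derives criterion weights from a pairwise comparison matrix; a standard sketch follows (the 3 x 3 matrix values are invented, whereas the actual study compared eight WOSP criteria).

        import numpy as np

        # AHP: priority weights = principal eigenvector of the reciprocal
        # pairwise comparison matrix; CI near 0 indicates consistent judgments.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        ci = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
        print(w, ci)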

  • Preserving Languages and Properties in Stepwise Refinement-Based Synthesis of Petri Nets

    Publication Year: 2008, Page(s): 791 - 801
    Cited by: Papers (9)
    PDF (459 KB) | HTML

    Current work on the stepwise refinement of Petri nets concentrates mainly on property preservation, which is an effective way to analyze and verify complex systems. Further steps into this field are needed from the perspectives of system synthesis and language preservation. First, the refinement of Petri nets is introduced based on a k-well-behaved Petri net, in which k tokens can be processed. Then, according to the different compositions of subsystems, well-, under-, and overmatched refined Petri nets are proposed. In addition, the language and property relationships among the sub-, original, and refined nets are studied to demonstrate behavior characteristics and property preservation in a system synthesis process. A manufacturing system is given as an example to illustrate the effectiveness of the proposed approach in synthesizing and analyzing the Petri nets of complex systems.
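
    As textbook background for such analyses, the firing semantics that refinement must preserve can be sketched in a few lines (a toy net, not the paper's k-well-behaved construction or its refinement operator).

        # Toy Petri net semantics: a transition is enabled when every input
        # place holds enough tokens; firing moves tokens from inputs to outputs.
        def enabled(marking, pre):
            return all(marking.get(p, 0) >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        m0 = {"raw": 2, "machine_free": 1}
        pre, post = {"raw": 1, "machine_free": 1}, {"busy": 1}
        if enabled(m0, pre):
            print(fire(m0, pre, post))  # {'raw': 1, 'machine_free': 0, 'busy': 1}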

  • Modeling and Analysis for Workflow Constrained by Resources and Nondetermined Time: An Approach Based on Petri Nets

    Publication Year: 2008, Page(s): 802 - 817
    Cited by: Papers (24)
    PDF (670 KB) | HTML

    Time and resource management and verification are two important aspects of workflow management systems. In this paper, we present a Petri-net-based modeling and analysis approach for workflows constrained by resources and nondetermined time. Unlike previous modeling approaches, our model uses two kinds of places to represent the activities and resources of a workflow, respectively. For each activity place, its input and output transitions represent the start and termination of the activity, respectively, and two timing functions define the minimum and maximum duration times of the corresponding activity. Using the constructed Petri net model, the earliest and latest times to start each activity can be calculated. With the reachability graph of the Petri net model, the timing factors influencing the implementation of the workflow can be calculated and verified. In this paper, the sufficient conditions for the existence of the best implementation case of a workflow are proved, and a method for obtaining such an implementation case is presented. These results benefit the evaluation and verification of the implementation of a workflow constrained by resources and nondetermined time.
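
    The earliest-start computation resembles a forward pass over the activity precedence graph; a simplified sketch using only minimum durations follows (the activity names and durations are invented, and the paper's treatment of resource places and the reachability graph is omitted).

        # Earliest start of an activity = max over predecessors of
        # (their earliest start + their minimum duration).
        durations_min = {"A": 2, "B": 4, "C": 3, "D": 1}
        preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

        earliest = {}
        for act in ["A", "B", "C", "D"]:  # a topological order
            earliest[act] = max(
                (earliest[p] + durations_min[p] for p in preds[act]), default=0)
        print(earliest)  # {'A': 0, 'B': 2, 'C': 2, 'D': 6}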

  • An Effective PSO-Based Hybrid Algorithm for Multiobjective Permutation Flow Shop Scheduling

    Publication Year: 2008, Page(s): 818 - 831
    Cited by: Papers (21)
    PDF (670 KB) | HTML

    This paper proposes a hybrid algorithm based on particle swarm optimization (PSO) for a multiobjective permutation flow shop scheduling problem, a typical NP-hard combinatorial optimization problem with a strong engineering background. The proposed multiobjective algorithm (named MOPSO) not only applies the parallel evolution mechanism of PSO, characterized by individual improvement, population cooperation, and competition, to perform exploration effectively, but also utilizes several adaptive local search methods to perform exploitation. First, to make PSO suitable for solving scheduling problems, a ranked-order-value (ROV) rule based on a random-key technique is presented to convert the continuous position values of particles into job permutations. Second, a multiobjective local search based on the Nawaz-Enscore-Ham heuristic is applied to good solutions with a specified probability to enhance the exploitation ability. Third, to enrich the searching behavior and to avoid premature convergence, a multiobjective local search based on simulated annealing with multiple different neighborhoods is designed, and an adaptive meta-Lamarckian learning strategy is employed to decide which neighborhood will be used. Due to the fusion of these different search operations, good solutions approximating the real Pareto front can be obtained. In addition, MOPSO adopts a random weighted linear sum function to aggregate the multiple objectives into a single one for solution evaluation and for guiding the evolution process in the multiobjective sense. Because the weights are random, the search direction is enriched and solutions with good diversity can be obtained. Simulation results and comparisons based on a variety of instances demonstrate the effectiveness, efficiency, and robustness of the proposed hybrid algorithm.
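
    The ROV rule named in the abstract maps a particle's continuous position to a job permutation by ranking; a minimal sketch follows, together with a random weighted aggregation of two objectives (the objective values are invented).

        import numpy as np

        # ROV rule: smaller position value -> earlier place in the permutation.
        position = np.array([1.7, 0.2, 2.5, 0.9])
        permutation = np.argsort(position)  # job order [1, 3, 0, 2]

        # Random weighted linear sum of two objectives, redrawn per evaluation
        # to diversify the search direction in the multiobjective sense.
        w = np.random.rand(2)
        w /= w.sum()
        makespan, tardiness = 120.0, 35.0  # invented objective values
        fitness = w @ np.array([makespan, tardiness])
        print(permutation, fitness)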

  • A Formal Virtual Enterprise Access Control Model

    Publication Year: 2008, Page(s): 832 - 851
    Cited by: Papers (6)
    PDF (1597 KB) | HTML

    A virtual enterprise (VE) refers to a cooperative alliance of legally independent enterprises, institutions, or single persons that collaborate with each other by sharing business processes and resources across enterprises in order to raise enterprise competitiveness and reduce production costs. Successful VEs require complete information transparency and suitable resource sharing among coworkers across enterprises. Hence, this investigation proposes a formal, flexible integration solution, named the formal VE access control (VEAC) model, based on the role-based AC model, to integrate and share distributed resources owned by VE members. The formal VEAC model comprises a fundamental VEAC model, a project AC policy (PACP) language model, and a model construction methodology. The fundamental VEAC model manages VE resources and the resources of participating enterprises, in which various project relationships are presented to facilitate different degrees of resource sharing across projects and enterprise boundaries, and cooperative modes among VE roles are presented to enable collaboration among coworkers in a VE. The PACP language model features object-subject-action-condition AC policies that jointly determine user access authorizations. In addition, the methodology supplies a systematic method to identify fundamental elements of the VEAC model and to establish assignments between elements and relations.
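
    The object-subject-action-condition policies can be pictured as rules evaluated per request; the miniature below is hypothetical (the policy shape, names, and condition form are invented and far simpler than the formal VEAC semantics).

        # A request is granted when some policy matches one of the subject's
        # roles, the action, and the object, and the condition predicate holds.
        policies = [
            {"role": "designer", "action": "read", "object": "cad_model",
             "condition": lambda ctx: ctx["project"] == "VE-42"},
        ]
        user_roles = {"alice": {"designer"}}

        def authorized(user, action, obj, ctx):
            return any(p["role"] in user_roles.get(user, set())
                       and p["action"] == action and p["object"] == obj
                       and p["condition"](ctx)
                       for p in policies)

        print(authorized("alice", "read", "cad_model", {"project": "VE-42"}))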

  • Pose-Robust Facial Expression Recognition Using View-Based 2D + 3D AAM

    Publication Year: 2008, Page(s): 852 - 866
    Cited by: Papers (10)
    PDF (1318 KB) | HTML

    This paper proposes a pose-robust face tracking and facial expression recognition method using a view-based 2D + 3D active appearance model (AAM) that extends the 2D + 3D AAM to the view-based approach, where one independent face model is used for a specific view and an appropriate face model is selected for the input face image. Our extension has been conducted in several aspects. First, we use principal component analysis with missing data to construct the 2D + 3D AAM, owing to the missing data in the posed face images. Second, we develop an effective model selection method that directly uses the pose angle estimated from the 2D + 3D AAM, which makes face tracking pose-robust and feature extraction for facial expression recognition accurate. Third, we propose a double-layered generalized discriminant analysis (GDA) for facial expression recognition. Experimental results show the following: 1) face tracking by the view-based 2D + 3D AAM, which uses multiple face models with one face model per view, is more robust to pose change than tracking by an integrated 2D + 3D AAM, which uses one integrated face model for all three views; 2) the double-layered GDA extracts good features for facial expression recognition; and 3) the view-based 2D + 3D AAM outperforms other existing models at pose-varying facial expression recognition.
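
    The model selection step picks one view-specific AAM from the estimated pose angle; a schematic sketch follows (the +/-20 degree boundaries and model names are invented).

        # Schematic view selection: the yaw estimated by the current fit
        # chooses which view-specific face model tracks the next frame.
        def select_view_model(yaw_degrees):
            if yaw_degrees < -20:
                return "left_model"
            if yaw_degrees > 20:
                return "right_model"
            return "frontal_model"

        for yaw in (-35.0, 4.0, 28.0):
            print(yaw, "->", select_view_model(yaw))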

  • A Novel Hybrid Model Framework to Blind Color Image Deconvolution

    Publication Year: 2008, Page(s): 867 - 880
    Cited by: Papers (3)
    PDF (1427 KB) | HTML

    This paper presents a new hybrid model framework to address blind color image deconvolution. Blind color image deconvolution is a challenging problem due to the limited information on the blurring function. Conventional methods based on the single-input single-output (SISO) model yield suboptimal results because each color channel is processed independently. On the other hand, there are limitations on the practicality of using a multiple-input multiple-output (MIMO) model to solve this problem because the color channels are usually highly correlated. In view of these constraints, this paper proposes a novel framework that solves blind color image deconvolution by first decomposing the color channels into wavelet subbands and then performing image deconvolution using a hybrid of SISO and single-input multiple-output (SIMO) models. The proposed method utilizes the correlation information among the different color channels to alleviate the constraints imposed by MIMO systems. Experimental results show that the method achieves satisfactory restored images under different noise and blurring environments.
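
    For reference, the conventional per-channel (SISO) baseline that the hybrid framework improves on can be sketched as frequency-domain Wiener deconvolution with a known blur; the paper's wavelet-domain hybrid of SISO and SIMO models is considerably more involved, and the regularization constant K below simply stands in for the noise-to-signal ratio.

        import numpy as np

        def wiener_deconv(channel, psf, K=0.01):
            # Wiener filter: F = conj(H) / (|H|^2 + K) * G, in the frequency domain.
            H = np.fft.fft2(psf, s=channel.shape)
            G = np.fft.fft2(channel)
            return np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + K) * G))

        rng = np.random.default_rng(1)
        img = rng.random((64, 64))
        psf = np.ones((5, 5)) / 25.0  # uniform blur kernel
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img)
                                       * np.fft.fft2(psf, s=img.shape)))
        print(wiener_deconv(blurred, psf).shape)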

  • An Effective Technique for Subpixel Image Registration Under Noisy Conditions

    Publication Year: 2008, Page(s): 881 - 887
    Cited by: Papers (5)
    PDF (538 KB) | HTML

    This paper proposes an effective higher order statistics method to address subpixel image registration. Conventional power spectrum-based techniques employ second-order statistics to estimate subpixel translation between two images. They are, however, susceptible to noise, thereby leading to significant performance deterioration in low signal-to-noise ratio environments or in the presence of cross-correlated channel noise. In view of this, we propose a bispectrum-based approach to alleviate this difficulty. The new method utilizes the characteristics of bispectrum to suppress Gaussian noise. It develops a phase relationship between the image pair and estimates the subpixel translation by solving a set of nonlinear equations. Experimental results show that the proposed technique provides performance improvement over conventional power-spectrum-based methods under different noise levels and conditions.
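
    The conventional second-order baseline the paper improves on is phase correlation; a compact sketch of its integer-shift core follows (the bispectrum equations themselves are not reproduced here, and subpixel refinement would interpolate around the peak).

        import numpy as np

        def phase_correlation(f, g):
            # The normalized cross-power spectrum peaks at the translation.
            F, G = np.fft.fft2(f), np.fft.fft2(g)
            R = F * np.conj(G)
            R /= np.abs(R) + 1e-12
            corr = np.real(np.fft.ifft2(R))
            return np.unravel_index(np.argmax(corr), corr.shape)

        rng = np.random.default_rng(2)
        a = rng.random((64, 64))
        b = np.roll(a, (3, 5), axis=(0, 1))
        print(phase_correlation(b, a))  # (3, 5)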

  • QoS-Aware Web Service Configuration

    Publication Year: 2008, Page(s): 888 - 895
    Cited by: Papers (45)
    PDF (438 KB) | HTML

    With the development of enterprise-wide and cross-enterprise application integration and interoperation toward Web services, Web service providers try not only to fulfill the functional requirements of Web service users but also to satisfy their nonfunctional conditions in order to survive in the competitive market. A hot research topic is how to configure Web services to meet user demand when the diversity of user requirements, the distinction of service components' performance, and the limitation of resources are considered. This paper builds a Web service configuration net based on Petri nets in order to exhibit Web service configurations in a formal way. Then, an optimal algorithm is presented to help choose the best configuration, the one with the highest quality of service, to meet users' nonfunctional requirements. Finally, simulation results and related analysis support the soundness and correctness of our model and algorithm.
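
    Once feasible configurations are enumerated, picking the best one reduces to scoring each on aggregated QoS; a toy sketch follows (the attributes, weights, and values are invented, and the paper derives feasibility from a Petri-net configuration model instead).

        # Pick the configuration with the best weighted QoS score; negative
        # weights penalize attributes where smaller is better.
        configs = {
            "c1": {"availability": 0.99, "latency_ms": 120, "cost": 5.0},
            "c2": {"availability": 0.95, "latency_ms": 60, "cost": 4.0},
        }
        weights = {"availability": 0.5, "latency_ms": -0.002, "cost": -0.05}

        def score(q):
            return sum(weights[k] * q[k] for k in weights)

        best = max(configs, key=lambda c: score(configs[c]))
        print(best, round(score(configs[best]), 3))  # c2 0.155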

  • Extending WebDAV With Multiple-Granularity Locking

    Publication Year: 2008, Page(s): 896 - 916
    PDF (1327 KB) | HTML

    In distributed Web authoring, shared documents can be accessed concurrently by multiple authors, who must be coordinated to avoid conflicts. The current Web standard for distributed authoring and versioning uses two-phase locking to coordinate concurrent access. Because the degree to which authors work concurrently may vary among cooperative sessions, it is necessary to extend the standard to support a multitude of lock granularity levels. In this paper, we first examine related protocols from the database literature and comment on their suitability for distributed authoring in the World Wide Web. Our main contribution is a multiple-granularity locking protocol in which the locks are optional and convey the meanings of access mode, locking scope, and locking effect. This protocol allows synchronous collaboration by guaranteeing a conflict-free environment and avoiding update loss, while it also supports version control. Specifically, by identifying and timestamping object versions, the protocol preserves author intention and operation causality, which until now were possible only with operational transformation. The protocol's efficiency, finally, is demonstrated by a real test with human users and evaluated with simulation experiments, which reveal significant advantages over other protocols of this kind.
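
    The database-style protocol the authors build on rests on intention locks and a compatibility matrix; a standard sketch of that core follows (the paper's WebDAV extension additionally makes locks optional and attaches access mode, scope, and effect).

        # Classic multiple-granularity lock compatibility (IS/IX/S/SIX/X).
        COMPAT = {
            ("IS", "IS"): True, ("IS", "IX"): True, ("IS", "S"): True,
            ("IS", "SIX"): True, ("IS", "X"): False,
            ("IX", "IX"): True, ("IX", "S"): False, ("IX", "SIX"): False,
            ("IX", "X"): False,
            ("S", "S"): True, ("S", "SIX"): False, ("S", "X"): False,
            ("SIX", "SIX"): False, ("SIX", "X"): False,
            ("X", "X"): False,
        }

        def compatible(held, requested):
            return COMPAT.get((held, requested),
                              COMPAT.get((requested, held), False))

        print(compatible("IS", "X"), compatible("IX", "IS"))  # False True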

  • Mining With Noise Knowledge: Error-Aware Data Mining

    Publication Year: 2008, Page(s): 917 - 932
    Cited by: Papers (18)
    PDF (1081 KB) | HTML

    Real-world data mining deals with noisy information sources where data collection inaccuracy, device limitations, data transmission and discretization errors, or man-made perturbations frequently result in imprecise or vague data. Two common practices are to adopt data cleansing approaches to enhance data consistency or to simply take the noisy data as quality sources and feed them into the data mining algorithms. Either way may substantially sacrifice mining performance. In this paper, we consider an error-aware (EA) data mining design, which takes advantage of statistical error information (such as noise level and noise distribution) to improve data mining results. We assume that such noise knowledge is available in advance, and we propose a solution to incorporate it into the mining process. More specifically, we use the noise knowledge to restore original data distributions, which are further used to rectify the model built from noise-corrupted data. We materialize this concept in the proposed EA naive Bayes classification algorithm. Experimental comparisons on real-world datasets demonstrate the effectiveness of this design.
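
    The restore-then-rectify idea can be illustrated for Gaussian features with additive noise of known variance: since Var(observed) = Var(clean) + Var(noise), the clean variance can be recovered before fitting the classifier. The sketch below is illustrative, not the paper's exact EA naive Bayes formulation.

        import numpy as np

        def fit_ea_gnb(X, y, s2_noise):
            # Gaussian naive Bayes with noise-rectified per-class variances.
            params = {}
            for c in np.unique(y):
                Xc = X[y == c]
                var = np.maximum(Xc.var(axis=0) - s2_noise, 1e-6)  # rectified
                params[c] = (Xc.mean(axis=0), var, len(Xc) / len(X))
            return params

        def predict(params, x):
            def loglik(mu, var, prior):
                return np.log(prior) - 0.5 * np.sum(
                    np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
            return max(params, key=lambda c: loglik(*params[c]))

        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        X_noisy = X + rng.normal(0, 0.7, X.shape)  # known noise, variance 0.49
        model = fit_ea_gnb(X_noisy, y, s2_noise=0.49)
        print(predict(model, np.array([2.8, 3.1])))  # expected: 1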

  • Stable and Quadratic Optimal Control for TS Fuzzy-Model-Based Time-Delay Control Systems

    Publication Year: 2008, Page(s): 933 - 944
    Cited by: Papers (9)
    PDF (563 KB) | HTML

    For the finite-horizon optimal control problem of Takagi-Sugeno (TS) fuzzy-model-based time-delay control systems, an integrative method that combines the delay-dependent stabilizability condition, the shifted-Chebyshev-series approach (SCSA), and the hybrid Taguchi-genetic algorithm (HTGA) is presented to design stable and quadratic-optimal parallel distributed compensation (PDC) controllers. In this paper, the delay-dependent stabilizability condition is proposed in terms of linear matrix inequalities (LMIs). Based on the SCSA, an algebraic algorithm involving only algebraic computation is derived for solving the TS fuzzy-model-based time-delay feedback dynamic equations. In addition, by using the SCSA, the stable and quadratic-optimal PDC control problem for TS fuzzy-model-based time-delay control systems is replaced by a static parameter optimization problem represented by algebraic equations with the constraint of the LMI-based stabilizability condition, thus greatly simplifying the design problem. The computational burden of both differentiation and integration in the design for the original dynamic systems may therefore be reduced considerably. Then, for the static constrained optimization problem, the HTGA is employed to find the stable and quadratic-optimal PDC controllers of the TS fuzzy-model-based time-delay control systems. A design example of stable and quadratic-optimal PDC controllers for a continuous stirred tank reactor system is given to demonstrate the applicability of the proposed integrative approach.

  • A Guideline for Low-Force Robotic Guidance for Enhancing Human Performance of Positioning and Trajectory Tracking: It Should Be Stiff and Appropriately Slow

    Publication Year: 2008, Page(s): 945 - 957
    Cited by: Papers (7)
    PDF (1137 KB) | HTML

    This paper considers the application of a low-force robotic manipulator to guide a human user's movements to place a tool (or the user's hand) at a predetermined position or move it along a predetermined trajectory. This application is potentially useful in, e.g., skill training for humans, rehabilitation, and human-machine coordination in the manufacturing industry. A proportional-derivative (PD)-type position control can be used for this application, but the controller parameters should be chosen appropriately to enhance the human performance of positioning and trajectory tracking. We hypothesize that the robot's position control should be stiff and appropriately slow, i.e., the proportional gain should be high and the time constant (the ratio of the derivative gain to the proportional gain) should be appropriately large. Such a characteristic has been difficult to realize with ordinary PD position control because it requires direct high-gain velocity feedback. However, our recently developed proxy-based sliding mode control (PSMC) is capable of producing such a hypothetically preferred response and allows us to validate the hypothesis empirically. The results of experiments using two distinctly different robotic devices supported the hypothesis, showing that the time constant should be set around 0.1 s rather than 0.01 or 0.5 s.
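
    The hypothesis concerns the ratio of the derivative to the proportional gain; a simple simulation sketch of a PD-guided unit mass shows how the gains follow from a stiffness and a time constant (all values are illustrative, and PSMC itself is not reproduced).

        # PD guidance toward a target: stiff (high Kp) and appropriately
        # slow (time constant Td = Kd / Kp around 0.1 s, per the paper).
        Kp, Td = 2000.0, 0.1
        Kd = Kp * Td
        dt, x, v, target = 1e-3, 0.0, 0.0, 0.05

        for _ in range(1000):  # 1 s of simulated guidance, unit mass
            force = Kp * (target - x) - Kd * v
            v += force * dt
            x += v * dt
        print(round(x, 4))  # settles near the 0.05 m target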

  • Image and Pattern Analysis of 1650 B.C. Wall Paintings and Reconstruction

    Publication Year: 2008, Page(s): 958 - 965
    Cited by: Papers (4)
    PDF (731 KB) | HTML

    In this paper, a color image segmentation method and a pattern analysis are presented, in connection with the extraordinary 1650 B.C. wall paintings found on the Greek island of Thera. These wall paintings are usually reconstructed from thousands of fragments widely scattered in the excavated site. The fragments' depictions manifest inhomogeneous color decay, cracks, added extraneous material, etc. The proposed color image segmentation method takes the decay problems into account and offers a very good approximation of the initial fragment depiction as the artist drew it in the late Bronze Age. As extensive qualitative tests indicate, the algorithm performs substantially better than other standard segmentation schemes. Moreover, it offers clear-cut color regions and region borders for each fragment depiction. The whole approach is based on classifying the pixels into a number of regions, where each region is described by a normal distribution, followed by fragment-decay reduction and edge refining. Extensive pattern analysis of the obtained region borders leads to the conclusion that 3650 years ago the artist most probably used advanced geometrical methods to construct handcrafted "French curves" (stencils or templates) and used them to draw certain figures. On the basis of these results, specific pattern matching techniques are employed for the reconstruction of wall paintings depicting spirals from their constituent fragments.
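
    The per-region normal-distribution classification can be approximated by fitting a Gaussian mixture to pixel colors; a minimal sketch follows, assuming scikit-learn is available (synthetic two-color data stands in for a fragment image, and the paper's decay reduction and edge refining are omitted).

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Label each pixel by its most likely Gaussian color component.
        rng = np.random.default_rng(4)
        h, w = 32, 32
        pixels = np.vstack([rng.normal((200, 60, 40), 10, (h * w // 2, 3)),
                            rng.normal((40, 90, 180), 10, (h * w // 2, 3))])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
        labels = gmm.predict(pixels).reshape(h, w)
        print(np.bincount(labels.ravel()))  # pixel count per color region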

  • Parameter Identification and Intersample Output Estimation for Dual-Rate Systems

    Publication Year: 2008, Page(s): 966 - 975
    Cited by: Papers (63)
    PDF (673 KB) | HTML

    In this paper, we derive a mathematical model for dual-rate systems and present a stochastic gradient identification algorithm to estimate the model parameters and an output estimation algorithm to compute the intersample outputs based on the dual-rate input-output data directly. Moreover, we investigate convergence properties of the parameter and intersample estimation, and we test the proposed algorithms with example systems, including an experimental water-level system.
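
    A single-rate stochastic gradient update for an ARX-type model gives the flavor of the identification step (the example system and data are simulated; the paper's dual-rate derivation additionally estimates the missing intersample outputs).

        import numpy as np

        # Stochastic gradient identification of y(t) = phi(t)' theta:
        # theta <- theta + phi (y - phi' theta) / r, with r accumulating ||phi||^2.
        rng = np.random.default_rng(5)
        theta_true = np.array([0.6, 0.3])  # y(t) = 0.6 y(t-1) + 0.3 u(t-1)
        u = rng.normal(size=500)
        y = np.zeros(500)
        for t in range(1, 500):
            y[t] = theta_true @ np.array([y[t - 1], u[t - 1]]) + 0.01 * rng.normal()

        theta, r = np.zeros(2), 1.0
        for t in range(1, 500):
            phi = np.array([y[t - 1], u[t - 1]])
            r += phi @ phi
            theta += phi * (y[t] - phi @ theta) / r
        print(theta.round(3))  # approaches [0.6, 0.3]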

  • Data Warehousing Infusion and Organizational Effectiveness

    Publication Year: 2008, Page(s): 976 - 994
    Cited by: Papers (3)
    PDF (1390 KB) | HTML

    Data warehousing (DW) has emerged as one of the most powerful technology innovations in recent years to support organization-wide decision making and has become a key component of the information technology (IT) infrastructure. Proponents of DW claim that its infusion can dramatically enhance the ability of businesses to improve the access, distribution, and sharing of information and provide managerial decision support for complex business questions. DW is also an enabling technology for data mining, customer-relationship management, and other business-intelligence applications. Although data warehouses have been around for quite some time, they have been plagued by high failure rates and limited spread or use. Drawing upon past research on the adoption and diffusion of innovations and on the implementation of information systems (IS), we examine the key organizational and innovation factors that influence the infusion (diffusion) of DW within organizations and also examine whether more extensive infusion leads to improved organizational outcomes. We conducted a field study in which two senior managers (one from IS and the other from a line function) from each of 117 companies participated, and we developed a structural model to test the research hypotheses. The results indicate that four of the seven variables examined (organizational support, quality of the project management process, compatibility, and complexity) significantly influence the degree of infusion of DW and that infusion, in turn, significantly influences organization-level benefits and stakeholder satisfaction. These findings have interesting implications for research and practice in IT and DW infusion, as well as in the organization-level impact of the infusion of enterprise-wide infrastructural and decision support technologies such as DW.

  • Remarks on “Measuring Ambiguity in the Evidence Theory”

    Publication Year: 2008, Page(s): 995 - 999
    Cited by: Papers (8)
    PDF (167 KB) | HTML

    In a recent paper, a functional AM (ambiguity measure) is introduced and an attempt is made to show that this functional qualifies as a measure of total aggregated uncertainty in the Dempster-Shafer theory. We show that this attempt fails due to a particular error in the proof of one of the principal theorems in the paper. Some additional remarks are made regarding recent research pertaining to the subject of the discussed paper.

  • IEEE Systems, Man, and Cybernetics Society Information

    Publication Year: 2008, Page(s): C3
    PDF (29 KB)
    Freely Available from IEEE
  • IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans Information for authors

    Publication Year: 2008, Page(s): C4
    PDF (36 KB)
    Freely Available from IEEE

Aims & Scope

The fields of systems engineering and human-machine systems: systems engineering includes efforts that involve issue formulation, issue analysis and modeling, and decision making and issue interpretation at any of the lifecycle phases associated with the definition, development, and implementation of large systems.

 

This Transactions ceased production in 2012. The current retitled publication is IEEE Transactions on Systems, Man, and Cybernetics: Systems.


Meet Our Editors

Editor-in-Chief
Dr. Witold Pedrycz
University of Alberta