
IEEE Transactions on Systems, Man and Cybernetics

Issue 1 • Jan/Feb 1990


Displaying Results 1 - 25 of 27
  • Model-base structures to support adaptive planning in command/control systems

    Publication Year: 1990 , Page(s): 18 - 32
    Cited by:  Papers (1)

    The basic rationale for an organization to constitute itself as a command/control system is to increase its potential for coping with particularly perilous, volatile and ill-constrained competitive environments. One characteristic common to such environments is the likelihood of confrontation by unanticipated or unplanned-for situations. The emergence of such a situation might require adaptive planning, yet as things now stand there is virtually no consensus on the sorts of technical facilities that might support adaptive planning. Prospects are explored for the development of two classes of model-base structures, categorical and template-driven, both of which are intended to allow command/control systems to significantly accelerate the performance of certain key planning-related activities.

  • Representing and learning Boolean functions of multivalued features

    Publication Year: 1990 , Page(s): 67 - 80
    Cited by:  Papers (1)

    An analysis and empirical measurement of threshold linear functions of multivalued features is presented. The number of thresholded linear functions, maximum weight size, training speed, and the number of nodes necessary to represent arbitrary Boolean functions are all shown to increase polynomially with the number of distinct values the input features can assume and exponentially with the number of features. Two network training algorithms, focusing and back propagation, are described. Empirically, they are capable of learning arbitrary Boolean functions of multivalued features in a two-level net. Focusing is proved to converge to a correct classification and permits some time-space complexity analysis. Training time for this algorithm is polynomial in the number of values a feature can assume, and exponential in the number of features. Back propagation is not necessarily convergent, but for randomly generated Boolean functions, the empirical behavior of the implementation is similar to that of the focusing algorithm.
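
    As a toy illustration of the representation the abstract studies (not the paper's focusing or back-propagation algorithms), a thresholded linear unit over multivalued features fires when the weighted sum of feature values reaches a threshold; the weights and the example function below are hypothetical:

```python
# Illustrative sketch: a thresholded linear unit over multivalued features.
# The weights, threshold, and example function are assumptions, not from the paper.

def threshold_unit(weights, threshold, features):
    """Fire (return 1) when the weighted sum of feature values reaches threshold."""
    s = sum(w * x for w, x in zip(weights, features))
    return 1 if s >= threshold else 0

# A unit computing the Boolean function "x0 + x1 >= 3" for features
# taking values in {0, 1, 2}:
unit = lambda x: threshold_unit([1, 1], 3, x)
```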

  • Induced system restrictiveness: an experimental demonstration

    Publication Year: 1990 , Page(s): 195 - 201

    Decision support systems (DSSs) built to expand human decision capabilities can simultaneously restrain decision-making. This characteristic is referred to as system restrictiveness. The existing conceptualization of system restrictiveness is reviewed, and the concept of induced restrictiveness, a subtle force that restricts decision-making by inducing decision-makers to adopt a particular decision process, is introduced. The validity of this concept was demonstrated by a controlled laboratory experiment that employed a protocol analysis technique. The experiment evaluated the decision process induced by Lotus 1-2-3 for a particular type of task. The results indicated that Lotus tended to induce the incremental decision process as opposed to the synoptic decision process favored by the control group. The incremental decision process generates alternatives by making marginal changes to a previous solution, whereas the synoptic decision process is characterized by an unbiased search for solutions. These results are discussed in the context of induced restrictiveness.

  • An object-oriented expert system based on pattern recognition

    Publication Year: 1990 , Page(s): 33 - 44
    Cited by:  Papers (1)  |  Patents (1)

    The design and implementation of a knowledge elicitation tool for use in specifying real-time control system requirements with the Core method is presented. The tool uses pattern recognition based on a Holland classifier to propose a control-system object space for subsequent manipulation, and it is in this area that emphasis is placed. Classifier systems in general are discussed, and an implementation using the Smalltalk-80 object-oriented programming environment is presented. Practical results are included.

  • Composite edge detection with random field models

    Publication Year: 1990 , Page(s): 81 - 93
    Cited by:  Papers (4)

    A pixel is regarded as an edge pixel if either there is a sharp, step-function-like change in the intensity values on the two sides of the pixel, yielding a so-called step edge, or the texture on the two sides of the pixel differs, yielding a so-called texture edge. A two-stage generate-and-confirm paradigm for detecting all the edge pixels in the scene is used. In the first stage, a directional-derivatives approach is employed to determine all potential edge pixels and the direction of the edge. At this stage some of the edge pixels could be spurious, typically caused by either noise in the image or microedges inside a texture. In the second stage, each candidate pixel is subjected to two separate tests to confirm whether it is a step edge or a texture edge. The texture edge is confirmed by a likelihood-ratio test. The likelihood function is computed by fitting a nonsymmetric half-plane random-field model to the texture in a rectangular strip whose dominant direction is perpendicular to the estimated edge direction. Only the edge pixels that pass at least one of the two tests are accepted. The validity of the method is demonstrated on four different images.
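
    A minimal sketch of the first (generate) stage: flag candidate edge pixels whose directional derivatives are large. The 4x4 image and the threshold are hypothetical, and the paper's texture-edge confirmation stage is not reproduced here:

```python
# Generate-stage sketch: candidate edge pixels by central differences.
# Image and threshold are illustrative, not from the paper.

def candidates(img, thresh):
    """Return (row, col) pixels whose horizontal or vertical
    central difference exceeds thresh."""
    out = []
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            dx = abs(img[r][c + 1] - img[r][c - 1])
            dy = abs(img[r + 1][c] - img[r - 1][c])
            if max(dx, dy) > thresh:
                out.append((r, c))
    return out

# A step edge between a dark left half and a bright right half:
img = [[0, 0, 9, 9]] * 4
```

    Some of these candidates would be spurious in a noisy image, which is exactly why the paper follows this stage with the two confirmation tests.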

  • Knowledge acquisition and fault diagnosis: experiments with PLAULT

    Publication Year: 1990 , Page(s): 225 - 242
    Cited by:  Papers (5)

    A series of experiments using a paradigm called PLAULT, where subjects have to discover the structure of a logical network, is reported. The links in the network are invisible, and subjects have to infer where they are by examining the status of the logic gates under a variety of conditions. Three experiments explore the interaction of features of the task, such as its spatial and graphical complexity, with the cognitive limitations of the learner. A combination of these features predicts which parts of a network subjects can and cannot learn, when knowledge is tested by requiring subjects to draw diagrams of the supposed network structure. These task features also predict efficiency of transfer to a fault diagnosis task. The experiments demonstrate the predictive power that can be gained by identifying task features that are likely to prevent thorough learning of a system. These results are also important for understanding the dissociation of the ability to perform routine tasks and the ability to respond to abnormal events.

  • An overview of automated reasoning

    Publication Year: 1990 , Page(s): 202 - 224
    Cited by:  Papers (4)

    Two general approaches to reasoning with imperfect information are discussed: nonmonotonic reasoning and a calculus of uncertainty. Default reasoning is posed as an approach that is potentially capable of integrating many facets of these two approaches. Practical requirements for default reasoning are then established. This is done by identifying a number of cases that involve incomplete and uncertain information and showing how they can be addressed by default reasoning. Parametric and symbolic reasoning are differentiated, and it is shown that both types are necessary. This distinction is important, as most approaches tend to neglect either the parametric or the symbolic aspect of default reasoning, thereby restricting its use to one of the two approaches discussed above. Five capabilities that are necessary to effect default reasoning are identified. The major characteristics for systems that handle incomplete and uncertain information as well as other types of imperfect information are established.

  • Estimation of mixing probabilities in multiclass finite mixtures

    Publication Year: 1990 , Page(s): 149 - 158
    Cited by:  Papers (6)

    The problem of estimating prior probabilities in a mixture of M classes with known class conditional distributions is studied. The observation is a sequence of n independent, identically distributed mixture random variables. The first moments of appropriately formulated functions of observations are used to facilitate estimation. The complexity of these functions may vary from linear functions of the observations (in some cases) to complex functions of class conditional density functions of observations, depending on the desired balance between computational simplicity and theoretical properties. A closed-form, recursive, unbiased, convergent estimator using the density function is presented: the result is valid for any problem in which prior probabilities are identifiable. Discrete and mixed densities require a minor modification. Three application examples are described. The class conditional expectations of density functions, required for the initialization of the estimator algorithm, are analytically evaluated for Gaussian and exponential densities.
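
    For the simplest linear-function case the abstract mentions, the prior of a two-class mixture can be recovered from the first moment alone: E[X] = p·mu1 + (1-p)·mu2 gives p = (E[X] - mu2)/(mu1 - mu2). A sketch under assumed known class means (the data and means below are hypothetical, and this is not the paper's recursive density-based estimator):

```python
# Method-of-moments sketch for a two-class mixture with known class means.
# E[X] = p*mu1 + (1 - p)*mu2  =>  p = (E[X] - mu2) / (mu1 - mu2).
# The sample and the means are illustrative, not from the paper.

def estimate_prior(xs, mu1, mu2):
    """Moment-based estimate of P(class 1) from the sample mean."""
    mean = sum(xs) / len(xs)
    return (mean - mu2) / (mu1 - mu2)

# With mu1 = 10, mu2 = 0 and 30% of the mass drawn near mu1:
xs = [10] * 3 + [0] * 7
p_hat = estimate_prior(xs, 10, 0)
```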

  • On the hierarchical modeling analysis and simulation of flexible manufacturing systems with extended Petri nets

    Publication Year: 1990 , Page(s): 94 - 110
    Cited by:  Papers (26)

    A class of Petri nets, called extended Petri nets, with multiple types of places, multiple classes of tokens and multiple arcs is proposed. It is utilized for the hierarchical modeling of flexible manufacturing systems, ensuring a priori that the extended Petri net system model obtained is live, bounded, consistent and error-free. The proposed method views the operation of the flexible manufacturing system as a process that is decomposed into operations with specified precedence relations. For each operation the required resources are identified, and on the basis of these requirements the overall system is decomposed into a set of finite subsystems. The operation of each subsystem is modeled as an event graph representing a single resource activity cycle. The extended Petri net system model is synthesized from these component nets using certain synthesis rules. A software package has been developed to simulate the execution of the model obtained.
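
    The basic firing semantics underlying any Petri net model can be sketched in a few lines: places hold token counts, and a transition fires when all its input places are marked, moving tokens from inputs to outputs. The tiny machine-cycle net below is a hypothetical example, far simpler than the paper's extended nets with typed places and token classes:

```python
# Ordinary Petri net firing sketch (illustrative; not the paper's extended nets).

def enabled(marking, inputs):
    """A transition is enabled when every input place holds a token."""
    return all(marking[p] >= 1 for p in inputs)

def fire(marking, inputs, outputs):
    """Consume one token from each input place, produce one in each output."""
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] += 1
    return m

# A machine starting a job: idle + part -> busy
m0 = {"idle": 1, "part": 1, "busy": 0}
m1 = fire(m0, ["idle", "part"], ["busy"]) if enabled(m0, ["idle", "part"]) else m0
```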

  • A study on graphic presentation formats and information processing using query tracing

    Publication Year: 1990 , Page(s): 252 - 257

    A study to determine whether graphics are suitable for displaying information in complex information processing tasks is discussed. Using an information query system capable of responding in either graphical or tabular format, a laboratory experiment was conducted. A method called query tracing was introduced to observe information processing in a decision support system environment. The results indicate that use of graphs simplifies complex information processing tasks.

  • On a generalization of variable precision logic

    Publication Year: 1990 , Page(s): 248 - 252
    Cited by:  Papers (3)

    Variable-precision logic was introduced by R.S. Michalski and P.H. Winston (Artif. Intell., vol. 29, pp. 121-146, 1986) as a means for representing reasoning in the face of exceptions. Using the framework of belief structures, some of their ideas are extended. In particular, a general framework that easily allows the representation of different types of rules in a unified manner is provided. This approach also allows for a lack of specificity in the associated qualifying probabilities.

  • Quantitative analysis of a moment-based edge operator

    Publication Year: 1990 , Page(s): 58 - 66
    Cited by:  Papers (5)  |  Patents (1)

    An operator that is based on the sample variance of a group of pixels is introduced. It exhibits three unique properties: freedom from a predefined shape, low computational complexity, and a rigorous stochastic formulation. The utility of the latter property in applying detection-theoretic principles to the task of edge detection and in theoretical predictions of experimental performance measures is demonstrated. The operator is compared with other common edge operators on natural scenes.
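
    The core of such an operator is easily sketched: the sample variance of a pixel neighborhood is near zero in flat regions and large across an edge. The windows below are hypothetical; the paper's detection-theoretic thresholding is not reproduced:

```python
# Variance-based edge measure sketch (windows are illustrative).

def sample_variance(window):
    """Unbiased sample variance of a list of pixel intensities."""
    n = len(window)
    mean = sum(window) / n
    return sum((x - mean) ** 2 for x in window) / (n - 1)

flat = [5, 5, 5, 5, 5, 5, 5, 5, 5]   # homogeneous 3x3 neighborhood
edge = [0, 0, 0, 0, 0, 9, 9, 9, 9]   # neighborhood straddling an edge
```

    The stochastic formulation matters because the distribution of the sample variance under a no-edge hypothesis is known, which is what lets detection-theoretic thresholds be set analytically.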

  • Neural correlation based on the IPFM model

    Publication Year: 1990 , Page(s): 262 - 268
    Cited by:  Papers (2)

    The performance and the sensitivity of the neural correlator based on integral pulse frequency modulation (IPFM) are analyzed. The performance of the neural correlator is evaluated and shown to match many psychophysical test results. The behavior of the neural coder is expressed in terms of a stochastic nonlinear law. An analytical optimality condition for the mean spike firing rate is derived. Numerical comparison between optimal and neurophysiological values shows that natural systems meet the optimality condition.
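
    The IPFM encoding itself is simple to sketch: the input is integrated until a threshold is reached, a spike is emitted, and the integrator is reset. The sampled input and threshold below are hypothetical, and this is only the coder, not the correlator the paper analyzes:

```python
# IPFM coder sketch: integrate-to-threshold spike generation.
# Signal and threshold are illustrative, not from the paper.

def ipfm_spikes(signal, threshold):
    """Return the sample indices at which the integrator crosses threshold."""
    acc, spikes = 0.0, []
    for i, x in enumerate(signal):
        acc += x
        if acc >= threshold:
            spikes.append(i)
            acc -= threshold   # reset, carrying over any excess
    return spikes

# A constant input of 1.0 with threshold 3.0 fires every third sample:
spikes = ipfm_spikes([1.0] * 12, 3.0)
```

    The key property visible even in this sketch is that the mean firing rate tracks the mean of the input, which is what makes the analytical optimality condition on the mean spike rate meaningful.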

  • A framework for networked knowledge-based systems

    Publication Year: 1990 , Page(s): 119 - 127
    Cited by:  Papers (2)

    A conceptual framework for building a network of knowledge-based systems (KBS) to fully utilize the benefits of these systems within complex decision-making environments is proposed. The authors focus on three issues: (1) what the capabilities of the system supporting the human decision-maker should be; (2) how the nodes of such a network should be linked; and (3) what knowledge each node in the network should possess. A framework for modeling knowledge and communications between the nodes of a networked KBS is outlined. The framework is developed from the viewpoint that the decision support provided by the KBS is not restricted to easy access to, and the processing and management of, models and data. It also includes expertise that the user does not possess and might need for the solution of the decision problems. The resulting conceptual framework draws on concepts from decision-support systems and expert systems.

  • A new method for linearization of dynamic robot models

    Publication Year: 1990 , Page(s): 2 - 17

    A method for linearizing dynamic models for robot manipulators along a nominal trajectory is developed from the straightforward Lagrangian formulation. The method is very simple and systematic. It can be applied both to the computation of the feedforward control law (i.e. the joint generalized forces/torques) along the desired nominal trajectory and to the design of the feedback controller that reduces or eliminates any deviations from the desired nominal trajectory. The salient advantage of using this method is that the amount of computation required for deriving the complete linearized dynamic model for a manipulator is so small that it makes real-time computation on a mini- or microcomputer possible. The computation for a general manipulator with six degrees of freedom requires at most 2000 multiplications and 1700 additions. For most industrial manipulators with six degrees of freedom, it requires at most 1400 multiplications and 1300 additions.
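
    The linearized model such a method produces has the standard perturbation form d(dx)/dt = A·dx + B·du about a nominal (x*, u*). A generic finite-difference sketch of that idea on hypothetical scalar dynamics (the paper instead derives A and B in closed form from the Lagrangian):

```python
# Numerical linearization sketch about a nominal point (x, u).
# The toy dynamics xdot = -2x + 3u are illustrative, not a robot model.

def linearize(f, x, u, eps=1e-6):
    """Estimate A = df/dx and B = df/du at (x, u) by central differences."""
    A = (f(x + eps, u) - f(x - eps, u)) / (2 * eps)
    B = (f(x, u + eps) - f(x, u - eps)) / (2 * eps)
    return A, B

A, B = linearize(lambda x, u: -2 * x + 3 * u, 1.0, 0.0)
```

    The paper's point is precisely that for manipulator dynamics this differentiation can be done symbolically and cheaply enough (a few thousand operations) to run in real time.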

  • Information/communication and dispatching strategies for networks with mobile servers

    Publication Year: 1990 , Page(s): 111 - 118
    Cited by:  Patents (1)

    The dispatch problem for a network with a general number of nonstationary service units is discussed. The possibility of dispatching a service unit to a call for service while the unit is in motion depends on the quality of the information/communication system available. Three information/communication systems are investigated: two extreme cases in which only stationary units can be dispatched from their home location or any unit can be dispatched from anywhere at any time, and an intermediate case. For each of the three systems, the dispatching policy is derived, and the systems are compared on the basis of the expected response time to a random request for service.

  • The bond energy algorithm revisited

    Publication Year: 1990 , Page(s): 268 - 274
    Cited by:  Papers (3)

    The bond energy algorithm of W.T. McCormick, P.J. Schweitzer, and T.W. White (Oper. Res., vol. 20, pp. 993-1009, 1972) is examined in the context of related strategies of data analysis that seek to solve problems in production research, imaging, and related engineering problems. A taxonomy of types of input data and forms of matrix structure, adopted from other areas of data analysis, serves to clarify some distinctions that have been at most implicit in published alternatives to the bond energy approach. The objective function initially proposed for this approach is considered, and some of its properties and resulting limitations are deduced. Some extensions of the original technique are offered, and one alternative is criticized.
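
    The objective function under discussion sums each matrix entry times its horizontal and vertical neighbors, so row/column permutations that cluster large entries raise it. A direct sketch of the measure on hypothetical 3x3 matrices:

```python
# Bond energy of a matrix: sum of products of horizontally and vertically
# adjacent entries. The example matrices are illustrative.

def bond_energy(m):
    rows, cols = len(m), len(m[0])
    e = 0
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:
                e += m[i][j] * m[i][j + 1]   # horizontal bond
            if i + 1 < rows:
                e += m[i][j] * m[i + 1][j]   # vertical bond
    return e

clustered = [[1, 1, 0], [1, 1, 0], [0, 0, 0]]   # nonzeros grouped in a block
scattered = [[1, 0, 1], [0, 0, 0], [1, 0, 1]]   # same nonzeros, no adjacency
```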

  • Toward efficient global optimization in large dynamic systems-the adaptive complex method

    Publication Year: 1990 , Page(s): 257 - 261
    Cited by:  Papers (4)

    The efficient solution of global optimization problems in large dynamic systems requires methods that are robust, do not require analytical expressions for the objective function or its derivatives, and require only O(n) function evaluations, where n is the dimension of the search-variable space. The adaptive complex method operated in a staged feed-ahead mode has been found to possess these qualities. The method is described along with tests using classical multimodal functions. The tests indicate that the method is efficient relative to other methods. Tests with higher-dimension multimodal problems (with spaces of dimension 22 and 36) are described, and the results indicate that the method can be employed to solve efficiently much larger problems of the class described.

  • Belief combination and propagation in a lattice-structured inference network

    Publication Year: 1990 , Page(s): 45 - 57
    Cited by:  Papers (17)

    Belief propagation and belief combination procedures based on the Dempster-Shafer belief function for inference in rule-based systems are proposed. The belief combination procedure yields results identical to those of Dempster's rule when pieces of evidence are independent. Dempster's rule is shown to be nonrobust for combining evidence with a high degree of conflict. The cause of the nonrobustness is discussed, and an alternative belief combination procedure to remedy the deficiency is proposed. Although the proposed procedure yields results that are dependent on the interpretations of the rule, it is shown to be an interpolation between total ignorance and the uncertainty associated with the rule regardless of the interpretations. When the rule interpretation yields an associative belief propagation procedure, a corresponding chaining syllogism for it can be derived. The proposed inference procedures are applied to a lattice-structured inference network.
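
    The nonrobustness the abstract describes is easy to reproduce with Dempster's rule itself: when two mass functions conflict heavily, the normalization by (1 - conflict) can hand all belief to a hypothesis both sources considered nearly impossible. A sketch of the rule on a Zadeh-style example (the frames and masses are hypothetical; the paper's alternative combination procedure is not shown):

```python
# Dempster's rule of combination over sets of hypotheses, with the
# conflict mass k made explicit. Masses below are illustrative.

def dempster(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {s: v / (1 - conflict) for s, v in combined.items()}, conflict

# Two sources that each nearly rule out "b" and strongly back
# incompatible alternatives:
m1 = {frozenset("a"): 0.9, frozenset("b"): 0.1}
m2 = {frozenset("c"): 0.9, frozenset("b"): 0.1}
result, k = dempster(m1, m2)
```

    Despite both sources assigning "b" only 0.1, the combined belief in "b" is 1.0 because everything else is swallowed by the conflict mass k.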

  • An exploration of orientation representation by Lie algebra for robotic applications

    Publication Year: 1990 , Page(s): 243 - 248
    Cited by:  Papers (6)

    A number of conventional methods for orientation representation are reviewed and discussed. An analysis based on Lie algebra for exploring possible definitions of three-dimensional (3D) orientation vectors and for unifying representations of position and orientation is presented. The associated computation methods are developed and verified by application to the six-axis Puma 560 robot.
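
    The 3D orientation vectors such a Lie-algebra analysis works with correspond to the so(3) exponential map: an axis-angle vector maps to a rotation matrix via the Rodrigues formula, R = I + sin(t)·K + (1 - cos(t))·K², where K is the skew-symmetric matrix of the axis. A self-contained sketch (the example rotation is hypothetical, not from the paper):

```python
# Rodrigues formula sketch: axis-angle (so(3)) vector -> rotation matrix.

import math

def rotvec_to_matrix(axis, angle):
    """R = I + sin(t) K + (1 - cos(t)) K^2, with K the skew matrix of axis."""
    x, y, z = axis
    K = [[0, -z, y], [z, 0, -x], [-y, x, 0]]
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    K2 = matmul(K, K)
    s, c = math.sin(angle), 1 - math.cos(angle)
    I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    return [[I[i][j] + s * K[i][j] + c * K2[i][j] for j in range(3)]
            for i in range(3)]

# A 90-degree rotation about z sends the x axis to the y axis:
R = rotvec_to_matrix((0, 0, 1), math.pi / 2)
```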

  • An architecture for adversarial planning

    Publication Year: 1990 , Page(s): 186 - 194
    Cited by:  Papers (9)

    Adversarial planning for battle management is discussed. Conventional artificial intelligence planning approaches fail to address the problems that arise. These include an unpredictable and dynamic environment; control of several semiautonomous intelligent agents; the need to adjust plans dynamically according to developments during plan execution; and the need to consider the presence of an adversary in devising plans. The requirements imposed on automated planning systems by battle management are examined, a planning approach that meets these requirements is described, and an architecture for investigating adversarial aspects of battle planning is presented.

  • Classification models for the counting of cellular objects

    Publication Year: 1990 , Page(s): 283 - 291

    The automated image analysis cell classification paradigm for estimating the proportion (P) of cells on a cytology slide containing objects of interest is presented. Automated cell counters based on image analysis offer a mechanized alternative to the tedious and time-consuming task of manually performing these counts. Several classification models for increasing the automated estimation accuracy of P are presented. The receiver operating characteristic (ROC) curve, as used in classical signal detection theory, provides the conceptual structure and mathematical foundation for the models. It is shown that simple formulations, using this theory, yield dynamic strategies that result in higher cellular object classification accuracy than the classical one-threshold signal detection model. Moreover, these strategies can be implemented in a manner that satisfies the specific application constraints. An application involving the estimation of the proportion of various immunologically labeled lymphocyte subpopulations illustrates the methodology.
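
    Under the classical one-threshold model the abstract uses as a baseline, the observed positive fraction mixes true and false detections; if the classifier's true-positive and false-positive rates are known from the ROC curve, the true proportion P can be recovered by inverting observed = P·TPR + (1 - P)·FPR. A sketch of this standard correction (the rates and counts are hypothetical, and the paper's dynamic multi-threshold strategies go beyond it):

```python
# Single-threshold proportion correction sketch. Rates are illustrative.

def corrected_proportion(observed_rate, tpr, fpr):
    """Invert observed = P*TPR + (1 - P)*FPR for the true proportion P."""
    return (observed_rate - fpr) / (tpr - fpr)

# If 46% of cells test positive with TPR = 0.9 and FPR = 0.1:
P = corrected_proportion(0.46, 0.9, 0.1)
```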

  • User modeling in expert man-machine interfaces: a case study in intelligent information retrieval

    Publication Year: 1990 , Page(s): 166 - 185
    Cited by:  Papers (11)  |  Patents (5)

    The requirements of a user modeling component for an expert interface are analyzed, and the main points of a proposed approach to user modeling are stated. The authors focus on a knowledge-based system, called UM-tool, devoted to creating, maintaining, and using explicit user models within an expert interface. UM-tool supports a novel approach to user modeling, which is based both on the use of stereotypes and on a dynamic reclassification scheme. The architecture of the system is described, the organization and content of its knowledge bases are illustrated, and the modeling mechanisms utilized are presented in detail. An example of the use of UM-tool within the IR-NLI II (information retrieval natural language interface) expert interface, devoted to supporting end users in accessing online information retrieval services, is discussed, focusing on the specific role of the user modeling component. An evaluation of the proposed approach and a critical comparison with related work are presented. Future research directions are outlined.

  • Predicting fault diagnosis performance: why are some bugs hard to find?

    Publication Year: 1990 , Page(s): 274 - 283
    Cited by:  Papers (3)

    An experiment to examine how a troubleshooter's mental model of a logic network might prejudice his or her ability to diagnose previously unseen faults is discussed. Subjects first inferred the structure of a logic network by viewing it in different states. Most subjects ended up with mental models of the network that were almost, but not completely, correct. Then, in a fault diagnosis phase, subjects diagnosed faults placed in the network by the experimenter. Faults involved the addition of new links or the deletion of existing links. In generating faults for subjects, two factors were varied in a 2×2 design: whether a subject believed a link existed or not, and whether that belief was true or false. The ability to detect the symptoms of a fault was not always a guarantee of correct fault diagnosis. Subjects had difficulty diagnosing faults involving links they believed did not exist, and they found it impossible to diagnose faults that influenced parts of the network about which they had false beliefs.

  • A game theoretic approach with risk assessment for international conflict solving

    Publication Year: 1990 , Page(s): 141 - 148

    A game-theoretic approach based on risk assessment for evaluating the effectiveness of the formation of international concords is discussed. An n-person cooperative game in the characteristic-function form is used for international conflict solving. A solution concept for the game, the nucleolus, and its alternative forms are derived, and a two-layer hierarchical system for the evaluation is constructed. At the first layer, the concept of the multiattribute risk function (MRF) is defined and derived by assessing value tradeoffs in the risk profile for each country. At the second layer, for effective formation of international coalitions for international conflict solving, a game-theoretic approach is used. The characteristic function of the n-person cooperative game is constructed in terms of the decrease of the MRF values due to the formation of coalitions. The alternative concepts for the augmented nucleolus are compared with each other in terms of the dual solution concept.
