
Proceedings of the Fifth Conference on Artificial Intelligence Applications, 1989

Date: 6-10 March 1989


Displaying Results 1 - 25 of 36
  • Proceedings. The Fifth Conference on Artificial Intelligence Applications (IEEE Cat. No.89CH2712-8)

    Publication Year: 1989
    Freely Available from IEEE
  • The effect of data character on empirical concept learning

    Publication Year: 1989 , Page(s): 199 - 205
    Cited by:  Papers (1)

    The effect of data character on empirical concept learning has typically been studied using data from real domains. This presents problems because such data is often limited and uncontrollable. The authors present a more controlled approach. They examine typical problems in an effort to characterize concepts more completely. They then utilize the characterization to generate data artificially. From this controlled data, they measure learning performance (speed and accuracy) as a precise function of several data characteristics. The authors' experiments lead to some novel conclusions: a useful starting point to clarify data character is the definition of the term concept, which is effectively a function over instance space; characterizing a concept as a function allows the mimicking of natural data and the control of the generation of artificial data for extensive experimentation; data characteristics are numerous and easy to overlook; and, compared with significant design factors of learning algorithms, certain data characteristics are highly significant.
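    The abstract's central idea, that a concept is a function over instance space whose data characteristics can then be controlled during artificial data generation, can be sketched as follows. The attributes, the concept itself, and the single noise parameter are illustrative assumptions, not the authors' actual generator:

```python
import random

def concept(instance):
    """A concept as a boolean function over instance space:
    here, positive iff attribute a exceeds attribute b (hypothetical)."""
    return instance["a"] > instance["b"]

def generate_data(n, noise=0.0, seed=0):
    """Generate artificial labeled data with a controllable
    label-noise level -- one example of a data characteristic
    that can be varied for experimentation."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        inst = {"a": rng.random(), "b": rng.random()}
        label = concept(inst)
        if rng.random() < noise:  # controlled label noise
            label = not label
        data.append((inst, label))
    return data

clean = generate_data(100, noise=0.0)
assert all(label == concept(inst) for inst, label in clean)
```

    With the concept expressed as a function, learning speed and accuracy can be measured while noise, attribute count, or class balance is varied one characteristic at a time.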

  • Applying expert systems technology to communications software validation

    Publication Year: 1989 , Page(s): 209 - 213

    A case study is presented which describes the use of an expert systems approach to automation of systems and integration testing for validation of complex, real-time communications software, such as that used onboard the Airborne Warning and Control Systems (AWACS) aircraft. The approach permits a state-based rather than path- or branch-based testing style. States can be matched with high-level system requirements to give a measure of test coverage. The benefits and weaknesses realized from using the Boeing-built embeddable expert systems shell with a custom relational database interface to construct an automated software verification tool supporting this approach are discussed. A brief summary of the utility of applying expert systems technology in this software engineering area is given. Qualitative measurements of the productivity increase from a prototype demonstration are also included.

  • Experiences with the subsumption architecture

    Publication Year: 1989 , Page(s): 93 - 100
    Cited by:  Papers (4)

    A subsumption architecture has been proposed as an effective approach for the construction of robust, real-time control systems for mobile robots. To investigate its strengths and weaknesses, a simulation of the architecture was developed called the Subsumption Architecture Tool (SAT). This simulation allows various models of system behavior to be quickly built and tested. During the building and testing of the SAT, issues related to some architectural features became evident: level of commitment of each layer; code redundancy; problem decomposition and programming style; complexity of large systems; and abstract reasoning capabilities. The effects of these issues are presented with respect to the design and implementation choices of two sample layers of behavior. These layers are used to illustrate considerations that need to be taken into account when a project team is considering the use of the subsumption architecture or when a subsumption-architecture-based system is being designed and implemented.
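    The layered-control idea behind the subsumption architecture can be illustrated with a minimal sketch. The layer names and sensor fields are hypothetical, and real subsumption networks wire suppression links between behaviors rather than scanning a priority list:

```python
class Layer:
    """One behavior layer: a function from sensor state to an
    action, or None when the layer has nothing to contribute."""
    def __init__(self, name, behavior):
        self.name = name
        self.behavior = behavior

def subsumption_control(layers, sensors):
    """The highest-priority layer that produces an action
    subsumes (suppresses) all layers below it."""
    for layer in layers:  # ordered highest priority first
        action = layer.behavior(sensors)
        if action is not None:
            return layer.name, action
    return None, "idle"

# Hypothetical two-layer robot: obstacle avoidance subsumes wandering.
avoid  = Layer("avoid",  lambda s: "turn" if s["obstacle"] else None)
wander = Layer("wander", lambda s: "forward")
layers = [avoid, wander]

assert subsumption_control(layers, {"obstacle": True})  == ("avoid", "turn")
assert subsumption_control(layers, {"obstacle": False}) == ("wander", "forward")
```

    The "level of commitment of each layer" issue from the abstract shows up even here: a layer must decide when to stay silent (return None) so that lower layers remain in control.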

  • A development framework for distributed artificial intelligence

    Publication Year: 1989 , Page(s): 115 - 121
    Cited by:  Papers (1)  |  Patents (2)

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization, coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  • Case-based reasoning in a rule-governed domain

    Publication Year: 1989 , Page(s): 45 - 53

    A discussion is presented of the problem of reasoning with cases in a domain governed by rules, in particular, the problems of interpreting the meaning of terms (statutory predicates) used in the rules and combining case-based reasoning (CBR) with other modes of reasoning, such as rule-based and model-based reasoning. Terms used in statutes are typically underdefined in the statute and inherently open-textured and thus require precedent-based reasoning to interpret their meaning for particular fact situations. In such domains one needs to combine reasoning about the rules (statutes) with the precedents that concern them. A description is given of the precedent-based case-based reasoner TAX-HYPO, which operates in the statutory domain of tax law. TAX-HYPO is a derivative system of an earlier CBR system, HYPO, which operated in the common law domain of trade secret law. The authors describe CABARET, an environment to support the building of precedent-based CBR systems like HYPO and TAX-HYPO, as well as experimentation with mixed-paradigm systems involving CBR.

  • SCORE: an integrated system for dynamic scheduling and control of high-volume manufacturing

    Publication Year: 1989 , Page(s): 271 - 278
    Cited by:  Papers (1)

    SCORE is a prototype dynamic scheduling system with rapid execution time that was developed for a high-volume manufacturing plant. The test-case environment is characterized by the complexity and size of the process structure, the high speed of the production flows, and the frequency of unexpected events that disturb the production process. The system has been developed with capabilities for: generating the resource-allocation schedules for the shop floor; generating the schedule of monitoring activities to be performed by the shop-floor controllers; monitoring the events occurring on the shop floor and anticipating their impact on the manufacturing process; and performing corrective actions to the resource schedules whenever production is jeopardized by unexpected events. The system performs a closed-loop interaction with the network of shop-floor controllers and work-center processors. The architectural approach adopted enables the integration and cooperation of multiple problem-solving techniques under the coordination of an interrupt-driven system supervisor.

  • Planning and execution of tasks in cooperative work environments [office automation]

    Publication Year: 1989 , Page(s): 255 - 262

    A description is presented of work being done to construct a planning system to assist in the performance of multiagent, loosely structured, underspecified tasks. Specifically, the authors present a representation for modeling tasks, agents, and objects within such environments and describe the architecture and implementation of a planning system which uses these models to support cooperative work. A description is given of how this planning system is being used to support further research in areas such as exception handling, negotiation, and knowledge acquisition.

  • A study of the knowledge required for explanation in expert systems

    Publication Year: 1989 , Page(s): 83 - 90
    Cited by:  Papers (1)

    A method is described for automatically assembling explanations expressed in expert terms, yet which is more general than canned-text techniques. The method is based on the constitution of an explanatory knowledge base written in a special-purpose language. This allows the assembly of explanations by reasoning conducted on the expert knowledge base as well as on session traces.

  • An expectation-driven approach to Q-A processing in a mixed-initiative natural language interface

    Publication Year: 1989 , Page(s): 67 - 74

    In a mixed-initiative natural-language interface, questions are essentially requests for information. Expectations about the nature of the information requested accompany such questions. The system needs a way of determining whether the user has adequately answered its question. An expectation-based approach to this aspect of question-answer understanding is shown to be a viable strategy. The authors discuss representing and extracting these expectations from system questions and then subsequently using them to evaluate user answers to these questions.
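    The notion of pairing a system question with an expectation against which the user's answer is evaluated might be sketched like this. Representing the expectation as a predicate is an assumption for illustration; the paper derives expectations from the questions themselves:

```python
def make_question(text, expectation):
    """A system question bundled with an expectation: a predicate
    the user's answer must satisfy to count as an adequate answer."""
    return {"text": text, "expects": expectation}

def evaluate_answer(question, answer):
    """Return True when the answer meets the question's expectation."""
    return question["expects"](answer)

# Hypothetical question expecting a numeric identifier in reply.
q = make_question("What is the part number?",
                  lambda a: a.strip().isdigit())

assert evaluate_answer(q, "4471") is True
assert evaluate_answer(q, "I'm not sure") is False
```

    A failed expectation check is what would let the system re-ask or clarify instead of silently accepting an inadequate reply.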

  • Diagnosing multiple faults in digital systems

    Publication Year: 1989 , Page(s): 151 - 158
    Cited by:  Papers (1)

    An efficient and fault-model-independent method is presented for diagnosing multiple faults in digital systems. The method is simpler to implement than the method that uses an ATMS with a constraint system or the method that uses a theorem prover with the minimal hitting set algorithm. Given the model of a Boolean digital system, an input vector, and a set of observed output values, the method computes the set of all minimal diagnoses (candidates). It begins with a system output that is incorrect. Using a system behavior model, it computes a set of minimal potential candidates that account for the behavior of that incorrect output. The method then incrementally considers the remaining system outputs and extends the existing minimal potential candidate set to account for their behaviors. A minimal candidate is a minimal set of components whose hypothesized faulty outputs account for all correct and incorrect outputs of the system under some input vector. The authors show that minimal candidates do not contain components whose faulty outputs are either masked or nonobservable. They also show that for Boolean systems, supersets of candidates are candidates only for certain component fault models.
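    For contrast, the minimal-hitting-set formulation that the authors' incremental method is compared against can be sketched by brute force: each candidate diagnosis must intersect every conflict set of components implicated by an incorrect output. The component names and conflict sets below are hypothetical:

```python
from itertools import combinations

def minimal_hitting_sets(conflicts):
    """Brute-force minimal hitting sets: enumerate component
    subsets by increasing size, keeping those that intersect
    every conflict and are not supersets of a smaller hit."""
    universe = sorted(set().union(*conflicts))
    found = []
    for size in range(1, len(universe) + 1):
        for combo in combinations(universe, size):
            s = set(combo)
            if all(s & c for c in conflicts) and \
               not any(f <= s for f in found):
                found.append(s)
    return found

# Two incorrect outputs implicate overlapping gate sets.
conflicts = [{"g1", "g2"}, {"g2", "g3"}]
assert minimal_hitting_sets(conflicts) == [{"g2"}, {"g1", "g3"}]
```

    The enumeration is exponential in the number of components, which is the cost the paper's incremental candidate-extension method is designed to avoid.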

  • Retrieval in case-based reasoning: using semantic representation to cluster cases

    Publication Year: 1989 , Page(s): 183 - 189
    Cited by:  Papers (1)

    Reasoning from previous experience, or case-based reasoning, has been used in numerous domains. The basic metaphor for case-based reasoning is one of analogical reasoning. The author presents a semantic representation structure based on NIKL to capture the analogical reasoning metaphor behind case-based reasoning. This provides numerous benefits, including conceptual organization and explanation support. This approach was implemented in a military command and control tactical planning environment. Its advantages and disadvantages are presented.

  • Extending the device-oriented qualitative simulation method to mechanical devices

    Publication Year: 1989 , Page(s): 29 - 36

    The author offers a solution which uses a separate representation entity, called the connection frame, to model the spatial relationships between a pair of objects and how those relationships achieve force or velocity propagation. The connection representation is assumed to be supplied as part of the design knowledge of the mechanism, though it could be just as readily computed by other spatial connection determination methods. A framework is described which is constructed to simulate the behaviors of both ordinary and intermittent mechanical systems, with an emphasis on force and velocity propagation reasoning. In general, it appears that continuous motion can usually be modeled by velocity propagation while intermittent motion is best approached by force propagation.
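    A connection frame, as described, pairs two objects with the kind of contact between them and determines which quantity propagates across the connection. A minimal sketch, in which the `kind` vocabulary and the propagation rule are simplified assumptions rather than the paper's representation:

```python
from dataclasses import dataclass

@dataclass
class ConnectionFrame:
    """Models the spatial relation between two objects and which
    quantity (force or velocity) it transmits. The two-kind
    vocabulary below is a simplification for illustration."""
    obj_a: str
    obj_b: str
    kind: str  # "rigid" (continuous contact) or "intermittent"

    def propagates(self, quantity):
        if self.kind == "rigid":
            return quantity == "velocity"  # continuous motion
        return quantity == "force"         # intermittent contact

gear_mesh = ConnectionFrame("gear1", "gear2", "rigid")
ratchet   = ConnectionFrame("pawl", "wheel", "intermittent")

assert gear_mesh.propagates("velocity")
assert ratchet.propagates("force")
assert not ratchet.propagates("velocity")
```

    Simulation then amounts to propagating velocities through rigid connections and forces through intermittent ones, mirroring the abstract's closing observation.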

  • Discovery of probabilistic rules for prediction

    Publication Year: 1989 , Page(s): 223 - 229
    Cited by:  Papers (1)

    An inductive learning algorithm is presented for analyzing the inherent patterns in a sequence and for predicting future objects based on these patterns. This algorithm is divided into three phases: detection of underlying patterns in a sequence of objects; construction of rules, based on the detected patterns, that describe the generation process of the sequence; and use of these rules to predict the characteristics of the future objects. The learning algorithm has been implemented in a program known as the OBSERVER, and it has been tested with both simulated and real-life data. The experimental results show that the OBSERVER is capable of discovering hidden patterns and explaining the behavior of certain sequence-generating processes that a user is not immediately aware of or does not fully understand. For this reason, the OBSERVER can be used to solve complex real-world problems where predictions have to be made in the presence of uncertainty.
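    The three-phase scheme (detect patterns, build rules, predict) can be illustrated with a far simpler first-order successor model; the OBSERVER's actual pattern language is much richer than this sketch:

```python
from collections import Counter, defaultdict

def learn_rules(sequence):
    """Phase 1-2: count successor frequencies, yielding rules of
    the form 'after x, predict y with probability p'."""
    succ = defaultdict(Counter)
    for x, y in zip(sequence, sequence[1:]):
        succ[x][y] += 1
    return succ

def predict(rules, last):
    """Phase 3: predict the most probable next object and the
    probability the rules assign to it."""
    counts = rules[last]
    y, n = counts.most_common(1)[0]
    return y, n / sum(counts.values())

seq = list("abcabcabd")           # hypothetical object sequence
rules = learn_rules(seq)

assert predict(rules, "a") == ("b", 1.0)
y, p = predict(rules, "b")
assert y == "c" and abs(p - 2/3) < 1e-9
```

    Attaching a probability to each rule is what lets such a predictor operate, as the abstract puts it, in the presence of uncertainty.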

  • A knowledge-based approach to real time signal monitoring

    Publication Year: 1989 , Page(s): 133 - 140
    Cited by:  Papers (4)  |  Patents (1)

    A method is presented for describing, observing, and classifying phenomena in signal data. The approach consists of two main parts: a declarative formalism for describing the events of interest and how they relate to real-world phenomena, and a runtime agent that utilizes these descriptions to detect and classify observations made from real-time signals. A prototype implementation of this approach, called the Observation Classification System (OCS), has been developed and has been used in applications ranging from experiment monitoring to data quality analysis.

  • Extending CATS: mixed-initiative inferencing

    Publication Year: 1989 , Page(s): 107 - 114

    CATS is an architecture for building diagnostic expert systems. The authors describe an extension to CATS to support mixed-initiative inferencing. Currently the user cannot volunteer any information other than that requested by the system. Mixed-initiative inferencing will allow the user to do this. Issues discussed are: when the user can volunteer information; what information the user can volunteer; how the system treats that information; and what the system must do when the user-supplied information conflicts with the system's current knowledge.

  • Managing constructive induction using optimization and test incorporation

    Publication Year: 1989 , Page(s): 191 - 197
    Cited by:  Papers (1)

    Constructive induction is the creation of new terms for describing and learning concepts more effectively. An approach to the problem that uses inductive test incorporation, for fast and effective assessment of terms, is presented. The authors investigate specific effects of system components using two domains: a well-known medical problem, and the relatively unsolved problem of protein folding. The new-term constructor borrows techniques from previous systems for constructive induction and multiple-objective optimization. It utilizes management and test incorporation techniques to control domain-dependent and domain-independent heuristics to extend the concept language. The system learns how to be effective by observing and apportioning credit to the various subcomponents used to generate terms. The system dynamically decides which terms to retain based on a flexible scheme for evaluating them at low cost; the scheme uses variable sampling and different measures of the effectiveness of new terms. Results are presented that show the extent to which each of the authors' techniques aids concept learning, in terms of accuracy and speed.

  • MAPCon: an expert system to configure communications networks

    Publication Year: 1989 , Page(s): 23 - 28
    Cited by:  Papers (6)  |  Patents (1)

    MAPCon is an expert system that performs offline parameter configuration for local area networks that use MAP (the Manufacturing Automation Protocol). A description is given of the configuration task in general and MAPCon in particular, and its performance is described as a function of network size. MAPCon is distinct from most other applications of expert systems technology to communications in four ways. First, it deals with MAP networks, which (unlike many commonly used networking schemes) are compatible with the OSI seven-layer model of communications. Secondly, it does parameter configuration, rather than routing, diagnosis, or identifying which components need to be included in a given station. Thirdly, it relies heavily on a frame representation of deep knowledge about the system, and uses rules mainly to propagate deep-level constraints among related parameters rather than to capture shallow heuristics. Finally, it combines both synthetic and analytic reasoning.

  • Planning for multiple goals with limited interactions

    Publication Year: 1989 , Page(s): 263 - 270

    A description is presented of the problem of finding an optimal global plan for a multigoal planning problem by combining plans that solve the individual goals. This problem arises in process planning, when one tries to combine the process plans for individual machinable features into a global process plan for the entire workpiece. It also arises in the planning of robot actions, when one tries to combine the plans for the handling of individual objects into a global plan. The problem can be shown to be NP-hard in general. However, if only one plan is available for each goal, then the problem can be solved in time O(n²) by imposing restrictions on the kinds of intergoal interactions involved. If more than one plan is available for each goal, the problem is still NP-hard even if these restrictions are satisfied, but in the case described, a heuristic search algorithm has been developed which performs well.
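    The single-plan-per-goal case can be illustrated with a greedy merge that reuses operations shared between goals; the nested scan over already-placed steps gives the quadratic flavor. The operation names and the assumption that identical steps can always be shared are illustrative simplifications of the paper's interaction model:

```python
def merge_plans(plans):
    """Greedy merge of per-goal plans into one global plan:
    append each goal's steps in order, reusing a step that is
    already in the global plan instead of duplicating it.
    The membership scan per step is O(n), hence O(n^2) overall."""
    global_plan = []
    for plan in plans:
        for step in plan:
            if step not in global_plan:
                global_plan.append(step)
    return global_plan

# Two hypothetical machinable features share setup and pilot-drill steps.
p1 = ["setup", "drill-pilot", "bore-hole-A"]
p2 = ["setup", "drill-pilot", "mill-slot-B"]

assert merge_plans([p1, p2]) == \
    ["setup", "drill-pilot", "bore-hole-A", "mill-slot-B"]
```

    With several alternative plans per goal, one must also choose which plan to merge for each goal, and that choice is where the NP-hardness and the paper's heuristic search come in.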

  • Template-based multi-agent plan recognition for tactical situation assessment

    Publication Year: 1989 , Page(s): 247 - 254
    Cited by:  Papers (5)  |  Patents (2)

    A description is given of the concept of a multiagent plan-recognition model to assist a naval tactical decision maker in interpreting the activity of enemy ships and aircraft (agents) and predicting their goals and plans. Multiagent templates have been devised to capture the tactical decision-maker's knowledge of how the plans of individual agents interact in pursuit of shared goals. The templates provide a flexible knowledge representation that allows the plan-recognition model to reason about different situation-specific variations in the ways multiagent plans may unfold. Template instances form hypotheses of future agent actions and are created in a top-down/bottom-up manner. The hypotheses are then updated based on observations of agent behavior. The plan-recognition process features mechanisms to curb the proliferation of hypotheses possible in multiagent domains.

  • Flexible reuse of plans via annotation and verification

    Publication Year: 1989 , Page(s): 37 - 43
    Cited by:  Papers (1)

    An approach is presented for flexible reuse of old plans in the presence of a generative planner. In this approach, the planner leaves information relevant to the reuse process in the form of annotations on every generated plan. To reuse an old plan in solving a new problem, the plan along with its annotations is mapped into the problem. A process of annotation verification is used to locate applicability failures and suggest refitting tasks. The planner is then called upon to carry out the suggested modifications, i.e. to produce an executable plan for the new problem. This integrated approach obviates the need for any extra domain knowledge (other than that already known to the planner) during reuse and thus affords a relatively domain-independent framework for plan reuse. Details are presented of annotation and plan reuse in PRIAR, a plan-reuse system based on this approach. The authors believe that their approach can be profitably used by generative planners in many application domains.

  • Manufacturing planning and scheduling: where we are and where we need to be

    Publication Year: 1989 , Page(s): 13 - 19
    Cited by:  Papers (4)

    The lack of robust geometric reasoning components in most current process planning expert systems is discussed. The lack of metaknowledge in most existing production-scheduling systems is examined. The large amount of effort required to bring a research system into daily factory use is detailed. Finally, the large amount of effort needed to integrate intelligent planning and scheduling is considered.

  • Diagnosis and repair of software performance problems using assumptive truth maintenance

    Publication Year: 1989 , Page(s): 165 - 171
    Cited by:  Papers (2)

    The FACS Advisor for System Tuning (FAST) is an expert system for tuning the performance of software associated with the Facility Assignment and Control System (FACS). FACS is a very large transaction processing application consisting of close to one million lines of C source code and requiring a dedicated mainframe. FACS users expect fast response times and high system throughput, but the size and complexity of the application have turned FACS performance tuning into a formidable task. The FAST expert system encodes software-tuning expertise to assist production FACS sites in analyzing and improving the performance of their systems. The multiparadigm implementation of FAST includes frame-based knowledge representation with multiple inheritance, data-driven reasoning for diagnosis, and assumption-based reasoning for determining optimal performance tuning recommendations.

  • Classification-based inferences in retrieving information from a database of scientific facts

    Publication Year: 1989 , Page(s): 75 - 81
    Cited by:  Papers (1)

    The authors explain how complex chains of inferences can be accomplished by representing existentially quantified sentences and concepts denoted by restrictive relative clauses as classification hierarchies. They first describe the representation structures which make possible the inferences, and then they explain the algorithms which draw the inferences from the knowledge structures. All the ideas explained have been implemented and are part of the information retrieval component of SNOWY, a program which understands scientific paragraphs.

  • Evidence integration for 3D object recognition: a connectionist framework

    Publication Year: 1989 , Page(s): 55 - 63
    Cited by:  Papers (1)  |  Patents (1)

    The authors present recent work on a vision system designed to recognize 3-D objects in a depth map. The system was originally capable of recognizing parametric surfaces of various types. The authors have added the ability to find parametric surface intersection curves and use these disparate types of information to index into an object database. From a depth map containing one or more objects, local surface and surface intersection curve estimates are determined. These are used in a series of layered and concurrent parameter-space transforms to extract the surface and intersection curves present in the image. Any one transform computes only a partial geometric description that forms the input to the next transform. The final transform is a mapping into an object database. An iterative refinement technique, motivated by work in connectionist systems, is used to integrate the evidence at each level. Fundamentally different types of evidence can be simultaneously extracted, be mutually supportive in intermediate levels of the recognition process, and cooperate to form a consistent interpretation of the image. Other features discussed include the modularity and consistency of the architecture.
