
Proceedings of the Seventh IEEE Conference on Artificial Intelligence Applications, 1991

Date: 24-28 Feb. 1991


Displaying Results 1 - 25 of 72
  • Proceedings. Seventh IEEE Conference on Artificial Intelligence Applications (Cat. No.91CH2967-8)

  • Uncertainty reasoning in Prolog with layered meta-interpreters

    Page(s): 398 - 402

    Proposes an architecture for uncertainty reasoning in rule-based systems in Prolog. Previous work has assumed a predefined calculus for the propagation of uncertainty, has been restricted to a single line of reasoning, or has disallowed negation. The authors identify the issues that need to be considered both for handling negation and for taking a more global view. They adopt a pragmatic approach and argue that reasoning with uncertainty can be achieved by a flexible meta-architecture suitable for describing and reasoning with different representations of uncertainty, combining multiple lines of reasoning, and reasoning with unknowns and negation. They show that a layered meta-interpreter is a flexible architecture that can accommodate different sets of assumptions for incorporating uncertainty in rule-based systems in Prolog.

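The layered-uncertainty idea above can be illustrated with a toy interpreter. This is a minimal sketch, not the paper's architecture or calculus: the rule strengths, the fact base, and the `min` combination for conjunctions are all assumptions chosen for illustration.

```python
# Hypothetical sketch of a meta-interpreter that propagates certainty
# factors through rules. The combination functions below are one possible
# calculus, not the one proposed in the paper.

RULES = {
    # head: (list of body goals, rule strength)
    "flu": (["fever", "cough"], 0.8),
}
FACTS = {"fever": 0.9, "cough": 0.7}

def solve(goal):
    """Return a certainty in [0, 1] for goal, combining facts and rules."""
    if goal in FACTS:
        return FACTS[goal]
    if goal in RULES:
        body, strength = RULES[goal]
        # Combine conjunctive subgoals with min, then attenuate by the
        # rule's own strength.
        body_cf = min(solve(g) for g in body)
        return body_cf * strength
    return 0.0  # unknown goals get zero certainty

flu_cf = solve("flu")
```

A layered version would stack such interpreters, with each layer free to swap in a different combination function or a treatment of negation.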
  • Proving properties of rule-based systems

    Page(s): 81 - 88

    A deductive method is applied to the validation of rule-based systems. A number of validation tasks, including the detection of errors or anomalies, the proof of termination, the verification of properties, and the generation of test cases, are all translated into conjectures. If the conjectures in an appropriate system theory are proved, the corresponding validation task is accomplished. The method applies to nonmonotonic rule systems, which may delete elements from working memory, as well as monotonic ones, which do not. Validation conjectures are proved or disproved in a new theorem-proving system, SNARK, which has facilities for reasoning in first-order theories with mathematical induction. The system has already been applied to establish properties of a number of simple rule-based systems.

  • From parsing to database generation: applying natural language systems

    Page(s): 18 - 24

    A new architecture for database generation that combines several levels of language processing is described, and the author examines the relevant technique in each layer. The new methods for database generation break the task roughly into three stages, emphasizing different types of processing in each phase. The preprocessing, or corpus-driven, tasks use empirical results from large text samples to guide processing. The analysis tasks perform traditional parsing and semantic interpretation. Postprocessing actually produces the database templates. Syntactic, semantic, and domain knowledge can affect processing in each stage. The author explains why database generation needs these separate stages of analysis. Some of the critical methods in each category are also described.

  • KRS-a hybrid system for representing knowledge in knowledge-based help systems

    Page(s): 129 - 133

    A description is given of the hybrid knowledge representation system (KRS). It has been designed for representing knowledge in computer-aided-software-engineering (CASE) tools, especially help systems. It supports easy encoding of the knowledge by a variety of constructs and performs efficient hybrid inferences. Knowledge acquisition is also supported by a classification procedure. Compared with other approaches for representing and retrieving information, such as hypertext or library science techniques, it supports semantic-based information retrieval, as well as a natural integration of a natural language processor. Its convenience for domain modeling in a help system for the LaTeX formatter is shown.

  • Influence networks: a reactive planning architecture

    Page(s): 354 - 360

    An architecture for reactive planning, called an influence network, is presented. Most approaches to planning involve goal decomposition, action generation, and action fusion, the latter being the most difficult. In an influence network, fusion involves influences, rather than fully specified actions. An influence serves to constrain or bias the eventual selection of actions. Plans to satisfy individual goals generate influences. These influences are validated as they are generated to ensure that they are consistent with influences already accepted. After all influences have been processed, actions are generated which satisfy the influences that have been accepted. Influence networks provide a simple method of action fusion by delaying commitment while not delaying validation. As a result, plan repair can be accomplished by the individual planning agents as their plans are being constructed, rather than being performed at the system level.

  • Automatic cluster assignment for documents

    Page(s): 25 - 28

    A knowledge-based approach to classification is reported. The proposed methodology uses personal construct theory for interviewing a domain expert to elicit classification knowledge. This interview results in raw data which, on analysis, yields the relationship between different concepts from a user perspective. After finding the relationships, the user is asked to delineate the boundaries which enclose like concepts. With such a grouping of concepts, the authors develop a methodology to establish a relationship between the concepts and the index terms constituting document representations. This relationship is employed to assign a document to the most appropriate cluster. The knowledge elicited from the expert is mapped to system-observable features of documents to develop a classification. The techniques developed are experimentally validated.

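The final assignment step described above can be sketched as a nearest-cluster match. This is an illustrative guess at the mechanics, assuming clusters are represented as sets of index terms; the elicitation via personal construct theory is not modeled, and the cluster names and terms are hypothetical.

```python
# Assumed representation: each cluster is a set of index terms elicited
# from the expert; a document is assigned to the cluster with the
# greatest term overlap.

CLUSTERS = {
    "databases": {"query", "index", "relation"},
    "vision":    {"image", "segment", "edge"},
}

def assign(doc_terms):
    """Assign a document (as a set of index terms) to its best cluster."""
    scores = {name: len(terms & doc_terms) for name, terms in CLUSTERS.items()}
    return max(scores, key=scores.get)

best_cluster = assign({"image", "edge", "query"})
```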
  • Use of procedural programming languages for controlling production systems

    Page(s): 71 - 75

    A new approach, called UPPL, uses procedural programming languages, such as Lisp and C, to explicitly describe the plans for controlling production systems. As the key idea of the implementation, the authors view production systems as a collection of concurrent rule processes, each of which continuously monitors the global database and executes actions when its conditions match database entries. To bridge control plans and rule processes, the authors introduce `procedural control macros' (PCMs) into procedural languages. The PCMs are designed based on the communicating sequential process (CSP) communication commands developed by C.A.R. Hoare (1978). Since PCMs include nondeterministic properties, the execution order of rules cannot be completely determined in advance, but is guided by the PCMs at run-time. The PCMs are functionally simple and easy to implement, but they can effectively control production systems when combined with the original control facilities of procedural languages.

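The underlying production-system cycle, in which each rule monitors a global working memory and fires when its conditions match, can be sketched as follows. The PCMs and CSP-style communication are not modeled here; the rule format and the refraction test (a rule does not refire once its additions are present) are assumptions.

```python
# Minimal forward-chaining production-system cycle. Working memory is a
# set of symbols; each rule is (name, condition set, elements to add).

wm = {"start"}
rules = [
    ("r1", {"start"}, {"a"}),
    ("r2", {"a"}, {"b"}),
]

def run(memory, rule_list):
    """Fire rules until quiescence; return the final working memory."""
    fired = True
    while fired:
        fired = False
        for _name, cond, adds in rule_list:
            # Fire when the condition matches and the rule would add
            # something new (a crude refraction test).
            if cond <= memory and not adds <= memory:
                memory |= adds
                fired = True
    return memory

result = run(wm, rules)
```

In the paper's scheme, the order in which eligible rules fire would be constrained by PCMs embedded in a procedural control program rather than by this fixed loop.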
  • An intelligent assistant for financial hedging

    Page(s): 168 - 174

    The authors describe the knowledge representation used to develop a prototype expert system for financial hedging. This representation captures the deep domain knowledge that experts use to reason about hedging decisions. It allows for reasoning qualitatively based on first principles using the fundamental quantitative valuation models that characterize each financial instrument. It also uses object-oriented concepts and inheritance to minimize the effort needed to set up the knowledge base and keep it current. It includes a calculus for derivation of qualitative knowledge of one-dimensional-order, which allows it to solve problems where optimality constraints are qualitative. It is flexible enough to reason in terms of the basic principles of risk assessment.

  • Fault diagnosis of a sewage plant

    Page(s): 120 - 123

    A project whose aim is the development of an expert system for managing and diagnosing a sewage plant is presented. After a short description of how the knowledge acquisition process took place, the authors explain why the popular model-based diagnosis approach cannot be applied to the problem domain. Instead, they consider associative knowledge to solve the diagnostic problem. In order to adequately express knowledge about the structure of the sewage plant, knowledge about well-understood subprocesses, and associative knowledge for the diagnosis of the sewage plant, the authors designed the MOTESDM tool, which supports hybrid knowledge representation. MOTESDM allows separation of associative knowledge from structural knowledge concerning the technical system.

  • Multi-sensor image interpretation using laser radar and thermal images

    Page(s): 190 - 196

    The authors present a knowledge-based system called AIMS (automatic interpretation system using multiple sensors) to interpret registered laser radar and thermal images. The objective is to detect and recognize man-made objects at kilometer range in outdoor scenes. Various sensing modalities (range, intensity, velocity, and thermal) are used to improve both image segmentation and interpretation. Low-level attributes of image segments (regions) are computed by the segmentation modules and then converted to databases in the KEE format. KEE is a commercial package for expert system shell development. The interpretation system applies forward chaining to derive object-level interpretations from databases. Segments are grouped into objects and then objects are classified into predefined categories. AIMS transfers nonsymbolic processing tasks to a concurrent service manager (program). Therefore, tasks with different characteristics are executed using different software tools and methodologies. Experimental results using real data are presented.

  • Plan execution in dynamic and unanticipated environments

    Page(s): 361 - 367

    The construction of planning and execution systems for complex real-time operational systems should be based on an understanding of what happens at the time of plan execution, because much of the information to constrain reasoning and decisions is not available prior to execution. The author discusses the results of research on the execution of procedures of a knowledge-rich domain (operation of a nuclear power plant) in a dynamic environment, and concludes that planning+execution is a viable model where planning is better viewed as a process for producing a `road map' and that the research needs to focus more on the role of the executor as an intelligent navigator. It is also shown that an understanding of domain-specific execution behavior can lead to strategies for dealing with the incompleteness problem. The author briefly describes salient features of the DPS (dynamic procedure selection and synthesis) approach and shows how some of the aspects of incompleteness can be addressed by exploiting both the knowledge richness of a domain and architectural features of plan execution behavior.

  • Optimizing knowledge-based system design

    Page(s): 269 - 274

    The authors present an approach to optimizing the design of a knowledge-based design (KBD) system so that it produces optimal or near-optimal results. Evaluating such a system is subjective in nature, however, as a KBD system's performance is often judged by the examination of its designs by experts. One alternative, as shown here, is to apply the simulated annealing (SA) technique in designing a KBD system. During the design of a KBD system, two designs, one by the KBD system and one by an SA system, are independently generated. The solution generated by the SA system is then used to critique the solution from the KBD system. To some extent, this brings an objective measure into the design of a KBD system. The authors investigate this SA-KBD approach in a telephone cable network design domain, and the results are encouraging.

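The SA half of the SA-KBD pairing can be sketched generically. This is a standard annealing loop on a toy one-dimensional cost function, not the paper's cable-network formulation; the cooling schedule, step size, and all names are assumptions.

```python
import math
import random

def anneal(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=200, seed=0):
    """Generic simulated annealing: accept worsening moves with
    probability exp(-delta / temperature), cooling geometrically."""
    rng = random.Random(seed)
    x, t = x0, t0
    for _ in range(steps):
        y = neighbor(x, rng)
        delta = cost(y) - cost(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
        t *= cooling
    return x

# Toy problem: minimize (v - 3)^2 starting far from the optimum.
best = anneal(cost=lambda v: (v - 3) ** 2,
              neighbor=lambda v, r: v + r.uniform(-1, 1),
              x0=10.0)
```

In the SA-KBD scheme, the cost of the annealed solution would then serve as a benchmark against which the KBD system's design is critiqued.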
  • Extracting company names from text

    Page(s): 29 - 32

    A detailed description is given of an implemented algorithm that extracts company names automatically from financial news. Extracting company names from text is one problem; recognizing subsequent references to a company is another. The author addresses both problems in an implemented, well-tested module that operates as a detachable process from a set of natural language processing tools. The algorithm combines heuristics, exception lists, and extensive corpus analysis. It also generates the most likely variations that those names may go by, for use in subsequent retrieval. Tested on over one million words of naturally occurring financial news, the system has extracted thousands of company names with over 95% accuracy (precision) compared to a human, and succeeded in extracting 25% more companies than were indexed by a human.

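A toy version of the extraction heuristics might look like the following. The regular expression, the designator list, and the short-form rule are illustrative assumptions, not the author's algorithm or exception lists.

```python
import re

# Heuristic: capitalized word sequences ending in a corporate designator.
DESIGNATORS = r"(?:Inc|Corp|Co|Ltd)\.?"
PATTERN = re.compile(r"((?:[A-Z][A-Za-z]+ )+" + DESIGNATORS + ")")

def extract(text):
    """Map each extracted full name to a likely short-form variant
    for recognizing subsequent references."""
    variants = {}
    for name in PATTERN.findall(text):
        # Short form: drop the designator ("Acme Widgets Inc." -> "Acme Widgets").
        short = re.sub(r"\s*" + DESIGNATORS + r"$", "", name)
        variants[name] = short
    return variants

found = extract("Shares of Acme Widgets Inc. rose; Acme Widgets later fell.")
```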
  • Set-oriented constructs for rule-based systems

    Page(s): 76 - 80

    Set-oriented constructs for forward chaining rule-based systems are presented. These constructs allow arbitrary amounts of data to be matched and changed within the execution of a single rule. Second-order tests on the data can be included in the match. The ability of a single rule to directly access all of the data to be manipulated eliminates the need for unwieldy control mechanisms and marking schemes. Adding this expressivity to rule-based languages enhances their value to expert system developers and their capabilities as database programming languages. Additionally, these set-oriented constructs provide a basis for more efficient implementations of rule-based systems, for both the traditional memory-based system and the emerging disk-based ones. The approach described has been implemented using an extended version of the Rete network algorithm.

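The set-oriented idea, one rule matching and changing an arbitrary set of elements with a second-order test on the matched set, can be sketched in plain Python. The rule syntax and the working-memory format here are assumptions; the paper's constructs extend a Rete-based language.

```python
# Working memory as a list of attribute dictionaries.
wm = [{"type": "order", "qty": q} for q in (5, 12, 7)]

def fire_set_rule(memory):
    """One firing matches the whole set of large orders, applies a
    second-order test on the aggregate (total quantity), and changes
    every matched element at once."""
    matched = [w for w in memory if w["type"] == "order" and w["qty"] > 6]
    if sum(w["qty"] for w in matched) > 15:
        for w in matched:
            w["flag"] = "large-batch"
    return memory

fire_set_rule(wm)
```

A conventional rule language would need a control loop or marking scheme to visit the matched elements one firing at a time; here the aggregate test and the bulk update happen inside a single rule.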
  • An assumption-based scene interpretation system that solves multiplicity of scene description

    Page(s): 176 - 182

    A scene interpretation system that solves the multiplicity problem of scene description is described. To recognize a structured object, the system must segment lines or arcs from fragmented image features and group them into structured parts of the object. The problem is that, in machine vision, the segmentation and grouping processes cannot by themselves determine the correct parts of the object uniquely. To address this problem, the authors have prototyped an assumption-based scene interpretation system (ASIS) with an inference framework that finds a set of consistent hypotheses implying the goal and observed facts. The hypothetical reasoning scheme of ASIS is realized by a rule base with an assumption-based truth maintenance system (ATMS). By combining these, ASIS obtains a globally plausible and consistent interpretation while preserving alternative hypotheses when interpreting typical indoor scenes.

  • A knowledge representation for model-based high-level specification

    Page(s): 124 - 128

    A conditional semantic network as knowledge representation for model-based high-level specification in a plant control programming domain is proposed. In this representation, the semantics of relations varies according to the plant situation, while it is determinate in conventional networks. A plant model enables designers to design control programs using natural terms and concepts of a plant. In order to interpret user-defined high-level specifications, the model includes knowledge about plant control in addition to plant components. The plant control knowledge is a type of constraint relation which exists among the plant components' actions and states. It varies according to the plant situation. The plant model, including the plant control knowledge, is appropriately described in the conditional semantic network representation.

  • Automating the presentation of information

    Page(s): 90 - 97

    The problem of creating effective computer displays of data contained in large information systems is addressed. The authors describe SAGE, an intelligent system which assumes presentation responsibilities for other systems by automatically creating graphical displays which express the results they generate. They describe SAGE's architecture and the methods by which it creates presentations. Implementing SAGE required identifying and representing a larger set of task and data characteristics than had been explored previously. The authors also describe several prototypical decision support and management systems which use SAGE in different ways to support the presentation needs of end users and system developers.

  • PEX: a reactive procedure based decision maker [Satellite control application]

    Page(s): 368 - 371

    Presents a system called PEX (Procedure Expert), relying on M. Georgeff's ideas (1985). The authors introduce the concept of `integrity context' as a procedure relevance criterion to increase reactivity. The authors envisage a procedure-based process control system. The purpose of the procedural decision maker (PDM) is to make the physical system reach a given objective. It exploits a procedural model which maps objectives to behavioral information (procedures) for deciding in real time what commands should be emitted and when. The authors consider reactivity as first (R1) the aptitude to dynamically choose the appropriate procedures according to the actual system's state, and second (R2) the aptitude to monitor the relevance of the procedure being executed with respect to the current system's behavior. The authors show how to confer on the PDM the potential to be reactive by allowing an appropriate knowledge representation, and suggest a clean mechanism to take advantage of it. The analysis consists of enriching the Knowledge Areas (KA) representation language and designing a smart KA interpreter specialized in particular in reactivity of the R2 type.

  • Combining rules and state objects in a configuration expert system

    Page(s): 275 - 279

    The authors have designed and implemented an expert system in the area of mechanical design that combines the rule-based paradigm of problem solving with a paradigm based on search in a state space. KONEX uses the state space to represent intermediate designs and applies production rules to move to successor states. The authors have developed strategies to keep the search space at a manageable size. The search tree is expanded incrementally and the number of nodes is reduced at each step. The authors describe the domain in which KONEX operates, i.e. the domain of CNC-machines. They then concentrate on how states are represented by control-objects. The advantages of using state spaces with nodes describing partial designs are illustrated with a concrete example.

  • Shape feature abstraction in knowledge-based analysis of manufactured products

    Page(s): 198 - 204

    The authors focus on shape feature recognition based on the differential depth perception filter, which reduces the number of topological entities. By defining features in terms of loops, as opposed to defining them in terms of low-level topological entities, substantial geometric as well as topological variation among features is captured. The search for features is ultimately performed over the space of loops. This search space is reduced first by the filter and then by creating the loops from edges, both of which are significant reductions. This helps contain the combinatorial explosion. Considerable success in feature recognition has been achieved for parts with varying topology and geometry and for parts with a relatively large number of topological entities.

  • Understanding causal feedback using the Strategic Planning System (SPS)

    Page(s): 372 - 375

    The Strategic Planning System (SPS) is a causal modeling tool that gives planners a way to express underlying causal relationships and feedback loops in a strategic plan and determine the effects and side-effects of different strategic alternatives. SPS provides an environment where situations can be defined, and plans refined, abstracted, analyzed, critiqued, and then further refined through an explicit, mutually understandable framework. SPS is a tool in which the underlying structure of a model is elucidated through qualitative explanations and incorporates systems dynamics concepts (nonlinear systems with feedback). Numerical tools are then used afterward to further refine the models. SPS assists planners in understanding the underlying structure and implications of the business model they have built and assists them in explaining these concepts to others. Artificial intelligence techniques were used to understand the model and build explanations.

  • Using a description classifier to enhance deductive inference

    Page(s): 141 - 147

    Descriptions are given of some of the emerging techniques and uses of classifier-based reasoning systems, specifically as they apply to the LOOM knowledge representation system. The author asserts that current generation expert system tools fail to achieve a satisfactory integration of frame knowledge and rule knowledge. He then describes a class of languages, exemplified by LOOM, that combine descriptions and rules to form a hybrid logic that does achieve a satisfactory level of integration. The use of classifier technology enables a form of unification over descriptions that fills a gap present in the frame-plus-rule (F+R) technology. In addition, classification-based inference technology is more powerful than the inference technology found in languages such as (pure) Prolog. A classifier's ability to automatically organize definitions and to detect many kinds of inconsistency can significantly benefit the task of knowledge acquisition. The unique capabilities of the classifier can be applied to enhance existing programming paradigms. The author highlights specific enhancements to the production rule and object-oriented programming paradigms.

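The classifier's core operation, deciding subsumption between descriptions and placing new definitions into a hierarchy, can be caricatured with sets of primitive features. This is far simpler than LOOM's description logic (no roles, no number restrictions), and the concept definitions below are hypothetical.

```python
# Concepts as conjunctions of primitive features: one concept subsumes
# another iff its features are a subset of the other's.

CONCEPTS = {
    "person": {"animate"},
    "parent": {"animate", "has-child"},
}

def subsumes(general, specific):
    """True iff `general` subsumes `specific` under this toy semantics."""
    return CONCEPTS[general] <= CONCEPTS[specific]

# "Classifying" a new definition amounts to computing its subsumption
# relationships against the existing hierarchy.
CONCEPTS["grandparent"] = {"animate", "has-child", "child-has-child"}
parents_of_grandparent = [c for c in ("person", "parent")
                          if subsumes(c, "grandparent")]
```

Even in this toy form, the classifier detects placements a frame system would leave to the programmer: `grandparent` lands under `parent` automatically, without an explicit link.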
  • MEXSES: an expert system for environmental screening

    Page(s): 294 - 298

    A rule-based expert system for environmental impact assessment is described which was implemented for the analysis of water resources development projects. The system uses hierarchical checklist questionnaires, organized by project types and problem classes, and an assessment procedure based on rules and descriptors that allow the analyst to classify subproblems in terms of their expected environmental impacts and to aggregate them into an overall project assessment. Rules and descriptors are linked with cross-references. The inference engine uses look-ahead preprocessing for the dynamic pruning of the inference tree, and offers both forward and backward chaining functions for standard problem assessment and an alternative hypothesis testing feature. Different modes of interaction, all based on a fully menu-driven graphical user interface implemented in X Windows, offer alternative levels of verbosity with the optional display and selection of rules by the user. The software system includes a project database and an integrated geographical information system for the management of spatial environmental data.

  • Example-guided optimization of recursive domain theories

    Page(s): 240 - 244

    The authors investigate the utility of explanation-based learning in recursive domain theories and examine the cost of using macro-rules in these theories. As a first step in producing effective explanation-based generalization (EBG) algorithms, the authors present a new algorithm for performing source optimization of recursive domain theories. The algorithm, RSG (recursive-structure generalizer), uses a training example as bias and generalizes the control knowledge encoded in the example's derivation tree to produce a more efficient formulation of the original domain theory. The control knowledge involves control of both clause and binding selection. The authors demonstrate the effectiveness of the method on planning problems in situation calculus. They show that in most cases one must know the future problem distribution a priori to produce an optimal reformulation.
