
1989 IEEE International Workshop on Tools for Artificial Intelligence: Architectures, Languages and Algorithms

Date: 23-25 October 1989


Displaying results 1-25 of 92
  • IEEE International Workshop on Tools for Artificial Intelligence. Architectures, Languages and Algorithms

  • The role of knowledge in an active information environment

    Page(s): 376 - 385

    Innovative concepts that influence the design and development of active information systems are presented, along with the data representation, reasoning, and language facilities for such a system. It is argued that the notions of knowledge and metaknowledge improve overall system functionality by supporting the user during problem solving and by providing milestones for automating limited, well-understood portions of the problem domain or administrative services. The prime purpose of an active information system is to support users during problem solving by exploiting information relevant to decision making. This work is also closely associated with research on symbiotic man-machine systems. The kernel of an active information system is its object-oriented modeling substrate, based on the high-level data model Cosmos, which offers numerous advanced features and can represent factual and conceptual knowledge about a specific problem domain.

  • Adaptive DB schema evolution via constrained relationships

    Page(s): 393 - 398

    A novel object-oriented data model and a tailored query language that enable the construction of evolving database (DB) schemas according to information provided by transaction monitoring are presented. The data model and query language are based on the notion of constrained relationships, which enable interconnections between DB objects under constraints specified by rules. These relationships capture frequent transactions with the database and freeze them in the database schema, so the schema can evolve in response to changing retrieval requirements. Furthermore, these frequent transactions are mapped at a lower level between object instances, so that future transactions may be answered more quickly. The schema adaptation mechanism is based on the identification of commonly used, simple query patterns.

  • The role of learning in logic synthesis

    Page(s): 252 - 258

    A model of logic synthesis is proposed that uses technology-specific design rules and extends rule-based search to functional decomposition and technology mapping. The problem of technology independence is addressed by adding a model of learning that automates the generation of design rules. While this model improves design quality by taking advantage of the target technology, it is not robust to technology changes. To improve robustness, the model is augmented with two learning components: one for acquiring rules that make use of physical cells in a technology library and another for acquiring rules that make use of appropriate design styles. These components are related to work on the learning of macro-operators and on explanation-based learning.

  • Fuzzy logic based tools for classification and reasoning with uncertainty

    Page(s): 572 - 577

    Two methods are described: one for classification processes affected by uncertainty in both the description phase and the class-definition phase, and one for approximate reasoning in decision-making activities that combine certain and uncertain knowledge. Based on these two frameworks, an integrated set of automated fuzzy-logic-based tools for classification and decision making has been developed and embedded in a classical logic programming environment, yielding a homogeneous and flexible environment for automatically handling certain and uncertain knowledge in classification and decision problems.

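To give a flavor of the kind of machinery involved, the sketch below is a generic max-min fuzzy classifier, not the paper's method: an uncertain observation (feature membership degrees in [0, 1]) is matched against fuzzy class profiles, conjoining features with min and picking the class with the highest score. All names and data here are illustrative assumptions.

```python
def fuzzy_classify(observation, classes):
    """Score each fuzzy class profile against an uncertain observation.

    observation: dict feature -> observed membership degree in [0, 1]
    classes: dict class_name -> dict feature -> profile degree in [0, 1]
    Each feature contributes min(observed, profile); a class score is the
    min over its features (conjunction), so the best class is the max-min.
    """
    scores = {}
    for name, profile in classes.items():
        scores[name] = min(min(observation.get(f, 0.0), d)
                           for f, d in profile.items())
    return scores

# Illustrative toy data: the observation is fairly round and red, not small.
obs = {"round": 0.9, "red": 0.7, "small": 0.2}
classes = {
    "apple":  {"round": 0.8, "red": 0.6},
    "cherry": {"round": 0.8, "red": 0.9, "small": 0.9},
}
scores = fuzzy_classify(obs, classes)  # "apple" outscores "cherry"
```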
  • Abstract machine LORAP II and experiments in process grain size determination for parallel execution of logic programs

    Page(s): 685 - 692

    The authors propose a distributed multiprocessor execution model, LORAP II, for parallel execution of logic programs. Each processor in the abstract machine can support processes of variable grain sizes and is designed to be competitive with existing sequential implementations in deterministic cases. A set of experiments is presented that establishes the need to determine appropriate process grain sizes for effective parallel processing, and some heuristics are given for compile-time grain-size determination. Experimental results using these heuristics indicate a significant improvement in performance over a similar model (LORAP) supporting fine-grain processes.

  • Improved control strategy for parallel logic programming

    Page(s): 702 - 708

    An attempt is made to formulate an improved control strategy for parallel logic programming systems and to verify its validity using the notion of the alternating Turing machine. The proposed control strategy combines committed-choice nondeterminism with a control-flow mechanism. It is shown that a natural reduction of the alternating Turing machine under the proposed control strategy yields a deterministic Turing machine whose complexity is linearly related to that of the alternating Turing machine, whereas such a reduction under conventional committed-choice nondeterminism has complexity that is quadratically related to that of the alternating Turing machine.

  • Neural network simulation using INES

    Page(s): 556 - 561

    Describes the Interactive NEtwork Simulation tool (INES), which supports the simulation of multilayer pattern processing. Since the sequence, interconnections, and functional characteristics of the layers depend on the ideas and needs of the user, INES does not assume a special interconnection scheme but makes it possible to set up an interconnection scheme of predefined units visually. The possible interconnections obey some restrictions (rules), constituting a kind of visual programming language. Thus, the programming and simulation language system can be used for the design and evaluation of a pattern-processing neural network computer. Since the graphical editor, the network interpreter, and the modules of the base units are defined independently, users can program their own base units (and even their own interpreter) in the programming language of their choice. By decomposition into separate units, the classical, historically grown (Fortran-based) simulation program becomes a collection of reusable, clearly structured software units. Standard problems such as I/O filter functions and monitors have to be implemented only once, so software productivity is increased.

  • On the robustness of Dempster's rule of combination

    Page(s): 578 - 582

    It is demonstrated by example that Dempster's rule of combination is not robust when combining highly conflicting belief functions. It is shown that Shafer's (1983) discounted belief functions also suffer from this lack of robustness with respect to small perturbations in the discount factor. A modified version of Dempster's rule is proposed to remedy this difficulty.

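The non-robustness under high conflict is easy to reproduce. The sketch below implements the standard, unmodified Dempster rule for mass functions (it is not the paper's proposed remedy) and applies it to Zadeh's classic conflicting-sources example, where the combined mass collapses entirely onto a hypothesis that neither source favored.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) by Dempster's rule."""
    raw = {}
    conflict = 0.0  # K: total mass falling on the empty intersection
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    # Renormalize the surviving mass by 1 - K
    return {s: v / (1.0 - conflict) for s, v in raw.items()}

A, B, C = frozenset("a"), frozenset("b"), frozenset("c")
m1 = {A: 0.99, C: 0.01}  # source 1: almost certain the answer is "a"
m2 = {B: 0.99, C: 0.01}  # source 2: almost certain the answer is "b"
m = dempster_combine(m1, m2)
# All combined belief lands on "c", which both sources considered negligible.
```

A tiny perturbation of either 0.99 changes the result drastically, which is the sensitivity the paper's modified rule is designed to damp.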
  • PM: a metrics driven plan compiler

    Page(s): 677 - 684

    The partial metrics (PM) project, whose goal is to investigate the metrics-driven acquisition of planning knowledge, is discussed. The approach taken can be thought of as reverse compilation: abstract plans are generated from source programs in a given target language, a process that integrates traditional compiler techniques with certain AI (artificial intelligence) paradigms. The current prototype's design is described, and its capabilities are demonstrated by taking an Ada target program generated by a plan in the Programmer's Apprentice system and creating another plan using the PM knowledge compiler. The plan produced by the prototype possesses many of the original plan's features.

  • A parallel architecture for AI nonlinear planning

    Page(s): 51 - 58

    The authors present a resource-level conflict detection and resolution scheme that is combined with a state-level backward planning algorithm and provides efficient conflict detection and global conflict resolution for nonlinear planning. The scheme keeps track of the usage of individual resources during planning and constructs a resource-usage flow (RUF) structure, on the basis of which conflict detection and resolution are accomplished. The RUF structure allows the system to perform minimal, nonredundant operations for conflict detection and resolution. Furthermore, resource-level conflict detection and resolution facilitates problem decomposition in terms of resources, making implementation easy in a parallel and distributed processing environment. Performance analysis indicates that, compared with the distributed NOAH, the proposed architecture has a speedup factor of D(N_a), the average depth of a plan network, where N_a (the total number of action nodes at the completion of planning) and D(N_a) are considerably larger than both the number of resources involved in planning and the number of initial goal states.

  • A knowledge-based environment for the integration of logical and physical testing of VLSI circuits

    Page(s): 259 - 265

    The authors have developed an application environment for VLSI design under which the VLSI design tools as well as the testers can be run. They have also developed a knowledge-based system for the transparent use of various testers from a common intermediate test-pattern language. Under the new environment, the user simulates a design as before and then specifies on which tester the fabricated design should be tested. The tests are performed with minimal user intervention (e.g., powering the circuit up). Upon completion of the physical testing, the system compares the test data to the simulation data and graphically presents discrepancies that may indicate potential errors.

  • I-see: an AI-tool for image understanding

    Page(s): 314 - 320

    A software environment that supports the implementation of image understanding applications is described. The environment, called I-see, is implemented on top of the object-oriented KEE system and focuses on the iconic representation and exploration of visual data. The system offers an interface between the high-level symbolic reasoning mechanisms of KEE and the raw image data, iconically represented image models, and segmentation results. An example problem of interpreting satellite images shows the use of this iconically represented knowledge in the form of a terrain elevation image. In this specific example, the iconic representation could be used directly, without the need for a complementary symbolic description. The iconic and symbolic representation schemes can be used to supplement each other and can be related through the use of label images.

  • Finding and learning explanatory connections from scientific texts

    Page(s): 85 - 90

    A theory for detecting and learning the explanatory connections between sentences in scientific texts is presented, along with a program called SNOWY that embodies the theory. The knowledge in the program is organized around the notions of analytic and empirical knowledge: analytic knowledge encompasses very general rules that are valid across any domain, while empirical knowledge includes rules whose validity is domain dependent. Examples of these rules and their representation are given.

  • Structured matching: a task-specific technique for making decisions

    Page(s): 138 - 145

    The authors describe structured matching informally and give a formal definition of its task and strategy. Structured matching integrates the knowledge and control for making a decision within a hierarchical structure, and it has several desirable characteristics: it is qualitative and tractable, it facilitates knowledge acquisition and explanation, and it explicitly represents decision-making knowledge. The authors describe how structured matching is implemented in the HYPER tool, showing how HYPER corresponds to the formal definition. HYPER is a tool for building problem-solving modules that measure the fit of a hypothesis to a situation; structured matching is a generalization of hypothesis matching. Examples of structured matching in several knowledge-based systems are presented.

  • Toward a paperless development environment

    Page(s): 495 - 498

    Describes the process being used to implement a paperless environment, called TEDIUM, for the design, construction, and maintenance of computer software applications. The underlying paradigm for this research is not one of artificial intelligence (AI); nevertheless, the methods used are equally appropriate for the development of AI tools and environments. TEDIUM is an environment designed to support the development of interactive information systems. The author gives an overview of the problem being addressed, describes an environment that offers a solution to that problem, and discusses the methods used to evaluate and improve the environment.

  • NSA algorithm and its computational complexity-preliminary results

    Page(s): 442 - 446

    To overcome the limitations of models of statistical heuristic search, the author proposes combining nonparametric statistical inference methods with heuristic search. A nonparametric statistical algorithm (NSA) for heuristic search is presented, and its computational complexity is discussed. It is shown that, in a uniform m-ary search tree G with a single goal S_N located at an unknown site at depth N, NSA can find the goal asymptotically with probability one, and the complexity remains O(N(ln N)^2), where N is the length of the solution path.

  • An entity model for conceptual design

    Page(s): 709 - 716

    An effort is made to construct an interactive design-aid environment with a highly semantic description facility and a logically effective verification facility for increasing software productivity oriented toward reuse. The proposed method for design simulation is a semantic technique, based on an effective human interface in the entity-model construction process. The system helps designers understand and verify the design as it proceeds. This is achieved by semantically representing an entity model and performing predicate calculus on its entity characteristics, using rule-based inference from the database for software conceptual description. Application examples are presented.

  • Solving large scale puzzles with neural networks

    Page(s): 562 - 569

    The n-queens problem is solved with Boltzmann machines and a depth-first search (DFS) algorithm. On large-scale problems, the Boltzmann machines found a solution much faster than the DFS; the 1000-queens problem was solved using an energy minimization technique. Polyomino puzzles were also solved with Boltzmann machines and a DFS algorithm. On small-scale problems, the DFS solved these puzzles faster than the Boltzmann machines. Using Gaussian machines, large polyomino puzzles were solved successfully. For example, 36 unique solutions were obtained for the 1000-queens problem, and difficult 5×8, 6×10, and 8×8 polyomino puzzles were solved.

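For readers who want the conventional baseline the paper compares against, the sketch below is a standard backtracking depth-first n-queens solver (a textbook formulation, not the authors' code). It is quick for classic sizes such as n = 8, but this style of exhaustive search is exactly what fails to scale to instances like 1000 queens, motivating the energy-minimization approach.

```python
def nqueens_dfs(n):
    """Count n-queens solutions by depth-first backtracking, one row at a time."""
    count = 0
    cols, diag1, diag2 = set(), set(), set()  # attacked columns and diagonals

    def place(row):
        nonlocal count
        if row == n:          # all rows filled: a complete solution
            count += 1
            return
        for c in range(n):
            if c in cols or row + c in diag1 or row - c in diag2:
                continue      # square attacked by an earlier queen
            cols.add(c); diag1.add(row + c); diag2.add(row - c)
            place(row + 1)
            cols.discard(c); diag1.discard(row + c); diag2.discard(row - c)

    place(0)
    return count

solutions_8 = nqueens_dfs(8)  # the classic 8-queens board has 92 solutions
```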
  • On the combined approach to intelligent backtracking

    Page(s): 490 - 494

    An intelligent backtracking scheme based on both static and dynamic information, called the CIB (combined intelligent backtracking) approach, is presented. At compile time, IBGs (intelligent backtracking graphs) are generated; an IBG is constructed by augmenting a data dependency graph with additional backtracking links for each variable and for different types of backtracking. At run time, the variables causing a failure are identified, and intelligent backtracking is accomplished by following the corresponding links in the IBGs. A comparison with other approaches is made.

  • Acquisition of knowledge sources for natural language processing

    Page(s): 122 - 129

    Tools for facilitating the task of encoding knowledge when developing lexicons and knowledge bases (KBs) are discussed. The specific tools discussed are Lextool (a lexicon development tool) and KBtool (a knowledge-base development tool). These tools, written in Lisp, are the result of an attempt to automate the process of acquiring the data for the benefit of users with varying levels of sophistication; they allow data to be encoded easily and verified automatically. Both the expert and the nontechnical-user aspects of these tools are described.

  • Pose determination using vertex-pairs on the Connection Machine

    Page(s): 512 - 517

    A model-matching scheme for three-dimensional object recognition on a fine-grained data-parallel machine is described. The feature used for model matching is the vertex pair, introduced by J.L. Mundy et al. (1987). The transformation between the features in the three-dimensional model and the two-dimensional scene is described by an affine transformation, which is an approximation to the perspective transformation. Coarse-to-fine histogramming on an n-dimensional grid is used to compute the best transformation between the model and the scene. Performance results for an implementation on the Connection Machine are presented.

  • A parallel architecture for large scale production systems

    Page(s): 27 - 33

    The authors present an architecture, suitable for implementation on a shared-memory multiprocessor system, in which all phases can run in parallel. Running multiple match, execution, and select phases causes subtle synchronization problems which, if not resolved, can lead to altered semantics. The proposed architecture uses a lock and interference manager and a scheduler to resolve possible synchronization conflicts. A new lock is used that provides concurrency beyond the standard two-phase locking of databases, and the conflict-resolution phase has been formalized as a scheduling problem. The approach is conservative in the sense that the scheduler performs careful analysis (interference-avoidance and abort-avoidance tests) to prevent interference, aborts, and blocking.

  • Multilayer of ring-structured feedback network for production system processing

    Page(s): 457 - 464

    The ANN (artificial neural network) approach can be applied to problems in artificial intelligence, in particular to production systems. Among the various types of neural networks, the three-layer ring-structured feedback network with three associative memories is considered to suit the problem domain. Characteristics of the production system paradigm are identified, and mapping strategies are developed on that basis. Two types of representation techniques are studied: local and hierarchical. The local representation can give O(1) pattern-matching time in production systems when an efficient training strategy is used. The hierarchical representation derives features from production systems and constructs a three-dimensional feature space in which a pattern can be uniquely defined by a vector. Simulation results demonstrate that the proposed architecture and mapping strategy can be an efficient solution to the production system paradigm.

  • Differential A*: an adaptive search method illustrated with robot path planning for moving obstacles and goals, and an uncertain environment

    Page(s): 624 - 639

    Differential A* is presented: a method that builds on the A*/configuration-space approach and adapts quickly to changes in the space by determining and updating only the localized regions affected by those changes, rather than regenerating the entire space. This is particularly effective with moving obstacles or goals and in an uncertain environment, because only small parts of the space are affected at a time. The technique can provide significant speed improvements over complete space regeneration while producing the same results. The A* search algorithm and its relationship to the configuration-space method of path planning are presented, and the connection of A* to wave propagation in configuration space is described. The differential A* method is then outlined, with the focus on path planning, and examples of moving obstacles and goals and of planning in an uncertain environment are presented.

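The baseline that Differential A* builds on can be stated compactly. The sketch below is plain A* shortest-path search on a 4-connected occupancy grid with a Manhattan heuristic; it is a standard formulation for context, not the paper's differential variant, which would additionally re-expand only the region invalidated by a map change instead of searching from scratch.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid of 0 (free) / 1 (blocked) cells.
    Returns the length of a shortest path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible Manhattan heuristic
    open_heap = [(h(start), 0, start)]   # entries: (f = g + h, g, cell)
    best = {start: 0}                    # cheapest known cost-to-come per cell
    while open_heap:
        f, g, (r, c) = heapq.heappop(open_heap)
        if (r, c) == goal:
            return g
        if g > best.get((r, c), float("inf")):
            continue                     # stale heap entry, skip
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

# Toy map: the blocked middle row forces a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path_len = astar(grid, (0, 0), (2, 0))
```

Under the differential scheme described in the abstract, moving one obstacle would invalidate only the `best` entries downstream of the changed cells rather than the whole table.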