Proceedings 12th IEEE International Conference on Tools with Artificial Intelligence. ICTAI 2000

15-15 Nov. 2000


Displaying Results 1 - 25 of 67
  • Proceedings 12th IEEE International Conference on Tools with Artificial Intelligence

    Publication Year: 2000, Page(s):iii - ix
    PDF (305 KB)
    Freely Available from IEEE
  • Multi-resolution on compressed sets of clauses

    Publication Year: 2000, Page(s):2 - 10
    Cited by:  Papers (12)
    PDF (841 KB)

    The paper presents a system based on new operators for handling sets of propositional clauses represented by means of ZBDDs (zero-suppressed binary decision diagrams). The high compression power of such data structures allows efficient encodings of structured instances. A specialized operator for the distribution of sets of clauses is introduced and used for performing multi-resolution on clause s...
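    ZBDD machinery aside, the distribution operator the abstract describes can be sketched over plain Python sets of integer literals (an illustrative toy encoding, not the paper's compressed representation): distributing two clause sets unions every pair of clauses, and resolving on a variable distributes its positive cofactor against its negative one.

```python
from itertools import product

def distribute(s1, s2):
    """Clause-set distribution: union every clause of s1 with every
    clause of s2, dropping tautologies (a literal plus its negation)."""
    out = set()
    for c1, c2 in product(s1, s2):
        c = c1 | c2
        if not any(-lit in c for lit in c):
            out.add(frozenset(c))
    return out

def resolve_on(clauses, var):
    """Multi-resolution on var: combine every clause containing var with
    every clause containing -var; keep clauses mentioning neither."""
    pos = {frozenset(c - {var}) for c in clauses if var in c}
    neg = {frozenset(c - {-var}) for c in clauses if -var in c}
    rest = {c for c in clauses if var not in c and -var not in c}
    return rest | distribute(pos, neg)

# Resolving {x1, x2} and {-x1, x3} on x1 leaves the resolvent {x2, x3}.
cnf = {frozenset({1, 2}), frozenset({-1, 3})}
resolvents = resolve_on(cnf, 1)
```

    The point of the ZBDD encoding in the paper is that `distribute` can be implemented directly on the compressed structure instead of enumerating pairs as this sketch does.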
  • An assumptive logic programming methodology for parsing

    Publication Year: 2000, Page(s):11 - 18
    PDF (591 KB)

    We show how several novel tools in logic programming for AI (namely, continuation based linear and timeless assumptions, and datalog grammars) can assist us in producing terse treatments of difficult language processing phenomena. As a proof of concept, we present a concise parser for datalog grammars (logic grammars where strings are represented with numbered word boundaries rather than as lists ...
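    For flavour, the datalog-grammar idea — strings represented by numbered word boundaries, so that grammar categories become facts over spans — can be sketched as a naive bottom-up fixpoint (an illustrative toy with an invented grammar, not the paper's assumption-based parser):

```python
def recognize(words, lexicon, rules, start="S"):
    """Recognize words using binary rules over numbered word boundaries:
    facts are (Cat, i, j) spans; a rule (Head, (A, B)) derives
    (Head, i, k) from (A, i, j) and (B, j, k) — a CYK-like fixpoint."""
    facts = {(lexicon[w], i, i + 1) for i, w in enumerate(words)}
    changed = True
    while changed:                       # naive fixpoint, fine for a toy
        changed = False
        for head, (a, b) in rules:
            for ca, i, j in list(facts):
                if ca != a:
                    continue
                for cb, j2, k in list(facts):
                    if cb == b and j2 == j and (head, i, k) not in facts:
                        facts.add((head, i, k))
                        changed = True
    return (start, 0, len(words)) in facts

lexicon = {"the": "Det", "cat": "N", "sleeps": "V"}
rules = [("NP", ("Det", "N")), ("S", ("NP", "V"))]
ok = recognize(["the", "cat", "sleeps"], lexicon, rules)   # True
```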
  • Belief revision and possibilistic logic for adaptive information filtering agents

    Publication Year: 2000, Page(s):19 - 26
    Cited by:  Papers (1)
    PDF (805 KB)

    Prototypes of adaptive information agents have been developed to alleviate the problem of information overload on the Internet. However, the explanatory power and the learning autonomy of these agents are weak. A logic based framework for the development of information agents is appealing since semantic relationships among information objects can be captured and reasoned about. This sheds light on...
  • A visualization tool for interactive learning of large decision trees

    Publication Year: 2000, Page(s):28 - 35
    Cited by:  Papers (8)
    PDF (999 KB)

    Decision tree induction is certainly among the most applicable learning techniques due to its power and simplicity. However, learning decision trees from large datasets, particularly in data mining, is quite different from learning from small or moderately sized datasets. When learning from large datasets, decision tree induction programs often produce very large trees. How to efficiently visualize...
  • Function approximation based multi-agent reinforcement learning

    Publication Year: 2000, Page(s):36 - 39
    PDF (356 KB)

    The paper presents two new multi-agent based domain independent coordination mechanisms for reinforcement learning. The first mechanism allows agents to learn coordination information from state transitions and the second one from the observed reward distribution. In this way, the latter mechanism tends to increase region-wide joint rewards. The selected experimental domain is Adversarial Food-Col...
  • Knowledge pruning in decision trees

    Publication Year: 2000, Page(s):40 - 43
    Cited by:  Papers (2)
    PDF (383 KB)

    We propose a novel pruning method of decision trees based on domain knowledge, semantic hierarchies among classes, which is used to generate decision trees by relaxing the levels of hierarchies for both height and width of the trees. We develop the algorithm, and its effectiveness is examined on the Car Evaluation and Nursery datasets from the UCI Machine Learning Repository. We can generate the decision trees cons...
  • Using latent semantic analysis to identify similarities in source code to support program understanding

    Publication Year: 2000, Page(s):46 - 53
    Cited by:  Papers (34)  |  Patents (1)
    PDF (812 KB)

    The paper describes the results of applying Latent Semantic Analysis (LSA), an advanced information retrieval method, to program source code and associated documentation. Latent semantic analysis is a corpus based statistical method for inducing and representing aspects of the meanings of words and passages (of natural language) reflected in their usage. This methodology is assessed for applicati...
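    The LSA pipeline named in the abstract is standard: build a term-document matrix, truncate its SVD, and compare documents in the reduced concept space. A minimal sketch (toy data and names are illustrative, not the paper's tooling):

```python
import numpy as np

def lsa_similarity(docs, k=2):
    """LSA sketch: term-document count matrix, rank-k truncated SVD,
    cosine similarity between documents in the reduced concept space."""
    vocab = sorted({w for d in docs for w in d.split()})
    A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    D = (np.diag(s[:k]) @ Vt[:k]).T                 # docs in concept space
    D /= np.linalg.norm(D, axis=1, keepdims=True)
    return D @ D.T                                  # pairwise cosines

# The two file-I/O snippets end up far more similar to each other than
# either is to the sorting snippet.
docs = ["open file read buffer", "read file close buffer", "sort list swap"]
sims = lsa_similarity(docs)
```

    Applied to source code, as in the paper, the "documents" would be program units (functions, files) and the "terms" identifiers and comment words.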
  • Modeling software quality: the Software Measurement Analysis and Reliability Toolkit

    Publication Year: 2000, Page(s):54 - 61
    Cited by:  Papers (4)
    PDF (1047 KB)

    The paper presents the Software Measurement Analysis and Reliability Toolkit (SMART) which is a research tool for software quality modeling using case based reasoning (CBR) and other modeling techniques. Modern software systems must have high reliability. Software quality models are tools for guiding reliability enhancement activities to high risk modules for maximum effectiveness and efficiency. ...
  • Proceedings 12th IEEE International Conference on Tools with Artificial Intelligence. ICTAI 2000

    Publication Year: 2000
    PDF (878 KB)
    Freely Available from IEEE
  • Principles for mining summaries using objective measures of interestingness

    Publication Year: 2000, Page(s):72 - 81
    Cited by:  Papers (3)
    PDF (914 KB)

    An important problem in the area of data mining is the development of effective measures of interestingness for ranking discovered knowledge. The authors propose five principles that any measure must satisfy to be considered useful for ranking the interestingness of summaries generated from databases. We investigate the problem within the context of summarizing a single dataset which can be genera...
  • From data mining to rule refining. A new tool for post data mining rule optimisation

    Publication Year: 2000, Page(s):82 - 85
    Cited by:  Patents (1)
    PDF (563 KB)

    The discovery of information from data and its presentation to the user have long been the primary goals of data mining. This paper describes a new software tool, the Rule Refiner, which focuses on post data mining operations, the optimisation of rules, the visualisation of rule characteristics and their validity within domain data. The system also provides facilities for the manipulation of this ...
  • What's new? Using prior models as a measure of novelty in knowledge discovery

    Publication Year: 2000, Page(s):86 - 89
    PDF (428 KB)

    One of the challenges of knowledge discovery is identifying patterns that are interesting, with novelty an important component of interestingness. Another important aspect of knowledge discovery is making efficient use of background knowledge. This paper develops a definition of novelty relative to a prior model of the domain. The definition of novelty is tested using pneumonia outcome data and a ...
  • Parallel mining of association rules with a Hopfield type neural network

    Publication Year: 2000, Page(s):90 - 93
    Cited by:  Papers (4)
    PDF (366 KB)

    Association rule mining (ARM) is one of the data mining problems receiving a great deal of attention in the database community. The main computation step in an ARM algorithm is frequent itemset discovery. In this paper, a frequent itemset discovery algorithm based on the Hopfield model is presented.
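    For context, the frequent-itemset discovery step the abstract identifies as the main computation can be sketched as a conventional level-wise (Apriori-style) search; the paper's contribution is mapping this computation onto a Hopfield-type network, which this sketch does not attempt.

```python
def frequent_itemsets(transactions, min_support):
    """Level-wise frequent-itemset discovery (plain Apriori-style
    baseline). Returns {itemset: support} for every itemset occurring
    in at least min_support transactions."""
    tx = [set(t) for t in transactions]
    frequent, k = {}, 1
    current = [frozenset([i]) for i in sorted({i for t in tx for i in t})]
    while current:
        counts = {c: sum(c <= t for t in tx) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # candidates of size k+1: unions of frequent k-itemsets
        current = list({a | b for a in level for b in level
                        if len(a | b) == k + 1})
        k += 1
    return frequent

tx = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
freq = frequent_itemsets(tx, min_support=2)
# Singletons have support 3 and every pair support 2; {a, b, c}
# appears only once and is filtered out.
```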
  • Implementing an action language using a SAT solver

    Publication Year: 2000, Page(s):96 - 103
    PDF (953 KB)

    In recent years, research on planning algorithms has made great progress. Recent approaches encode the plan search space into a data structure called the planning graph. To extract plans, a planning graph is transformed into the satisfiability problem (SAT), which is solved by a high-speed SAT solver. This kind of planning is called SAT planning. On the other hand, recent research on reasoning about...
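    The SAT-solving back end that SAT planning relies on can be illustrated with a minimal DPLL procedure (a textbook sketch, not the high-speed solver the abstract refers to; it omits unit propagation and other standard optimizations):

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL sketch — the style of solver SAT planning hands its
    planning-graph encoding to. A clause is a set of nonzero ints
    (positive = literal true, negative = negated)."""
    assignment = assignment or {}
    simplified = []
    for c in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in c):
            continue                      # clause already satisfied
        c = {l for l in c if abs(l) not in assignment}
        if not c:
            return None                   # empty clause: conflict
        simplified.append(c)
    if not simplified:
        return assignment                 # every clause satisfied
    var = abs(next(iter(simplified[0])))  # branch on an unassigned var
    for value in (True, False):
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x3): the only model sets
# x3 = False, hence x1 = False, hence x2 = True.
model = dpll([{1, 2}, {-1, 3}, {-3}])
```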
  • About the use of local consistency in solving CSPs

    Publication Year: 2000, Page(s):104 - 107
    Cited by:  Papers (1)
    PDF (385 KB)

    Local consistency is often a suitable paradigm for solving constraint satisfaction problems. We show how search algorithms could be improved, thanks to a smart use of two filtering techniques (path consistency and singleton arc consistency). We propose a possible way to get benefits from using a partial form of path consistency (PC) during the search. We show how local treatment based on singleton...
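    Arc consistency is the basic filtering step beneath the path-consistency and singleton-arc-consistency techniques discussed; the classic AC-3 algorithm sketches it (a generic sketch with illustrative names, not the paper's algorithms):

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3 arc-consistency filtering.
    domains: {var: set of values}; constraints: {(x, y): predicate} where
    predicate(vx, vy) holds iff the pair of values is allowed."""
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        ok = constraints[(x, y)]
        pruned = {vx for vx in domains[x]
                  if not any(ok(vx, vy) for vy in domains[y])}
        if pruned:
            domains[x] -= pruned
            if not domains[x]:
                return False              # domain wipe-out: no solution
            # re-examine arcs pointing at x
            queue.extend(arc for arc in constraints if arc[1] == x)
    return True

domains = {"x": {1, 2, 3}, "y": {1, 2, 3}, "z": {1, 2, 3}}
lt = lambda a, b: a < b
gt = lambda a, b: a > b
constraints = {("x", "y"): lt, ("y", "x"): gt,
               ("y", "z"): lt, ("z", "y"): gt}
consistent = ac3(domains, constraints)
# Enforcing x < y < z over {1, 2, 3} collapses every domain to a single
# value: x = {1}, y = {2}, z = {3}.
```

    Singleton arc consistency, one of the paper's two techniques, repeatedly assumes a single value for a variable and runs a filter like this one as a subroutine.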
  • Using heuristic-based optimizers to handle the personal computer configuration problems

    Publication Year: 2000, Page(s):108 - 111
    PDF (359 KB)

    Given the diversity of PC hardware components, and the limited compatibility between some of these hardware components, most people are interested in obtaining a (sub)optimal configuration for some specific usage restricted by their budget limits and other possible criteria. We first formulate the widely occurring configuration problems as discrete optimization problems. More interestingly, we prop...
  • DGPS/INS integration using neural network methodology

    Publication Year: 2000, Page(s):114 - 121
    Cited by:  Papers (3)
    PDF (463 KB)

    This paper presents an INS/DGPS land vehicle navigation system using a neural network methodology. The network setup is developed based on a mathematical model to avoid excessive training. The proposed method uses a KF-based backpropagation training rule, which achieves the optimal training criterion. The North and East travel distances are used as desired targets to train the two decoupled neural...
  • Efficient prediction of interconnect crosstalk using neural networks

    Publication Year: 2000, Page(s):122 - 125
    PDF (376 KB)

    Interconnect crosstalk prediction has become increasingly important with deep submicron downscaling of ICs and wafer scale integration. Existing tools for management of the EMI problem are computationally expensive and not very broad in application. The unique approach proposed involves the creation of parameterized models of primitive interconnect structures, wirecells, using modular artificial n...
  • Identifying causal structure in a biological neural network

    Publication Year: 2000, Page(s):126 - 129
    PDF (403 KB)

    A simulator for the Burgess, Recce, and O'Keefe (1994) BRO model of rodent navigation was built. Initial experiments did not reproduce the navigation performance reported by BRO. To determine the cause of the discrepancy, we reconstructed and reanalyzed their model. We were able to verify their main claim that if phase-modulated place cells act as radial basis functions, then they computationally ...
  • A synergistic model for interpreting human activities and events from video: a case study

    Publication Year: 2000, Page(s):132 - 139
    Cited by:  Papers (1)  |  Patents (1)
    PDF (754 KB)

    This paper describes a new approach for representing, recognizing and interpreting human activity from video. The approach presented (at the conceptual level) is a model based on the hierarchical synergy of three other models (the L-G graph, the SPN graph and a NN model). In particular, in our project human activity is strongly related to the ability of describing and interrelating events. Thus,...
  • A distributed multimedia knowledge based environment for modeling over the Internet

    Publication Year: 2000, Page(s):140 - 146
    PDF (663 KB)

    The paper describes a knowledge-based scalable multimedia environment for graph based modeling and the design of complex objects over the Internet. A complex object is modeled as a directed hierarchical graph with each sub-component abstracted as a node and the shared parameters between two components as an edge. The knowledge base archives and retrieves reusable components, and integrates multipl...
  • Texture image segmentation method based on multilayer CNN

    Publication Year: 2000, Page(s):147 - 150
    Cited by:  Papers (2)
    PDF (349 KB)

    The paper presents a new texture feature extraction method called simple texel scale feature (STSF) based on the scale and orientation information of texels, and a new texture image segmentation method based on binary image processing is introduced. The scale information of texels is extracted by comparing the gray value of two pixels. The relation of the positions of these two pixels shows the fr...
  • Strategies for optimizing image processing by genetic and evolutionary computation

    Publication Year: 2000, Page(s):151 - 154
    Cited by:  Papers (1)
    PDF (512 KB)

    We examine the results of previous attempts to apply genetic and evolutionary computation (GEC) to image processing. In many problems, the accuracy (quality) of solutions obtained by GEC-based methods is better than that obtained by others such as conventional methods, neural networks (NNs) and simulated annealing (SA). However, the computation time required is satisfactory in some problems, where...
  • Building efficient partial plans using Markov decision processes

    Publication Year: 2000, Page(s):156 - 163
    Cited by:  Papers (1)
    PDF (733 KB)

    Markov decision processes (MDP) have been widely used as a framework for planning under uncertainty. They make it possible to compute optimal sequences of actions to achieve a given goal, accounting for actuator uncertainties. But algorithms classically used to solve MDPs are intractable for problems requiring a large state space. Plans are computed considering the whole state space, without using a...
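    The classical full-state-space MDP solution whose cost motivates partial plans is value iteration. A minimal sketch on a toy two-state MDP (the example MDP and all names are illustrative):

```python
def value_iteration(states, actions, P, R, gamma=0.9, eps=1e-6):
    """Classical value iteration over the whole state space — the cost
    that becomes intractable on large problems. P[s][a] is a list of
    (probability, next_state) pairs; R[s][a] is the immediate reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

# Toy two-state MDP: state 1 pays reward 1 forever; state 0 can move there.
states = [0, 1]
actions = ["stay", "go"]
P = {0: {"stay": [(1.0, 0)], "go": [(1.0, 1)]},
     1: {"stay": [(1.0, 1)], "go": [(1.0, 1)]}}
R = {0: {"stay": 0.0, "go": 0.0}, 1: {"stay": 1.0, "go": 1.0}}
V = value_iteration(states, actions, P, R)   # V[1] → 10.0, V[0] → 9.0
```

    Every sweep touches every state, which is exactly the cost a partial-plan approach avoids by restricting attention to reachable or relevant states.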