
SCCC '00: Proceedings of the XX International Conference of the Chilean Computer Science Society, 2000

Date: 16-18 Nov. 2000


Displaying Results 1 - 25 of 28
  • Proceedings 20th International Conference of the Chilean Computer Science Society

    Publication Year: 2000
    PDF (346 KB)
    Freely Available from IEEE
  • Author index

    Publication Year: 2000 , Page(s): 215
    PDF (50 KB)
    Freely Available from IEEE
  • Comparative analysis of a parallel discrete-event simulator

    Publication Year: 2000 , Page(s): 172 - 177
    PDF (428 KB)

    Discrete-event simulation is a widely used technique for the study of systems which are too complex to be modeled with analytical methods. Parallelism is usually an effective tool for reducing the running times involved in the simulation of large-scale systems. However, the actual realization of an efficient parallel simulator is highly dependent on the particular features of the system being modeled. As a result, a number of alternative strategies for parallel simulation, the so-called synchronization protocols, have been developed. None of them is the most efficient one for all kinds of systems, or even for different instances of a given system. It is therefore relevant to provide the designer of a parallel simulator with information about the factors affecting the performance of known protocols. We present an analysis of such factors in the context of a comparison between an optimistic synchronization protocol and alternative approaches suitable for the bulk-synchronous parallel model of computing. It is well known that analytical treatment in this field is mathematically intractable due to the irregular nature of the workload. Rather than resorting to benchmarks whose results are largely influenced by programming details, however, we devised a strategy to obtain quantitative results from an implementation-independent and yet empirical framework.

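All the synchronization protocols compared in the abstract above coordinate access to a timestamp-ordered event queue. As a point of reference, a minimal sequential event loop (a hypothetical toy model in Python, not the authors' framework) looks like:

```python
import heapq

def simulate(events, handlers, horizon):
    """Minimal sequential discrete-event loop: a priority queue ordered by
    timestamp. Parallel simulators partition this queue across processes,
    which is where conservative vs. optimistic synchronization protocols
    come into play. Toy model for illustration only."""
    queue = list(events)                 # (time, name) pairs
    heapq.heapify(queue)
    log = []
    while queue:
        time, name = heapq.heappop(queue)
        if time > horizon:
            break
        log.append((time, name))
        # Each handled event may schedule follow-up events in the future.
        for delay, follow_up in handlers.get(name, []):
            heapq.heappush(queue, (time + delay, follow_up))
    return log

# A job arrives, is served 2 time units later, and departs 1 unit after that.
trace = simulate([(0, "arrive")],
                 {"arrive": [(2, "serve")], "serve": [(1, "depart")]},
                 horizon=10)
```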
  • Network programming internals of the Dillo Web browser

    Publication Year: 2000 , Page(s): 178 - 182
    Cited by:  Papers (1)
    PDF (344 KB)

    Network programs face several delay sources when sending or retrieving data. This is particularly problematic in programs which interact directly with the user, most notably Web browsers. We present a hybrid approach using threads and signal-driven I/O, which allows a non-blocking main thread and overlapping waiting times.

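The core idea of the abstract above, a main loop that never blocks on a single slow peer so that waiting times overlap, can be sketched in Python with the `selectors` module (Dillo itself is written in C with threads and signal-driven I/O; the socket pair here stands in for a browser/server connection):

```python
import selectors
import socket

# One selector multiplexes all pending transfers: the main loop polls for
# readiness instead of blocking on any individual connection.
sel = selectors.DefaultSelector()
a, b = socket.socketpair()          # stand-in for a browser/server pair
a.setblocking(False)
b.setblocking(False)
sel.register(a, selectors.EVENT_READ, data="page-fetch")

b.send(b"<html>...")                # the peer produces data at its own pace

received = {}
for key, _mask in sel.select(timeout=1.0):
    # Only sockets that are actually ready are read; nothing blocks.
    received[key.data] = key.fileobj.recv(4096)

sel.close(); a.close(); b.close()
```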
  • Using focus of attention in classification

    Publication Year: 2000 , Page(s): 109 - 116
    PDF (540 KB)

    The goal of a classifier is to accurately predict the value of a class given its context. Often the number of classes “competing” for each prediction is large, so it is necessary to “focus attention” on a smaller subset of them. We investigate the contribution of a “focus of attention” mechanism based on enablers to the performance of a word predictor. We then describe a large-scale experimental study in which the approach presented is shown to yield significant improvements in word prediction tasks.

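The mechanism described above can be illustrated with a toy bigram word predictor, where a "focus" set (a hypothetical stand-in for the paper's enablers) prunes the competing classes before the scoring step:

```python
from collections import Counter

# Toy corpus and bigram counts; purely illustrative data.
corpus = "the cat sat on the mat the cat ate".split()
bigrams = Counter(zip(corpus, corpus[1:]))

def predict(prev, focus=None):
    """Rank candidate next words after `prev`; an optional focus set
    narrows the candidates before the (potentially expensive) scoring."""
    candidates = {w for (p, w) in bigrams if p == prev}
    if focus is not None:
        candidates &= focus         # attend only to the enabled subset
    return max(candidates, key=lambda w: bigrams[(prev, w)], default=None)
```

Without a focus, `predict("the")` picks the globally most frequent continuation; with `focus={"mat", "on"}` the same context yields a different prediction because attention is restricted.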
  • A dynamic virtual fragmentation method for query recovery optimization

    Publication Year: 2000 , Page(s): 50 - 57
    PDF (464 KB)

    The time consumed during the execution of queries in a parallel and distributed environment is highly affected by the way in which the tables comprising a database have been fragmented. The classical fragmentation methods in a distributed database system help, to a great extent, to make information retrieval faster and less computationally expensive. This is particularly true for applications whose specifications are well known in advance at the time the tables that compose the database are created, and which in one form or another influenced the design, the definition of the type of fragmentation, and its distribution over different processing sites. Nevertheless, these characteristics cannot be exploited in applications where the distributed manager cannot infer at which sites the data with the characteristics the user is looking for are located. Under these conditions, the time and amount of work spent by the participants in solving the query can increase sharply. In this paper the dynamic virtual fragmentation method is presented, an alternative that allows us to reduce the response time of queries over horizontally fragmented tables. The proposed method has been implemented and validated on a parallel database server running on a shared-nothing supercomputer.

  • A temporal database management system implemented on top of a conventional database

    Publication Year: 2000 , Page(s): 58 - 67
    Cited by:  Patents (1)
    PDF (916 KB)

    Temporal data models have proven to be convenient for specifying applications, allowing the representation of the temporal evolution of data. Several temporal data models have been proposed over the last 20 years for this purpose, but no commercial implementation of a temporal database is yet available. This paper presents an integrated temporal database environment implemented on top of a conventional database. Using this environment, a user can handle the specification, the data definition and the queries as though the database implemented the temporal data model. The environment performs the mapping from the temporal conceptual schema to the corresponding database, and from queries expressed in the temporal query language to SQL. Data definition is controlled according to the state transition rules of the temporal data model, thus preserving the temporal integrity of the database. The underlying conventional database remains transparent to the user.

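The general idea of layering temporal semantics over a conventional database, as described in the abstract above, can be sketched as follows: valid-time state lives in interval columns of an ordinary table, and a temporal query ("value at time t") is rewritten to plain SQL. The schema and the rewriting are hypothetical illustrations, not the paper's actual translation scheme:

```python
import sqlite3

# Valid-time state table: each row carries its validity interval
# [vt_from, vt_to). The conventional engine (SQLite here) knows nothing
# about time; the temporal layer does the rewriting.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE salary (emp TEXT, amount INT, vt_from INT, vt_to INT)")
db.executemany("INSERT INTO salary VALUES (?, ?, ?, ?)",
               [("e1", 100, 0, 10),    # salary 100 valid on [0, 10)
                ("e1", 120, 10, 99)])  # salary 120 valid on [10, 99)

def salary_at(emp, t):
    """Temporal query 'salary of emp at time t', rewritten to plain SQL
    over the interval columns."""
    row = db.execute("SELECT amount FROM salary "
                     "WHERE emp = ? AND vt_from <= ? AND ? < vt_to",
                     (emp, t, t)).fetchone()
    return row[0] if row else None
```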
  • An integral software process formal model based on the SOCCA approach

    Publication Year: 2000 , Page(s): 162 - 171
    PDF (812 KB)

    In this article, a Capacity-Centered Integral Software Process Model (CCISPM) applicable to the construction of conventional systems (CS) and of knowledge-based systems (KBS) is described. The model is prescriptive, that is, it shows what is done to produce and maintain an automated solution to a real-world problem and who is doing it. It is formalized by applying the object-oriented modeling approach SOCCA (Specifications of Coordinated and Cooperative Activities). This article shows only the formalization of the design process, one of the several subprocesses involved in the CCISPM. The formal model obtained, which represents all three Ps (the processes, the products and the people), puts the data, process and behavior perspectives together into a representation understandable by software process users (engineers, managers, developers).

  • Haar-like wavelets defined over tetrahedrical grids

    Publication Year: 2000 , Page(s): 117 - 125
    PDF (472 KB)

    We systematically define Haar-like wavelets over a tetrahedral grid generated by a regular subdivision method. These wavelets form an unconditional basis for L^p(T,Σ,μ), 1<p<∞, where μ is the Lebesgue measure and Σ the σ-algebra of all tetrahedra generated from a tetrahedron T by the chosen subdivision method.

  • Distributed object engine construction

    Publication Year: 2000 , Page(s): 183 - 190
    PDF (552 KB)

    Although several tools for object-oriented distributed systems development exist, there are few, if any, solutions that homogeneously integrate all aspects of a distributed system. This article describes a toolkit that aims at supporting the construction and execution of distributed systems. The tools are defined according to an architecture that fully complies with object-oriented principles, providing software developers with a relatively simple, yet effective and efficient, platform that makes transparent the main difficulties normally found in distributed systems development. Among those difficulties is the notorious impedance mismatch between the programming language and the object manipulation language; this difficulty is not present in the toolkit described. Moreover, all the benefits of object orientation are deeply exploited. For this reason, the toolkit permits fast development of distributed systems.

  • Design of a system for image registration and compensation based on spectral analysis

    Publication Year: 2000 , Page(s): 191 - 198
    PDF (464 KB)

    Estimating the values of the parameters of affine transformations is crucial in dealing with the compensation problem in image acquisition and image registration. We investigate these problems using spectral, Fourier and complex analysis. We furthermore introduce some new techniques for obtaining spectral transformations and for computing the so-called cross-power spectrum of an image, allowing a fast and efficient solution of the problem of image registration under affine transformation.

  • Testing an implementation of a temporal logic language

    Publication Year: 2000 , Page(s): 68 - 73
    PDF (356 KB)

    Axiomatic presentations contain a great deal of information that can be used in testing an implementation. We describe how we tested an implementation of a temporal logic system, specifically an implementation of an extension (Cobo and Augusto, 1998) of the language Temporal Prolog (Gabbay, 1987). This testing approach generalizes to any system specified in a Hilbert style (i.e., described using a set of axioms and inference rules) and implemented as a Prolog program. Our approach allowed us to discover some errors in the program. The necessary background information on temporal logic and specification-based testing is included in order to make the exposition as self-contained as possible.

  • A paraconsistent system of autonomous agents for Brazilian bank check treatment

    Publication Year: 2000 , Page(s): 89 - 98
    PDF (684 KB)

    This work is part of the Multicheck Project, which defines an architecture of autonomous agents for the automatic processing of handwritten Brazilian bank checks. The competence of these agents is implemented in two layers. The first corresponds to pattern recognition algorithms applied directly to image segments. The second corresponds to reasoning mechanisms applied to the information from the first layer, either to validate or to interpret it. The interpretation process also involves information obtained from other agents, and this information can present inconsistencies. This problem is treated properly and naturally through the concepts and operators of paraconsistent logic. This paper focuses on the second layer, on task distribution problems and on communication between agents. The first-layer information was obtained from a simulated database.

  • A measurement-based approach for implanting SQA and SCM practices

    Publication Year: 2000 , Page(s): 126 - 134
    Cited by:  Papers (1)
    PDF (492 KB)

    In recent years an increasing number of software organizations have launched initiatives to improve their software process. Unfortunately, most of them have been unable to move beyond diagnosis and action planning and to turn those plans into real, practical actions. This paper focuses on two software process areas, software quality assurance (SQA) and software configuration management (SCM), and proposes a set of generic tools to assist in implanting specific practices for them. SQUID (Software QUality In the Development Process), a measurement-based methodology for specifying, monitoring and evaluating software product quality during development, is adapted and used to manage the quality of implanting specific tools, such as guides, checklists and templates, for SQA and SCM practices. We describe the results of a preliminary application, showing how the proposed adaptation helps in formalizing and normalizing the implantation process, setting tangible goals and evaluating the results more precisely.

  • Graph library design

    Publication Year: 2000 , Page(s): 144 - 151
    Cited by:  Papers (1)
    PDF (488 KB)

    We present an object-oriented design for graph libraries that implements dynamic typing of graphs. With this design we can specify pre- and post-conditions on graph algorithms, describe safe polymorphic algorithms on graphs and specify operations specific to particular types of graphs, while preserving performance and allowing extensibility.

  • Comprehensive specification of distributed systems using I5 and IOA

    Publication Year: 2000 , Page(s): 74 - 82
    PDF (660 KB)

    Low-level difficulties in the development of distributed systems, due to non-standard communication protocols and incompatible components or platforms, have largely been solved through the standardization and commoditization of protocols and platforms. Distributed systems are now designed at higher levels of sophistication, and an expressive yet usable specification language is a valuable tool. IOA is a formal language for specifying the semantics of distributed systems. I5 is a specification framework for the architectural definition of distributed systems, also intended as a basis for configuration management. I5 has five levels that mainly specify structural characteristics at different levels of abstraction, but it does not address the semantics or dynamics of distributed systems interactions. We explore the integration of IOA and I5 to create combined specifications that enjoy the benefits of both languages: the five levels of abstraction of I5, with their structural specification capabilities, are enhanced by a semantic specification written in IOA. We show an example of a specification developed using IOA and I5 in an integrated way, consider general approaches to such integrated specifications, and discuss the possibilities and limitations of integrating IOA and I5, as well as our future work towards complete integration.

  • The cooperative system for information retrieval

    Publication Year: 2000 , Page(s): 199 - 209
    PDF (732 KB)

    We describe our information retrieval system, which allows convivial access to databases. It is based on a multi-expert architecture using a database management system and a blackboard to control the progressive analysis of the user's sentence. We have integrated a numerical method based on fuzzy rules to deal with uncertainty; this method is used to optimize the analysis process and the cooperation between the different experts. The goal of our recent research, in addition, is to ease or eliminate the knowledge-acquisition bottleneck in expert system creation and to make a connectionist model behave as much as possible like an expert system. We describe our experience using neural networks to represent the knowledge bases of the different experts (i.e., the lexical entries expert, homographs expert, template expert, grammatical word expert and words expert).

  • Evolving a legacy data warehouse system to an object-oriented architecture

    Publication Year: 2000 , Page(s): 32 - 40
    Cited by:  Patents (2)
    PDF (648 KB)

    We introduce an object-oriented approach to transform the star schema of a legacy data warehouse system into a Dimensional Object Model (DOM) (Firestone, 1998), in order to take advantage of the flexibility of the object paradigm. We have applied the Object-Oriented Software Engineering (OOSE) process proposed by Jacobson et al. (1992) to describe the life cycle of a data warehouse and to clarify the context of our intervention. By means of OOSE we also define the target architecture and the components that will ensure the development of reuse-supporting data warehouse systems. In a sense, a change-tolerant architecture is proposed, as referred to by Jacobson (1998). The proposed solution applies the object paradigm to decompose the star schema into a three-layer architecture, separating the components into logical, interface and data management layers. This decomposition minimizes the dependence among the components and increases their reusability. The main contribution of this work is that, by transforming the star schema into a DOM, we prepare the system to evolve through any object-oriented data warehouse development methodology. In a context of many existing data warehouse systems based on relational models, this transformation is especially appealing.

  • A dynamic associative semantic model for natural language processing based on a spreading activation network

    Publication Year: 2000 , Page(s): 99 - 108
    PDF (768 KB)

    This paper presents a semantic model based on well-known psycholinguistic theories of human memory. It is centered on a spreading activation network, but it departs from classical models by representing associations between structured units instead of atomic nodes. Network units have an activity level that evolves according to their expected contextual relevance. Spreading activation accounts for the predictive top-down effect of knowledge and supports general heuristics which may be used as the first step of more elaborate methods. The model is suited to dealing with the interaction between semantic and episodic memories, as well as with many other practical issues in natural language processing, including the retroactive effect of semantics on perception and operation in open worlds.

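The spreading-activation mechanism at the core of the model above can be sketched in a few lines. This is the classical atomic-node version (the paper's units are structured, so this is only an illustration of the activation flow, with made-up weights):

```python
def spread(graph, activation, decay=0.5, steps=2):
    """Each step, a unit's activation flows to its associates along
    weighted links, scaled by a decay factor; activation already present
    is retained. Toy model of the predictive top-down effect."""
    for _ in range(steps):
        nxt = dict(activation)
        for src, links in graph.items():
            for dst, weight in links:
                nxt[dst] = nxt.get(dst, 0.0) + decay * weight * activation.get(src, 0.0)
        activation = nxt
    return activation

# Priming "bank" activates the strongly associated "money" more than the
# weakly associated "river", and reaches "pay" only indirectly.
net = {"bank": [("money", 1.0), ("river", 0.2)], "money": [("pay", 1.0)]}
act = spread(net, {"bank": 1.0})
```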
  • TAOS: a task-and-action oriented framework for user's task analysis in the context of human-computer interfaces design

    Publication Year: 2000 , Page(s): 24 - 31
    PDF (576 KB)

    This paper presents a conceptual model for user-centered human-computer interface design. It considers the analysis and description of the user's task to be analogous to knowledge acquisition and representation for knowledge-based systems. The paper presents TAOS, a task-and-action oriented framework used to define a modeling language of the domain (a KL-ONE-like language) and a methodology for building and validating the description of the user's task. It also demonstrates how TAOS is particularly well adapted to the description and analysis of the user's task in the context of human-computer interface design.

  • Applying the SCR method in software requirements specifications

    Publication Year: 2000 , Page(s): 135 - 143
    PDF (528 KB)

    This article focuses on the SCR (Software Cost Reduction) method for the requirements specification of real-time, process-control systems. It describes a case study on the application of the method to the requirements specification of a liquid mixture system. The main stages of the method's use are illustrated, and the advantages and difficulties identified are discussed.

  • Multiplicity and local search in evolutionary algorithms to build the Pareto front

    Publication Year: 2000 , Page(s): 7 - 13
    Cited by:  Papers (1)
    PDF (400 KB)

    In multicriteria optimization, determination of the Pareto-optimal front is of utmost importance for decision making. Simultaneous parallel search for multiple front members by an evolutionary algorithm can lead to effective optimization. In a previous approach (Esquivel et al., 1999), extending the ideas of an earlier work by Lis and Eiben (1997), we proposed the multi-sexual-parents-crossovers genetic algorithm (MSPC-GA), a method which, by allowing multiple parents per sex and multiple crossovers per mating action, attempts to balance the explorative and exploitative efforts present in any evolutionary algorithm. The method produced an evenly distributed and larger set of efficient points. Following this concept, the present proposal incorporates a hybridisation of global and local search into the multiplicity approach: the evolutionary approach combined with simulated annealing and neighbourhood search produces better results.

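The Pareto front that the abstract above builds is, by definition, the set of non-dominated points. A generic dominance check and front extraction (a standard helper for a minimization problem, not the MSPC-GA itself) can be written as:

```python
def dominates(a, b):
    """a dominates b (minimization): a is no worse than b in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the points no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Two objectives, both minimized: (3,3) and (4,4) are dominated by (2,2).
front = pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```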
  • Cooperative learning on autonomous agents acquiring common language for action and perception

    Publication Year: 2000 , Page(s): 83 - 88
    PDF (400 KB)

    In order to perform shared tasks, all the participants in a multi-agent system (MAS) must agree on common meanings for their perceptions and actions. Basic communication and cooperation must therefore be handled in a different way, mainly because the agents are autonomous components that sometimes make their own decisions. With this in mind, a model for language acquisition and learning based on an underlying MAS is proposed. It takes into account issues concerning lexical acquisition and emergence capabilities as products of agents interacting in an artificial society. Finally, some simulations in which these agents engage in a “talk” with others to agree on common meanings are discussed.

  • Capacities-centered integral software process formalization

    Publication Year: 2000 , Page(s): 152 - 161
    PDF (596 KB)

    We formalize an integral software process model, centered on capacities, which is applied to the construction of conventional systems (CS) and knowledge-based systems (KBS). The Capacities-Centered Integral Software Process Model (CCISPM) is formalized through an object-oriented approach. Aiming to automate the CCISPM, we begin the formalization of the dynamic aspects of the process. In this context, we present the dynamic modeling of the activities of the project initiation, planning and estimation process. The formal model obtained, which represents the three Ps (processes, products and people), favours direct understanding and communication among the process users (engineers, managers or developers) with respect to the aspects the model considers.

  • A genetic classifier tool

    Publication Year: 2000 , Page(s): 14 - 23
    Cited by:  Papers (2)  |  Patents (1)
    PDF (636 KB)

    Knowledge discovery is the most desirable end product of an enterprise information system. Research from different areas recognizes that a new generation of intelligent tools for automated data mining is needed to deal with large databases, and induction-based learning systems have emerged as a promising approach. This paper describes an induction-based classifier tool. The tool employs a genetic algorithm using the Michigan approach to find rules, is able to process discrete and continuous attributes, and is domain-independent. Implementation details are explained, including some optimizations, data structures and genetic operators; the optimizations include the use of phenotypic sharing (with linear complexity) to direct the search. Accuracy results are compared with those of 33 other algorithms on 32 datasets. The difference in accuracy is not statistically significant at the 10% level when compared with the best of the other 33 algorithms.

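In the Michigan approach mentioned above, each individual in the population is a single rule, and fitness sharing keeps the rules spread over different regions of the data. A linear-time sharing sketch (illustrative only; the rule names, raw fitness values and cover sets are made up, and this is not the paper's exact operator) looks like:

```python
def shared_fitness(raw, covered):
    """Divide each rule's raw fitness by a niche count: the average number
    of rules covering the same examples it covers. Crowded niches are
    penalized, pushing the population to spread over the data. One pass
    over the cover sets, hence linear in their total size."""
    counts = {}                              # example -> rules covering it
    for examples in covered.values():
        for e in examples:
            counts[e] = counts.get(e, 0) + 1
    shared = {}
    for rule, examples in covered.items():
        niche = sum(counts[e] for e in examples) / len(examples) if examples else 1.0
        shared[rule] = raw[rule] / niche
    return shared

# r1 and r2 crowd example "a"; r3 owns "c" alone, so it keeps its fitness.
fit = shared_fitness({"r1": 3.0, "r2": 2.0, "r3": 1.0},
                     {"r1": ["a", "b"], "r2": ["a"], "r3": ["c"]})
```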