
Proceedings of the 1991 Symposium on Applied Computing

Date: 3-5 April 1991


Showing papers 1-25 of 71
  • 1991 Symposium on Applied Computing (Cat. No.91TH0355-8)

    PDF (30 KB) | Freely Available from IEEE
  • A methodology for evaluating the performance of RISC processors

    Page(s): 213 - 222
    PDF (548 KB)

    A methodology is devised to simulate and evaluate the performance of RISC processors. The detailed model provides a way to evaluate processor performance for various applications and under a wide spectrum of operating environments and conditions. The flow of each instruction is modeled: an instruction requests the various units and pipeline stages, which are modeled as resources. The model is applied to the Motorola MC88100 RISC processor, which includes four processing units and numerous pipelines to speed up the execution of instructions. From the model, the processing speed is investigated for various applications and the contention on some units is studied. Moreover, the utilization of units and the mean waiting time are studied and discussed.

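    As a rough illustration of the modeling idea, the sketch below treats execution units as contended resources; the unit names, latencies, and instruction mix are hypothetical, not the paper's MC88100 model.

    ```python
    # Toy resource-contention model: one instruction issues per cycle and
    # competes for its execution unit (hypothetical units and latencies).
    import random

    UNIT_LATENCY = {"alu": 1, "mem": 2, "fp": 3}

    def simulate(n_instructions, fp_frac=0.2, mem_frac=0.3, seed=0):
        rng = random.Random(seed)
        free_at = {u: 0 for u in UNIT_LATENCY}   # cycle when each unit frees up
        busy = {u: 0 for u in UNIT_LATENCY}      # total busy cycles per unit
        total_wait = finish = 0
        for issue in range(n_instructions):      # one instruction per cycle
            r = rng.random()
            unit = "fp" if r < fp_frac else "mem" if r < fp_frac + mem_frac else "alu"
            start = max(issue, free_at[unit])    # queue if the unit is occupied
            total_wait += start - issue
            free_at[unit] = start + UNIT_LATENCY[unit]
            busy[unit] += UNIT_LATENCY[unit]
            finish = max(finish, free_at[unit])
        utilization = {u: busy[u] / finish for u in busy}
        return total_wait / n_instructions, utilization

    mean_wait, utilization = simulate(10_000)
    print(f"mean wait: {mean_wait:.2f} cycles", utilization)
    ```
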
  • Migration to a distributed architecture for the Space Shuttle flight planning system

    Page(s): 226 - 235
    PDF (880 KB)

    A system modeling approach was applied to the migration of Space Shuttle flight planning applications from a centralized computer to a distributed architecture. Four steps were performed in this approach: characterization of the current flight planning system's workloads, data, and components; modeling of workload and usage on the current system; selection of a distributed architecture as a candidate for migration; and modeling of projected workload and data resource utilization on the selected candidate architecture. Applying traditional system modeling steps to migration to a distributed system was instrumental in accurately predicting required system resources and significantly reducing the risk of migration. Further, improvements in the distributed architecture were developed which would not have been possible without the knowledge gained during the system modeling process.

  • Software reuse and information theory based metrics

    Page(s): 437 - 446
    PDF (832 KB)

    The main purpose of the research presented is to investigate theoretically the effect of reusing software on metrics that are based on the entropy function of communication information theory. R.N. Chanon's (1973) entropy loading and E.T. Chen's (1978) control structure entropy were applied to C and Ada programs obtained from the open literature. Four units of decomposition (statement, component, module, and program), following Chanon's definition of an object, were introduced to classify software reuse units. Three versions of each of the three programs were considered (optimum reuse, intermediate reuse, and no reuse). The lines-of-code metric was used to quantify the amount of nonreusable code in each version of the programs. Pearson product-moment correlations were computed between the information-theory-based metrics and the lines-of-code metric. The results of this study show that there are significant correlations between the information-theory-based metrics and software reusability.

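    To make the flavor of such measurements concrete, the sketch below pairs a simple token-entropy measure with a Pearson correlation; it is illustrative only, not Chanon's entropy loading or Chen's control structure entropy, and the version data is made up.

    ```python
    # Shannon entropy of a token distribution, correlated against LOC
    # (illustrative stand-in for the paper's information-theory metrics).
    import math
    from collections import Counter

    def token_entropy(source: str) -> float:
        tokens = source.split()
        n = len(tokens)
        return -sum((c / n) * math.log2(c / n) for c in Counter(tokens).values())

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical versions: optimum reuse, intermediate reuse, no reuse.
    sources = ["a b a b a b", "a b c a b d", "p q r s t u"]
    nonreusable_loc = [80, 210, 340]          # made-up line counts
    print(pearson([token_entropy(s) for s in sources], nonreusable_loc))
    ```
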
  • Using a truth maintenance system to support knowledge base construction and evolution

    Page(s): 176 - 179
    PDF (264 KB)

    Most expert systems require access to a knowledge base of domain facts. The authors raise issues concerning the granularity of such knowledge bases and the need to update them. In particular, they discuss the use of an inference engine and an associated truth maintenance system in constructing and maintaining such knowledge bases, and present an implementation of a prototype system that uses these techniques.

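    A minimal sketch of the justification-based truth maintenance idea, as a toy rather than the authors' prototype: retracting a premise automatically withdraws the facts that depended on it.

    ```python
    # Toy justification-based TMS: belief propagates from premises through
    # justifications and is recomputed after retraction.
    class TMS:
        def __init__(self):
            self.justifications = {}   # fact -> list of antecedent sets
            self.premises = set()

        def add_premise(self, fact):
            self.premises.add(fact)

        def justify(self, fact, antecedents):
            self.justifications.setdefault(fact, []).append(frozenset(antecedents))

        def believed(self):
            beliefs = set(self.premises)
            changed = True
            while changed:
                changed = False
                for fact, justs in self.justifications.items():
                    if fact not in beliefs and any(j <= beliefs for j in justs):
                        beliefs.add(fact)
                        changed = True
            return beliefs

    tms = TMS()
    tms.add_premise("bird(tweety)")
    tms.justify("flies(tweety)", ["bird(tweety)"])
    print(tms.believed())                 # beliefs before retraction
    tms.premises.discard("bird(tweety)")
    print(tms.believed())                 # flies(tweety) withdrawn automatically
    ```
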
  • On the problem of finding all maximum weight independent sets in interval and circular-arc graphs

    Page(s): 465 - 470
    PDF (384 KB)

    J.Y.-T. Leung (J. Algorithms, vol. 5, 1984) presented algorithms for generating all the maximal independent sets in interval graphs and circular-arc graphs. The algorithms take O(n²+β) steps, where β is the sum of the numbers of nodes in all maximal independent sets. The authors use a new technique to give fast and efficient algorithms for finding all the maximum weight independent sets in interval graphs and circular-arc graphs. The algorithms take O(max(n², β)) steps in O(n²) space, where β is the sum of the numbers of nodes in all maximum weight independent sets. The algorithms can be applied directly to find a maximum weight independent set in these graphs in O(n²) steps. Thus, the result is an improvement over the best known result of O(n² log n) for finding the maximum weight independent set in circular-arc graphs.

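    For context on the underlying problem, the classic dynamic program below finds one maximum weight independent set of intervals; it is not the authors' algorithm for enumerating all such sets.

    ```python
    # Weighted-interval DP: independence = pairwise disjointness.
    import bisect

    def max_weight_independent_set(intervals):
        """intervals: list of (start, end, weight), half-open [start, end)."""
        intervals = sorted(intervals, key=lambda iv: iv[1])  # by right endpoint
        ends = [iv[1] for iv in intervals]
        n = len(intervals)
        best = [0] * (n + 1)        # best[i] = optimum over first i intervals
        take = [False] * (n + 1)
        for i, (s, e, w) in enumerate(intervals, 1):
            p = bisect.bisect_right(ends, s, 0, i - 1)  # intervals ending <= s
            if best[p] + w > best[i - 1]:
                best[i], take[i] = best[p] + w, True
            else:
                best[i] = best[i - 1]
        chosen, i = [], n           # walk back through the choices
        while i > 0:
            if take[i]:
                s, e, w = intervals[i - 1]
                chosen.append((s, e, w))
                i = bisect.bisect_right(ends, s, 0, i - 1)
            else:
                i -= 1
        return best[n], chosen

    print(max_weight_independent_set([(0, 3, 4), (2, 5, 2), (4, 7, 4), (6, 9, 7)]))
    ```
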
  • Modularization in an object-oriented knowledge base system

    Page(s): 167 - 174
    PDF (608 KB)

    Modularization of a knowledge base system has the following goals: improving system response time, providing a more secure system, and adding the capability to load or unload individual modules. However, the object-oriented characteristics of a knowledge base system make modularization a complex task, due in large part to the many interrelationships among the objects. The author suggests a modularization strategy for the knowledge representation language Telos, based on the structure of the language and with consideration for the security framework of the Group Security model.

  • A public-key based dynamic password scheme

    Page(s): 430 - 435
    PDF (380 KB)

    A dynamic password authentication scheme based on the public-key concept is proposed. The login password is changed dynamically, and users can use this scheme within a remote login environment. Since the public-key concept is employed to bind each user's password to that user's identification, the system need not store an encrypted password file. This approach greatly reduces the risk of the password being cracked through attacks on the encrypted password file. In addition, the amount of information that must be stored on the host system is reduced.

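    A toy sketch of the general idea of binding a password to an identity through a public value, so the host keeps no encrypted password file; this is not the paper's scheme, and the parameters are far too small to be secure.

    ```python
    # Toy public-key binding of a password to an identity (insecure sizes;
    # illustrative only -- not the scheme proposed in the paper).
    import hashlib

    P = 2**127 - 1          # toy prime modulus (far too small for real use)
    G = 3                   # toy generator

    def secret_from(password: str, identity: str) -> int:
        digest = hashlib.sha256((identity + ":" + password).encode()).digest()
        return int.from_bytes(digest, "big") % P

    # Registration: the host stores only the public value bound to the identity.
    identity, password = "alice", "correct horse"
    x = secret_from(password, identity)
    public = pow(G, x, P)

    # Login with a dynamic per-session value: the response depends on a fresh
    # challenge, so the value sent over the wire changes every time.
    challenge = 918273645                           # would be random per login
    response = pow(G, x * challenge, P)             # user side, from the password
    assert response == pow(public, challenge, P)    # host side, from stored value
    print("login accepted; host never stored the password")
    ```
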
  • An extended memoryless inference control model: partial-table level suppression

    Page(s): 142 - 149
    PDF (508 KB)

    Memoryless inference controls are an important class of inference control methods for online statistical databases; other inference controls are usually too complex to use in online systems. Cell level controls have been shown to provide a low level of identification risk along with a relatively high level of release of nonsensitive statistics, but they are also too complex. Table level controls have a reasonable level of complexity. A higher level of control is desirable, however, to preclude the necessity of using other, less accurate control methods in conjunction with the table level controls. The authors present a method which allows the release of some statistics at a level below the table level of inference control, thus providing the release of a greater number of statistics with a comparable level of identification risk. The method provides user look-up tables to calculate the risk for a particular query.

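    As a minimal illustration of a memoryless, per-query release check, the sketch below applies query-set-size restriction; it is not the partial-table suppression method developed in the paper.

    ```python
    # Memoryless release check: decide per query, keeping no query history.
    def releasable(query_set_size, total_records, k=5):
        """Release a statistic only if its query set is neither too small nor
        too close to the whole table (both enable identification)."""
        return k <= query_set_size <= total_records - k

    print(releasable(3, 100))    # False: too few records, identification risk
    print(releasable(40, 100))   # True
    print(releasable(97, 100))   # False: the complement set is too small
    ```
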
  • A nonmonotonic theory of plan synthesis

    Page(s): 180 - 189
    PDF (796 KB)

    A theory of plan synthesis is proposed that reasons about actions to solve the frame problem. The authors also reason about the plan synthesis to detect possible or impossible orderings of the actions. The theory uses the frame axiom and the modal quantificational logic Z to propagate the facts from the current situation to the next situation. Only the explicit results of an action are provided; no delete list is needed. Facts are automatically added and deleted from one situation to the next by the nonmonotonic reasoning as the actions are performed. The plan synthesis algorithm is given and illustrated with an example.

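    A minimal sketch of the propagation idea: only an action's explicit results are stated, and every other fact persists into the next situation. This is a toy, not the paper's modal-logic formulation.

    ```python
    # Situation propagation without a delete list: facts persist unless an
    # explicit result contradicts them.
    def negate(fact):
        return fact[4:] if fact.startswith("not ") else "not " + fact

    def apply_action(situation, results):
        """Keep every old fact whose negation is not an explicit result."""
        kept = {f for f in situation if negate(f) not in results}
        return kept | set(results)

    s0 = {"at(robot, room1)", "door_open", "holding(nothing)"}
    s1 = apply_action(s0, {"at(robot, room2)", "not at(robot, room1)"})
    print(sorted(s1))   # door_open and holding(nothing) persist automatically
    ```
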
  • A fast `parsing' algorithm for conceptual clustering

    PDF (84 KB)

    The notion of conceptual clustering was introduced by Michalski in 1980, using concepts at a rather physical level such as (color=`red') and (height=`tall'). Shankar et al. explored the possibility of using a knowledge base in the clustering process; they used concepts at one or more levels above the physical level. The author suggests a method which uses only integers during the clustering process. The idea involves reorganizing the knowledge base (KB) used by Shankar et al. and presenting a clustering method that uses the KB so reorganized. The crux of the reorganization is associating every node in the cohesion forest (CF) with a unique integer. This association is made in a breadth-first-traversal fashion for all trees in the CF.

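    A minimal sketch of the reorganization step described: each forest node receives a unique integer in breadth-first order. The forest below is a made-up example, not an actual cohesion forest.

    ```python
    # Breadth-first numbering of all trees in a forest.
    from collections import deque

    def bfs_number(forest):
        """forest: list of root nodes; each node is (label, [children])."""
        numbering, counter = {}, 0
        queue = deque(forest)
        while queue:
            label, children = queue.popleft()
            numbering[label] = counter
            counter += 1
            queue.extend(children)
        return numbering

    tree1 = ("color", [("red", []), ("blue", [])])
    tree2 = ("height", [("tall", []), ("short", [])])
    print(bfs_number([tree1, tree2]))
    # {'color': 0, 'height': 1, 'red': 2, 'blue': 3, 'tall': 4, 'short': 5}
    ```
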
  • A project engineering tool to assist in the development and maintenance of project life cycles

    Page(s): 119 - 122
    PDF (392 KB)

    The paper discusses the existing problems in developing and maintaining project life cycles and proposes a tool as a possible solution. This tool is part of an integrated tool set, called Project Engineer, which has been designed to support the maintenance of software projects from a project management perspective. A PC platform was chosen, and the user interface is based on Microsoft Windows 3.0. One module is selected for discussion and demonstration of the concept: the Life Cycle Builder, whose purpose is to manipulate project life cycles and templates. Project Engineer uses the Multiple Document Interface, an IBM SAA/CUA standard provided by Microsoft Windows 3.0. It also uses a data engine based on objects, properties, roles, and relationships (OPRR). Each module in Project Engineer is internally independent of the other modules; coordination among the modules is provided by the OPRR project database engine.

  • A DIBOL/SQL database for a virological laboratory

    PDF (96 KB)

    The clinical diagnostic testing facilities at the University of Würzburg's Institute of Virology started in 1972. Currently the laboratories receive and process approximately 20,000 specimens per year. The operation is coordinated by one physician, and the laboratories can test each of 30 different types of human clinical specimens for approximately 90 different viruses using up to ten laboratory tests. The daily routine testing is performed by six to eight medical technical assistants. The current software generates all lab worksheets and patient histories, and prints the final lab report for the physician and/or hospital. The database stores patient data going back to 1972 online. Currently, the authors are implementing the concept in SQL with DIBOL.

  • On flow control mechanisms and their impacts upon statistical databases

    Page(s): 150 - 155
    PDF (420 KB)

    The effect of mixing two or more independently designed data security policies has not been well analyzed, and there has been concern that it may create unexpected violations. The authors show that the widely used *-property flow control policy in multilevel data security systems can contribute to the compromise of statistical databases. They further conclude that the addition of unclassified data into a statistical database alone may undermine inference control policies and mechanisms. The fundamental insights obtained from this study must be taken into account during data security policy analysis and formulation.

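    For reference, the *-property the authors analyze is the "no write down" rule of multilevel security; a minimal version of the check looks like this (illustrative only):

    ```python
    # *-property check: a subject may write only at or above its own level.
    LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

    def may_write(subject_level, object_level):
        return LEVELS[subject_level] <= LEVELS[object_level]  # no write down

    print(may_write("secret", "unclassified"))   # False: blocked by *-property
    print(may_write("confidential", "secret"))   # True: writing up is allowed
    ```
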
  • An object-oriented conceptual model for the representation of geographic information

    Page(s): 472 - 480
    PDF (660 KB)

    The DP community has shown great interest in geographic databases, and the drawbacks of traditional database systems when used for handling unstructured data have been widely pointed out in the literature. At present, research efforts are concentrated on investigating the usability of the object-oriented approach for developing a new generation of powerful geographic database systems. The complexity of geographic systems, in terms of the relationships to be modelled and the operations to be performed on data, is well known; therefore a conceptual analysis of the specific situation is a good strategy for handling the complexity of the implementation. The authors propose an object-oriented conceptual model tailored for organizing and representing basic map elements and their relationships, as well as operations of interest for the management of geographic data. The aim is to provide designers with a conceptual tool useful for organizing their knowledge about a geographic application in terms of basic concepts of the object-oriented paradigm, namely classes, instances, and methods.

  • Construction of a main memory database testbed

    Page(s): 2 - 11
    PDF (744 KB)

    The performance of the shadow-copy update method in a memory-resident database environment has previously been studied analytically and by simulation. However, not all of the several possible shadow memory architectures have been investigated thoroughly. The article describes a transaction processing testbed facility that has been constructed to examine the performance of database operations using actual shadow memory implementations.

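    A minimal sketch of the shadow-copy update idea being benchmarked: writes go to a shadow copy and commit with a single atomic swap. This is illustrative only, not the testbed's shadow memory architectures.

    ```python
    # Shadow-copy update for a memory-resident store: readers never see a
    # partial update because the commit is one reference swap.
    class ShadowStore:
        def __init__(self, data):
            self.current = dict(data)      # readers always use this snapshot

        def read(self, key):
            return self.current.get(key)

        def update(self, changes):
            shadow = dict(self.current)    # copy-on-update shadow
            shadow.update(changes)         # all writes go to the shadow
            self.current = shadow          # single atomic swap commits

    db = ShadowStore({"acct1": 100, "acct2": 50})
    db.update({"acct1": 75, "acct2": 75})  # transfer commits atomically
    print(db.read("acct1"), db.read("acct2"))
    ```
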
  • Decision support system for creativity management

    Page(s): 350 - 359
    PDF (584 KB)

    The expert system for creativity management is designed to provide guidance for managers in R&D laboratories who need to nurture creativity in their technical staff. It is based on the Ginn model of management of creativity, which breaks these management problems down into a series of dichotomies. The expert system uses a blackboard architecture to simplify the software engineering and enhance the user interface.

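    A minimal sketch of a blackboard control loop of the kind named in the abstract; the knowledge sources and advice below are hypothetical, not the Ginn model's actual dichotomies.

    ```python
    # Blackboard loop: independent knowledge sources read the shared
    # blackboard and post new entries until none can contribute further.
    def needs_autonomy(bb):
        if bb.get("complaint") == "micromanaged" and "advice" not in bb:
            bb["advice"] = "increase researcher autonomy"
            return True
        return False

    def needs_deadlines(bb):
        if bb.get("output") == "drifting" and "schedule" not in bb:
            bb["schedule"] = "introduce soft milestones"
            return True
        return False

    def run(blackboard, knowledge_sources):
        progress = True
        while progress:    # repeat until no source fires
            progress = any(ks(blackboard) for ks in knowledge_sources)
        return blackboard

    print(run({"complaint": "micromanaged", "output": "drifting"},
              [needs_autonomy, needs_deadlines]))
    ```
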
  • Detection of parallelism in a computer program by grammatical means

    PDF (84 KB)

    A context-sensitive grammar (CSG) is developed which selects the balanced tree most suitable for parallel execution, and a new parsing algorithm for this grammar is considered. The grammar detects parallelism in arithmetic expressions by considering boundaries between subexpressions, and it can be extended to include all kinds of associative operators. The algorithm is an extension of the LR parser that constructs a balanced binary parse tree for processing on a parallel machine.

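    A minimal sketch of the target structure: rebracketing an associative operator chain into a balanced binary tree whose halves can be evaluated in parallel. It conveys the goal only, not the paper's grammar or parser.

    ```python
    # Balanced tree over an associative operator chain:
    # a+b+c+d -> ((a+b)+(c+d)), depth log n instead of n.
    def balanced_tree(operands, op="+"):
        if len(operands) == 1:
            return operands[0]
        mid = len(operands) // 2
        left = balanced_tree(operands[:mid], op)
        right = balanced_tree(operands[mid:], op)
        return (op, left, right)   # left and right can run in parallel

    print(balanced_tree(list("abcdefgh")))
    ```
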
  • Practical spatial database access methods

    Page(s): 82 - 90
    PDF (516 KB)

    Although many solutions to the problem of representing spatial data have been proposed, most are not practical: they cannot be guaranteed to perform well with large and arbitrarily distributed data collections, and they may not be easily integrated with concurrency and recovery software already written for database systems. However, two proposed structures do have some guarantees, similar to those which have made the B+-tree so successful for one-dimensional data. These analytic guarantees include worst-case space utilization in data and index pages, minimal fan-out, and exact-match search time bounded by the height of the tree. The two methods are the holey brick tree and bit interleaving using a B+-tree. Both can also be integrated with concurrency and recovery systems in the same way that B+-trees are.

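    Of the two methods, bit interleaving is easy to sketch: interleaving the coordinate bits yields a one-dimensional z-order key that a standard B+-tree can index. The holey brick tree is not sketched here.

    ```python
    # Bit interleaving (z-ordering): map a 2-D point to one dimension so a
    # standard one-dimensional index such as a B+-tree can store it.
    def interleave(x, y, bits=16):
        z = 0
        for i in range(bits):
            z |= ((x >> i) & 1) << (2 * i)        # x bits -> even positions
            z |= ((y >> i) & 1) << (2 * i + 1)    # y bits -> odd positions
        return z

    points = [(3, 5), (3, 4), (2, 5), (10, 1)]
    for p in sorted(points, key=lambda p: interleave(*p)):
        print(p, bin(interleave(*p)))             # nearby points cluster
    ```
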
  • A branch oriented key management solution to dynamic access control in a hierarchy

    Page(s): 422 - 429
    PDF (384 KB)

    A branch oriented key management scheme whose security is based on the difficulty of solving discrete logarithms is proposed. It overcomes the key management problems which exist in the multilevel security environment. The authors' scheme is elegant, simple, general, and suitable for dynamic access control in a hierarchy.

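    A minimal sketch of hierarchical key derivation with a one-way hash, conveying only the access-control idea; the authors' discrete-logarithm construction is not reproduced here.

    ```python
    # Top-down key derivation: a node can compute its descendants' keys but
    # cannot invert the hash to recover an ancestor's key.
    import hashlib

    def child_key(parent_key: bytes, child_name: str) -> bytes:
        return hashlib.sha256(parent_key + child_name.encode()).digest()

    root = b"top-level-master-key"
    div_a = child_key(root, "division-a")
    team_a1 = child_key(div_a, "team-a1")
    # team-a1 holders cannot recover div_a or root from team_a1.
    print(team_a1.hex()[:16])
    ```
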
  • Recovery of shape and motion parameters for polyhedron-like objects

    Page(s): 448 - 456
    PDF (684 KB)

    A discussion is given of the problem of determining the shape of an object, and the parameters that describe a rigid motion of the object, from an analysis of two noisy, and perhaps incomplete, perspective images of the object: one image taken before and the other after the rigid motion. The authors present a practical approach that applies to polyhedron-like objects. They compare their approach with related work and discuss implementation results.

  • A high-performance, modular design paradigm for teletraffic simulation with CSIM

    PDF (84 KB)

    The simulator designed by the author provides a new design paradigm for simulating large-scale telecommunications systems. It is based on C and CSIM and features dynamic memory allocation, modular design, interruptibility, a user interface, verification, and display of run-time statistics. At present, it can be used to simulate CDR or traffic-matrix inputs; fixed hierarchical routing or spray routing; internal dynamic overflow controls for switch CPU congestion control; and dynamic cross-connection by features for both switches and trunks.

  • Using handheld computers to improve efficiency in oil industry field operations

    PDF (76 KB)

    Handheld computers are being used to improve efficiency in oil industry field operations. Field data is recorded electronically with the handheld unit rather than being written on a paper report. After the data is collected, it is uploaded into a PC database in the field office, where timely reports and data analyses are produced that support the decision-making process in the field. Handheld computers play a critical role in making these efficiency improvements possible.

  • A categorical entity-relationship model of databases

    Page(s): 156 - 166
    PDF (740 KB)

    The major objective of the author is to develop a highly abstract model for databases: the categorical entity-relationship (CER) model. This model is based on a unified view of the entity-relationship and relational database models. Central to the CER model is the concept of an entity-relationship (ER) object. The concept of a key defines the identity of an ER-object: an ER-object with an atomic key represents a database entity, and an ER-object with a composite key represents a database relationship. The concept of a schema represents the type of an ER-object. Operations are defined on ER-objects; their role is to define the structure of a database, to update a database, and to derive information from a database. This model provides a sufficiently rich basis for designing and implementing entity-relationship and relational databases as object-oriented systems.

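    A minimal sketch of the ER-object reading given in the abstract, where an atomic key marks an entity and a composite key marks a relationship; the class and field names are hypothetical.

    ```python
    # ER-objects: key arity distinguishes entities from relationships.
    class ERObject:
        def __init__(self, schema, key, attributes):
            self.schema, self.key, self.attributes = schema, key, attributes

        @property
        def kind(self):
            return "entity" if len(self.key) == 1 else "relationship"

    student = ERObject("Student", key=("s42",), attributes={"name": "Ada"})
    course = ERObject("Course", key=("cs101",), attributes={"title": "DB"})
    enrolled = ERObject("Enrolled", key=("s42", "cs101"), attributes={"term": "F91"})
    for obj in (student, course, enrolled):
        print(obj.schema, "->", obj.kind)
    ```
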
  • Processor scheduling and concurrency control in real-time main memory databases

    Page(s): 12 - 21
    PDF (716 KB)

    Databases are increasingly being used in real-time applications where the timeliness of a result is part of the correctness criterion, and time-constrained resource scheduling is one of the critical issues for real-time systems. The paper proposes new CPU scheduling and locking-based concurrency control algorithms for real-time database systems (RTDBSs). Furthermore, a real-time timestamp-based concurrency control algorithm is also presented, to compare its performance with the locking-based algorithms in the real-time environment.

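    For flavor, the sketch below runs earliest-deadline-first dispatch, one common real-time scheduling policy; the paper's proposed CPU scheduling and concurrency control algorithms are not reproduced here.

    ```python
    # Earliest-deadline-first dispatch of transactions on one CPU.
    import heapq

    def run_edf(transactions):
        """transactions: list of (deadline, name, cost); returns completions."""
        heap = list(transactions)
        heapq.heapify(heap)                   # always pick the nearest deadline
        clock, log = 0, []
        while heap:
            deadline, name, cost = heapq.heappop(heap)
            clock += cost
            log.append((name, clock, "met" if clock <= deadline else "MISSED"))
        return log

    for entry in run_edf([(9, "t1", 3), (5, "t2", 2), (14, "t3", 4)]):
        print(entry)
    ```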