Proceedings, Fourteenth Annual International Computer Software and Applications Conference

Oct. 31 - Nov. 2, 1990

Results 1 - 25 of 114
  • The software engineering of extensible database systems

    Publication Year: 1990

    Summary form only given. Extensible DBMS (database management system) technology is aimed at simplifying the customization of DBMSs for specialized applications. Customization may involve the introduction of new data types and operators, support for new data languages and data models, new storage structures, and new relational operators. It is pointed out that to achieve these ambitious goals, an ...

  • Software productivity metrics-new initiatives in making it work (panel)

    Publication Year: 1990, Page(s):253 - 254
    Cited by:  Papers (1)

    The software productivity activities in The IEEE Computer Society, the Software Productivity Consortium, and the Software Engineering Institute are addressed. Particular emphasis is placed on the metrics work of the Software Productivity Consortium, which includes the development of definitions of quantifiable aspects of the process and the products it generates and the establishment of means to o...

  • Proceedings. Fourteenth Annual International Computer Software and Applications Conference (Cat. No.90CH2923-1)

    Publication Year: 1990
  • A process modeling language for large process control systems

    Publication Year: 1990, Page(s):70 - 75

    A process modeling language (PML) has been developed which aids the analyst in modeling the processes of large process control systems. Because PML is based on a conceptual framework close to the customer's view of controlled processes, a PML model is an effective communication medium between the analyst and the customer. PML allows a process to be decomposed into activities and a composite activi...

  • Panel: the model and metrics for software quality evaluation report of the Japanese National Working Group

    Publication Year: 1990, Page(s):64 - 69

    The Japanese National Working Group has carried out research on developing a framework which clarifies the relation among internal and external characteristics, factors which affect software quality, and the effect of software quality. An attempt has also been made to develop metrics to measure them. The concept model and metrics that have been developed are presented with the future plan. Specifi...

  • An inversion capability for the PRESTIGE workbench: some basic issues

    Publication Year: 1990, Page(s):623 - 628
    Cited by:  Papers (2)

    The PRESTIGE workbench is an integrated CASE (computer-aided software engineering) environment intended to provide full implementation support for Jackson System Development (JSD). JSD is an operational software development method, and thus implementation in JSD is essentially a transformational process. The main objective is to offer a generalized transformational facility that the JSD implementor can apply ...

  • The analysis of infeasible concurrent paths of concurrent Ada programs

    Publication Year: 1990, Page(s):424 - 429
    Cited by:  Papers (1)

    In an execution of a concurrent Ada program, each task will traverse its own path so that the execution can be seen as involving a set of concurrent paths, referred to as a concurrent path (C-path). The path feasibility problem of concurrent program testing is to identify whether a given C-path is traversable in some execution. A static analysis technique is proposed to address this problem. The t...

  • DEMOM: a description-based media object data model

    Publication Year: 1990, Page(s):57 - 63
    Cited by:  Papers (2)  |  Patents (2)

    A description is given of the DEMOM media object data model, which aims at providing a uniform framework for managing different types of media data, i.e., images, text, sound, or graphics. According to DEMOM, media objects are defined as a class hierarchy of objects, i.e., images, text, sound, and graphics are subtypes of the general type media object. Representation-specific objects are regarded as ...

  • Performance analysis of the make and load building algorithms

    Publication Year: 1990, Page(s):20 - 25
    Cited by:  Papers (2)

    The performance of the make and load building algorithms is analyzed. The average time of compilation of an application is calculated by using the make algorithm. This time depends on the number of changed files, their compilation time, and the number of files affected by the change that have to be compiled. The load building algorithm chooses the files in an application that have to be compiled a...

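The make-time analysis in this abstract rests on a simple dependency rule: a file must be recompiled if it was changed or if it depends, directly or transitively, on a changed file. A minimal sketch of that rule (illustrative data structures, not the paper's algorithm):

```python
def files_to_recompile(deps, changed):
    """Illustrative make-style propagation: deps maps each file to the
    files it depends on; a file is recompiled if it was edited or if
    anything it depends on (transitively) was. Not the paper's exact
    algorithm."""
    dirty = set(changed)
    grew = True
    while grew:
        grew = False
        for f, prereqs in deps.items():
            if f not in dirty and dirty.intersection(prereqs):
                dirty.add(f)
                grew = True
    return dirty
```

The average compilation time studied in the paper is then the sum of the compile times of the files in this set, averaged over change scenarios.
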
  • Architecture and functionality of a specification environment for distributed software

    Publication Year: 1990, Page(s):617 - 622
    Cited by:  Papers (3)

    A description is given of Graspin, a workstation-based prototype environment that aids in the incremental construction, verification, and prototyping of specifications for concurrent and distributed software systems. It includes a Petri net-based specification formalism, an editor generator with graphical capabilities, and tools for static semantics checking, automated verification of static and d...

  • Commercial applications of knowledge based systems: initiatives in the electric power industry

    Publication Year: 1990, Page(s):420 - 423
    Cited by:  Papers (1)

    The session discusses the rationale which led to the development of KBTAC (Knowledge-Based Technology Applications Center) of the Electric Power Research Institute; the advantages that KBTAC brings to its user base; KBTAC data and knowledge base applications; and how these can be generalized to other applications. The initial focus of the Center will be on nuclear power plants. The approach of KBT...

  • An approach to introduce the reflection to C++

    Publication Year: 1990, Page(s):52 - 56
    Cited by:  Patents (1)

    The authors report on their work to introduce the reflective architecture in a popular compilation-based language, C++, without modifying the compiler. The reflective architecture provides a disciplined split between the object level and the meta-object level in a class-based form. The unit of causal connections is the class member function. Mechanisms based on the methods diversion are constructe...

  • The hyper-geometric distribution software reliability growth model (HGDM): precise formulation and applicability

    Publication Year: 1990, Page(s):13 - 19
    Cited by:  Papers (13)

    The hyper-geometric distribution is used to estimate the number of initial faults residual in software at the beginning of the test-and-debug phase. The hyper-geometric distribution growth model (HGD model) is well suited to making estimates for the observed growth curves of the accumulated number of detected faults. The advantage of the proposed model is the applicability to all kinds of observed...

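The growth curve such models estimate can be sketched with the expectation recurrence commonly used for hypergeometric reliability models; the parameter names below are assumptions for illustration, not the paper's notation:

```python
def expected_cumulative_faults(m, sensitivities):
    """Expectation recurrence in the style of hypergeometric growth
    models (illustrative; symbols are not taken from the paper):
    m is the assumed initial fault count, sensitivities[i] the number
    of faults 'sensed' by the i-th test instance. Each test detects,
    in expectation, w * (undetected / m) new faults."""
    c = 0.0
    curve = []
    for w in sensitivities:
        c += w * (m - c) / m
        curve.append(c)
    return curve
```

The curve is strictly increasing and saturates below m, which is the qualitative shape fitted to observed fault-detection data.
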
  • Industrial experience in automating software re-engineering

    Publication Year: 1990, Page(s):611 - 616

    The special problems of software re-engineering and technologies and approaches that specifically address these re-engineering problems are considered. Particular attention is given to approaches for analyzing and abstracting the functionality of old code, converting software to work with new languages or dialects, translating large systems, defining appropriate roles for the user of highly automa...

  • Extending software complexity metrics to concurrent programs

    Publication Year: 1990, Page(s):414 - 419
    Cited by:  Papers (3)

    A metric for concurrent software is proposed based on an abstract model (Petri nets) as an extension of T.J. McCabe's (1976) cyclomatic number. As such, its focus is on the complexity of control flow. This metric is applied to the assessment of Ada programs, and an automatic method for its direct computation based on the inspection of Ada code is provided. It is pointed out, however, that wider ex...

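The sequential baseline the metric extends is McCabe's cyclomatic number, V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components. A minimal sketch of that baseline (the paper's Petri-net extension is not reproduced here):

```python
def cyclomatic_number(edges, num_nodes, num_components=1):
    """McCabe's cyclomatic number V(G) = E - N + 2P for a
    control-flow graph given as an edge list."""
    return len(edges) - num_nodes + 2 * num_components

# A simple if/else: node 0 branches to 1 and 2, which rejoin at 3.
if_else_edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
```

For the if/else graph this gives 4 - 4 + 2 = 2, matching the two linearly independent paths.
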
  • An application of object-oriented design for communication control systems

    Publication Year: 1990, Page(s):43 - 51
    Cited by:  Papers (2)

    The authors present what is believed to be the first concrete object-oriented design (OOD) of a practical communication control system. First, all terms which are needed to describe target system behavior are unified by all designers, and listed. Next, objects are made from the term list, and operations found in target system behavior are added to the objects. The behavior of the target system is ...

  • Predictability measures for software reliability models

    Publication Year: 1990, Page(s):7 - 12
    Cited by:  Papers (14)

    A two-component predictability measure is presented that characterizes the long-term predictability of a software reliability growth model. The first component, average predictability, measures how well a model predicts throughout the testing phase. The second component, average bias, is a measure of the general tendency to overestimate or underestimate the number of faults. Data sets for both lar...

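One plausible reading of the second component, shown as a sketch (the paper's exact definition may differ): average bias as the mean signed relative error of the predicted fault counts, positive for overestimation and negative for underestimation.

```python
def average_bias(predicted, actual):
    """Illustrative 'average bias': mean signed relative error of a
    model's fault-count predictions over the test phase. Not
    necessarily the paper's exact formula."""
    return sum((p - a) / a for p, a in zip(predicted, actual)) / len(actual)
```

Offsetting over- and underestimates cancel here, which is exactly why such a measure captures general tendency rather than accuracy.
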
  • A semi-adaptive DCT compression method that uses minimal space

    Publication Year: 1990, Page(s):359 - 362

    Adaptive DCT (discrete cosine transform) compression methods outperform fixed DCT compression methods in terms of image quality, but they need a large amount of scratch space for the transformed image file. The author proposes a semi-adaptive DCT compression method that outperforms fixed DCT compression, but uses only a small amount of scratch space. This method was designed for use in an electron...

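The transform underlying all three variants is the DCT; a minimal unnormalized 1-D DCT-II sketch (the semi-adaptive coefficient handling that is the paper's contribution is not reproduced):

```python
import math

def dct_1d(x):
    """Unnormalized 1-D DCT-II: X_k = sum_i x_i * cos(pi/n * (i+0.5) * k).
    The transform underlying DCT image coders."""
    n = len(x)
    return [sum(x[i] * math.cos(math.pi / n * (i + 0.5) * k)
                for i in range(n))
            for k in range(n)]
```

A constant block concentrates all energy in the first (DC) coefficient, which is what makes the remaining coefficients cheap to quantize.
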
  • MCFS: a multiple criteria reasoning fuzzy expert systems building tool

    Publication Year: 1990, Page(s):605 - 610
    Cited by:  Patents (7)

    The authors present the design principles of MCFS, an expert system building tool based on the idea of combining multiple criteria reasoning with the concepts of fuzzy logic. An important feature of MCFS is its ability to handle multiple criteria reasoning by structuring the deduction process into a hierarchy of logical levels. Within each level, the rules are organized into sets of rules and rule...

  • A methodology for formal specification and implementation of Ada packages

    Publication Year: 1990, Page(s):491 - 496
    Cited by:  Papers (1)

    The authors present a methodology for formal specification and prototype implementation of Ada packages using the Anna specification language. Given the formal specification of a package resulting from the methodology for package specifications, the methodology allows implementors of packages to follow a few simple steps to implement the package. The implementation is meant to be a prototype. This...

  • A connectionist approach to multiple-view based 3-D object recognition

    Publication Year: 1990, Page(s):665 - 670

    The authors propose a hierarchical approach to solving the surface and the vertex correspondence problems in multiple-view based 3-D object recognition systems. The proposed scheme is a coarse-to-fine search process, and a Hopfield network is employed at each stage. Compared with the conventional object matching schemes, the proposed technique provides a more general and compact formulation of the...

  • Twin-page storage management for rapid transaction-undo recovery

    Publication Year: 1990, Page(s):295 - 300

    A twin-page disk-storage management scheme for rapid database transaction-undo recovery is presented and evaluated. In contrast to previous twin-page schemes, this approach uses static page mapping and allows dirty pages in the main memory to be written, at any instant, onto disk without the requirement of undo logging. No explicit undo is required when a transaction is aborted. Transaction undo i...

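The no-undo property can be illustrated with a toy twin-page store (a sketch under assumed semantics, not the paper's design): uncommitted writes land in the shadow twin, commit flips which twin is current, and abort simply abandons the shadow copies.

```python
class TwinPageStore:
    """Toy twin-page scheme: each logical page owns two slots; the
    'current' bit says which slot holds the committed version, so no
    undo log is needed. Illustrative only."""
    def __init__(self, num_pages):
        self.slots = [[None, None] for _ in range(num_pages)]
        self.current = [0] * num_pages   # which twin is committed
        self.written = set()             # pages dirtied by the open txn

    def write(self, page, data):
        # Uncommitted data always goes to the shadow twin.
        self.slots[page][1 - self.current[page]] = data
        self.written.add(page)

    def read(self, page):
        # Readers see only the committed twin.
        return self.slots[page][self.current[page]]

    def commit(self):
        for p in self.written:
            self.current[p] = 1 - self.current[p]
        self.written.clear()

    def abort(self):
        self.written.clear()  # shadow copies are simply abandoned
```

Because committed data is never overwritten in place, aborting costs nothing beyond forgetting which pages were dirtied.
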
  • The lines of code metric as a predictor of program faults: a critical analysis

    Publication Year: 1990, Page(s):408 - 413
    Cited by:  Papers (10)

    The relationship between measures of software complexity and programming errors is explored. Four distinct regression models were developed for an experimental set of data to create a predictive model from software complexity metrics to program errors. The lines of code metric, traditionally associated with programming errors in predictive models, was found to be less valuable as a criterion measu...

  • Path expression in data flow program testing

    Publication Year: 1990, Page(s):570 - 576
    Cited by:  Papers (3)

    The language of regular expressions is used for the identification of constructors of definition-use chains. Activation of the chains is essential for all data flow testing strategies. The algorithm is based on the node-elimination method of J.A. Brzozowski and E.J. McCluskey (1963). It generates a regular expression that represents the (possibly infinite) set of all constructors of the chain invo...

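The node-elimination step the algorithm builds on can be shown on a toy automaton (the data structures below are assumptions for illustration): eliminating a state q splices every path through q into a regular expression R(p,q) R(q,q)* R(q,r) between its neighbors.

```python
def eliminate(trans, states, start, final):
    """Brzozowski-McCluskey node elimination, sketched: trans maps
    (p, q) state pairs to regex strings; intermediate states are
    removed one at a time until only a start->final regex remains."""
    def get(p, q):
        return trans.get((p, q))
    for q in [s for s in states if s not in (start, final)]:
        loop = get(q, q)
        star = f"({loop})*" if loop else ""
        preds = [p for p in states if p != q and get(p, q)]
        succs = [r for r in states if r != q and get(q, r)]
        for p in preds:
            for r in succs:
                new = f"{get(p, q)}{star}{get(q, r)}"
                old = get(p, r)
                trans[(p, r)] = f"({old}|{new})" if old else new
        for k in [k for k in trans if q in k]:   # drop q's transitions
            del trans[k]
    return trans.get((start, final), "")
```

On an automaton with a self-loop b on the middle state of a path a, c, elimination yields a(b)*c, the regex for the infinite family of paths the abstract mentions.
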
  • The conceptual design of OSEA: an object-oriented semantic data model

    Publication Year: 1990, Page(s):221 - 230

    Semantics is integrated with an object-oriented data model to increase its power of expression, leading to the notions of subsumption of attributes, values, and relationships. In addition, the concept of inheritance is extended to include inheritance of relationships as well as attributes and operations. The specification of attributes, relationships and operations for the hierarchical framework i...