
Proceedings of the Second Symposium on Assessment of Quality Software Development Tools, 1992

Date: 27-29 May 1992


Displaying Results 1 - 25 of 30
  • Proceedings of the Second Symposium on Assessment of Quality Software Development Tools (Cat. No.92TH0415-0)

    Publication Year: 1992
    Freely Available from IEEE
  • Development of a workbench for knowledge-based systems using the ECMA reference model for CASE frameworks

    Publication Year: 1992 , Page(s): 254 - 261

    The European Computer Manufacturers Association reference model for computer-assisted software engineering environment frameworks (ECMA RM) describes the services that a software engineering environment (SEE) should provide, as well as the relations between these services. Over the past year, the ECMA RM has become a de facto European standard reference in the field of SEEs. This paper constitutes one of the first hands-on experience reports on the use of the ECMA RM in the design of SEEs. The ECMA RM's salient features are discussed and evaluated on the pragmatic basis of their usability in the development of a software engineering workbench to support KBS development. Concrete proposals are then made for enhancing the usability of the RM.

  • A linear combination software reliability modeling tool with a graphically-oriented user interface

    Publication Year: 1992 , Page(s): 21 - 31

    Previously the authors have shown that forming linear combinations of model results tends to yield more accurate predictions of software reliability. Using linear combinations also simplifies the practitioner's task of deciding which model or models to apply to a particular development effort. Currently, no commercially available tools permit such combinations to be formed within the environment provided by the tool. Most software reliability modeling tools also do not take advantage of the high-resolution displays available today. Performing actions within the tool may be awkward, and the output of the tools may be understandable only to a specialist. The authors propose a software reliability modeling tool that allows users to formulate linear combination models, that can be operated by non-specialists, and that produces results in a form understandable by software developers and managers.

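    As a rough illustration of the combination idea only (not of the tool itself), the following Python sketch forms a weighted linear combination of the predictions of several hypothetical component models; the model predictions and weights are invented for the example.

        # Sketch: combining several reliability-model predictions linearly.
        # The component models, weights and predictions are hypothetical; a real
        # tool would first fit each model to observed failure data.

        def combine_predictions(predictions, weights):
            """Return the weighted linear combination of the model predictions."""
            if abs(sum(weights) - 1.0) > 1e-9:
                raise ValueError("weights should sum to 1")
            return sum(w * p for w, p in zip(weights, predictions))

        # Predicted number of remaining faults from three imaginary models.
        model_predictions = [12.4, 9.8, 15.1]
        model_weights = [0.5, 0.25, 0.25]

        print(combine_predictions(model_predictions, model_weights))  # about 12.4
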
  • Requirements engineering support technique (REQUEST): a market driven requirements management process

    Publication Year: 1992 , Page(s): 211 - 223
    Cited by:  Papers (5)

    Prior to consuming resources for product development, whether for a new product or enhancements to an existing product, the requirements of the product must be determined. This requirements process must be `market driven', allowing product organizations to capture the `Voice of Customer', and it must describe the requirements in understandable and measurable terms so that they can be analyzed to identify solutions. It must be a definable, repeatable, and predictable process. This paper is a synopsis of the requirements engineering support technique (REQUEST). REQUEST is a process for use by the planning and product development organizations of IBM Lines of Business (LOBs), which introduces technical and managerial discipline into the requirements process. It systematically transforms the many `Voices of Customers' through various stages into a set of plan candidates by means of analysis, validation, and prioritization. It tracks and relates original requirements to plan items and vice versa.

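    The two-way tracking between requirements and plan items mentioned at the end of the abstract can be pictured with a small data structure; the identifiers below are purely illustrative and are not taken from REQUEST.

        # Sketch: bidirectional traceability between requirements and plan items.
        # Identifiers are invented for illustration.
        from collections import defaultdict

        req_to_plan = defaultdict(set)
        plan_to_req = defaultdict(set)

        def link(requirement_id, plan_item_id):
            """Record that a plan item addresses a requirement (and vice versa)."""
            req_to_plan[requirement_id].add(plan_item_id)
            plan_to_req[plan_item_id].add(requirement_id)

        link("REQ-001", "PLAN-A")
        link("REQ-001", "PLAN-B")
        link("REQ-002", "PLAN-B")

        print(sorted(req_to_plan["REQ-001"]))  # ['PLAN-A', 'PLAN-B']
        print(sorted(plan_to_req["PLAN-B"]))   # ['REQ-001', 'REQ-002']
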
  • Assessment of support for program understanding

    Publication Year: 1992 , Page(s): 102 - 111
    Cited by:  Papers (1)

    Discusses tools for program understanding during the software maintenance phase. Program understanding is crucial to successful maintenance, but it is still poorly supported by analysis-oriented tools. In the light of cognitive studies of program understanding, the authors assess existing tools for program understanding and suggest an approach that facilitates the understanding of complex code during maintenance via the chunking process. During this process, programmers recognize the abstract function or meaning of groups of statements and then piece together these chunks to form even larger chunks until the entire code is understood and mapped out. Chunking support can be effective as part of a maintenance toolkit. It lets maintenance personnel control code abstraction and ask many semantic questions about chunks and their relationship to other parts of the code.

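    A chunk hierarchy of the kind described above might be represented roughly as follows; the Chunk class and the sample labels are hypothetical and are not part of the tools being assessed.

        # Sketch: nested "chunks" of statements, each labelled with its abstract meaning.
        from dataclasses import dataclass, field

        @dataclass
        class Chunk:
            label: str      # the recognized abstract function, e.g. "swap elements"
            lines: range    # source lines the chunk covers
            children: list = field(default_factory=list)  # smaller chunks it was built from

        # Programmers recognize small chunks first, then compose them into larger ones.
        swap = Chunk("swap adjacent elements", range(12, 18))
        scan = Chunk("scan array once", range(10, 20), [swap])
        sort = Chunk("bubble sort the array", range(8, 22), [scan])

        def describe(chunk, depth=0):
            print("  " * depth + f"{chunk.label} (lines {chunk.lines.start}-{chunk.lines.stop - 1})")
            for child in chunk.children:
                describe(child, depth + 1)

        describe(sort)
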
  • OpenProcess/6000: a total solution for process management

    Publication Year: 1992 , Page(s): 184 - 193

    This paper presents a new approach in the domain of process modeling and process enactment, proposed by IBM and CAP Gemini Sogeti (CGS) on the IBM RISC System/6000. The paper starts with a short history of CASE tools. The specific needs are then detailed, especially the need for process modeling and enactment, along with the requirements for efficient process modeling and enactment. The paper ends with a summary of the new approach, stressing its adaptability to the customer environment through a set of consulting services (OpenProcess/6000) tightly coupled with a flexible process tool from CGS (Process WEAVER).

  • Assessment of reverse engineering tools: A MECCA approach

    Publication Year: 1992 , Page(s): 120 - 126
    Cited by:  Papers (1)

    It is a general requirement in the software engineering field that quality and productivity should be taken into consideration. Software development tools can have a significant impact on assuring the quality of a software system and the productivity of the development process. In a rapidly evolving engineering field such as software engineering, it is therefore important to select appropriate development tools. This paper discusses the reverse engineering tool as a quality software development tool. An introduction to reverse engineering tools, their impact on the development environment and their functionality in general is given before a MECCA model for assessing specific tools is proposed. The main objective is to show the reader how a reverse engineering tool can be assessed in terms of quality and productivity.

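    A MECCA-style assessment weights criteria and aggregates the scores; the criteria, weights and scores below are invented to show the arithmetic and are not drawn from the paper.

        # Sketch: weighted aggregation of assessment scores, MECCA style.
        # Criteria, weights and scores are hypothetical.

        criteria = {
            # criterion: (weight, score on a 0-10 scale)
            "quality of recovered design": (0.4, 7),
            "analysis productivity gain":  (0.3, 8),
            "ease of integration":         (0.2, 6),
            "vendor support":              (0.1, 5),
        }

        total = sum(weight * score for weight, score in criteria.values())
        print(f"overall tool score: {total:.1f} / 10")  # 0.4*7 + 0.3*8 + 0.2*6 + 0.1*5 = 6.9
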
  • Overview of PCTE standardization

    Publication Year: 1992 , Page(s): 83 - 89

    This paper outlines the background to the standardisation of PCTE. It briefly covers its origins, current status and future plans, as well as giving an overview of currently announced commercial backing and related activities. PCTE arose out of one of the first CEC-sponsored ESPRIT collaborative research projects. ESPRIT Project 32 was formally titled `A Basis for a Portable Common Tool Environment'. The formal objectives of the PCTE project were to define the necessary interface specifications and to implement the basic utilities and a working prototype of a portable common tool environment to support tool development.

  • Graph visualization in software analysis

    Publication Year: 1992 , Page(s): 226 - 237
    Cited by:  Papers (3)

    Directed graphs are ubiquitous in most aspects of software analysis. Presented abstractly, as a list of edges, a graph does not manifest much of the important structural information that becomes obvious if the graph is displayed pictorially. This paper presents a technique for drawing directed graphs quickly and attractively. It also describes how a tool implementing this technique has been used, in conjunction with other programming and analysis tools, in various aspects of software engineering.

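    A common first step in drawing a directed graph pictorially is to assign each node to a layer so that edges point in one direction; the small edge list below is invented to illustrate that step and is not the paper's full algorithm.

        # Sketch: assign the nodes of an acyclic directed graph to layers
        # (longest path from a source), a common first step in layered drawing.
        # The edge list is an invented example.

        edges = [("main", "parse"), ("main", "analyze"),
                 ("parse", "lex"), ("analyze", "report")]

        nodes = {n for edge in edges for n in edge}
        preds = {n: [u for u, v in edges if v == n] for n in nodes}

        layer = {}
        def rank(node):
            """Layer = 0 for sources, else 1 + the deepest predecessor layer."""
            if node not in layer:
                layer[node] = 0 if not preds[node] else 1 + max(rank(p) for p in preds[node])
            return layer[node]

        for n in nodes:
            rank(n)
        print(sorted(layer.items(), key=lambda kv: (kv[1], kv[0])))
        # [('main', 0), ('analyze', 1), ('parse', 1), ('lex', 2), ('report', 2)]
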
  • Introducing IEEE-CS 1175: a standard for tool interconnections

    Publication Year: 1992 , Page(s): 66 - 72

    The IEEE Computer Society's (IEEE-CS) 1175 standard, Trial-Use Standard Reference Model for Computing System Tool Interconnections, exhibits a unique structure to assist the buyers, builders, testers, and users of professional computing tools. The standard not only provides a means to exchange software requirements and design information among different kinds of tools, it also contains guidance on the roles and usage of tools, both within organizations and within system architectures. Further, it assists tool users by providing terminology to help describe the characteristics of a tool-to-tool interchange that might be required. This paper summarizes the key elements of the IEEE-CS 1175 standard, which has been approved by the IEEE Computer Society for trial use, and describes how it can be applied to assist software engineering practitioners.

  • Integrated CASE: our experience with Teamwork and Rational

    Publication Year: 1992 , Page(s): 271 - 284

    Mapping existing software development processes to new development methods, and implementing these methods with new and emerging computer-aided software engineering (CASE) tools, is now beginning to fulfil the original CASE promises of minimizing the cost and improving the quality of software. In addition to these advantages, the computer assistance provided by the tools, coupled with the object-oriented approach advocated by current methods, promises both near-term and long-term productivity increases.

  • An intelligent approach to verification and testing of the configurator

    Publication Year: 1992 , Page(s): 151 - 162

    Configurator verification requires three basic processes: generating the test data, analyzing the actual and expected outputs, and fixing the deviations. The typical verification process for today's configurators is to manually provide the test cases and expected outputs, and then analyze the differences between the expected and actual test outputs. This manual approach not only constrains the testing and verification of the configurator but is also not feasible for large computer configurator programs such as the IBM ES/900. An intelligent approach to the verification and testing of large computer configurators is introduced in which test data are generated automatically and test results are analyzed intelligently with little human intervention. The approach utilizes a generic configurator model in which a priori computer configurator knowledge is captured and applied to generate a large number of potential test data and to analyze the test results automatically. With this intelligent approach, human experts' knowledge and skills can be better utilized to initialize the configurator model and to review the potential faults of the configurator program revealed by the testing results.

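    The automatic generation step can be pictured as enumerating candidate configurations from a component model and filtering them against known constraints; the components and the constraint below are invented and far simpler than a real configurator.

        # Sketch: generate candidate test configurations from a toy component model
        # and keep only those that satisfy the model's constraints.
        from itertools import product

        options = {
            "cpu":     ["small", "large"],
            "memory":  ["64MB", "256MB"],
            "storage": ["disk-A", "disk-B"],
        }

        def valid(config):
            """Toy constraint: a large CPU requires 256MB of memory."""
            return not (config["cpu"] == "large" and config["memory"] != "256MB")

        candidates = [dict(zip(options, combo)) for combo in product(*options.values())]
        test_data = [c for c in candidates if valid(c)]
        print(len(candidates), "generated,", len(test_data), "kept")  # 8 generated, 6 kept
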
  • Experience in using three testing tools for research and education in software engineering

    Publication Year: 1992 , Page(s): 128 - 143
    Cited by:  Patents (1)

    It is a common belief that good software tools are necessary to support both research and education in software engineering. The authors document their experience, in support of this belief, with two data flow testing tools named ASSET and ATAC and one mutation testing tool named MOTHRA. These tools have been in use at Purdue University in research projects related to software testing and reliability. The tools have also been used in both undergraduate and graduate courses in software engineering.

  • SEMST-a support environment for the management of software testing

    Publication Year: 1992 , Page(s): 11 - 20

    This paper presents a newly developed environment for supporting the management of software testing. The environment has been built on top of UNIX and RCS to maintain all versions of the specifications, test cases and programs, as well as to manage the relationships among these components. It is a practical model of applying software configuration management methods to the testing process.

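    The relationship management mentioned in the abstract can be sketched as a mapping from test-case revisions to the specification and program revisions they exercise; the file names and RCS-style revision numbers below are invented.

        # Sketch: tracking which specification and program revisions a test case covers.
        # Names and revision numbers are invented; a real system would take them from RCS.

        links = []  # each entry relates one test-case revision to the artifacts it covers

        def relate(test_case, spec, program):
            links.append({"test": test_case, "spec": spec, "program": program})

        relate(("login.test", "1.3"), ("login.spec", "1.2"), ("login.c", "1.7"))
        relate(("logout.test", "1.1"), ("login.spec", "1.2"), ("logout.c", "1.2"))

        # Which test-case revisions are affected when login.spec changes?
        affected = [l["test"] for l in links if l["spec"][0] == "login.spec"]
        print(affected)  # [('login.test', '1.3'), ('logout.test', '1.1')]
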
  • SQEngineer: a methodology and tool for specifying and engineering software quality

    Publication Year: 1992 , Page(s): 194 - 210
    Cited by:  Patents (1)

    Quality control techniques such as reviews, walkthroughs, inspections, and testing have been both suggested and heavily used in practice to build quality software. However, all these techniques are effective only as defect detection rather than prevention techniques. A more effective approach to preventing defects from being introduced into the product is to engineer quality into software. Software must be developed to meet quality requirements that have been specified and agreed upon. This paper proposes a methodology and an automated tool for specifying quality requirements and engineering quality into software. The SQEngineer methodology and tool evolved over a two-year period during which they were used by software professionals in the graduate programs in software at the University of St. Thomas (USA) to develop quality plans for real-life applications.

  • Introducing ANSI-X3.138-1988: a standard for information resource dictionary system (IRDS)

    Publication Year: 1992 , Page(s): 90 - 99

    The American National Standards Institute (ANSI) released its standard for information resources dictionary systems on 19 October 1988. The standard was formulated by the ANSI Technical Committee for Information Resources and Dictionary (X3H4), a Technical Committee of the ANSI Accredited Standards Committee for Information Processing Systems (X3). The purpose of the standard is to define the requirements of a software tool used to describe, document, control, protect, and provide access to information about the information assets of an organization. Recently, two additional standards have been approved to supplement the basic standard. One, X3.185, addresses the interface requirements between an IRDS and external software. The other, X3.195, addresses export/import requirements for exchanging information between ANSI IRDSs. This paper summarizes the conceptual evolution of repositories, provides an overview of the IRDS standards and outlines future directions for the evolution of IRDS standards.

  • A data flow coverage testing tool for C

    Publication Year: 1992 , Page(s): 2 - 10
    Cited by:  Papers (34)  |  Patents (3)

    Describes ATAC (Automatic Test Analysis for C), a tool for data flow coverage testing of C programs. ATAC is being used as a research instrument at Purdue and Bellcore and as a software development tool at Bellcore. The authors discuss the design of ATAC, a preliminary view of its uses in development, and its research uses.

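    Data flow coverage of the kind ATAC measures is usually expressed over definition-use pairs; the sketch below computes the fraction of a hypothetical set of def-use pairs exercised by one test run, and is not ATAC's implementation.

        # Sketch: data flow (def-use) coverage as a ratio of exercised pairs.
        # The pairs and the trace are hypothetical.

        # (variable, line where it is defined, line where that definition is used)
        all_du_pairs = {
            ("x", 3, 7), ("x", 3, 9), ("y", 4, 7), ("y", 8, 10),
        }

        # def-use pairs actually exercised by one test execution
        covered = {("x", 3, 7), ("y", 4, 7), ("y", 8, 10)}

        coverage = len(covered & all_du_pairs) / len(all_du_pairs)
        print(f"def-use coverage: {coverage:.0%}")  # def-use coverage: 75%
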
  • Automated assessment of program and system quality

    Publication Year: 1992 , Page(s): 112 - 119

    Software quality issues play a dominant role in the development of large-scale software systems. High-quality software enhances reuse potential and facilitates software maintenance activities. However, at present, software quality assessment is a labor-intensive, error-prone, time-consuming process. Automated quality assessment of software systems is a cost-effective alternative to ensure compliance with system quality objectives. This paper discusses how reverse engineering technology can be leveraged to meet the needs of software quality assurance teams.

  • Introducing EIA-CDIF: the CASE Data Interchange Format Standard

    Publication Year: 1992 , Page(s): 74 - 82

    The Electronic Industries Association (EIA) released the CASE Data Interchange Format (CDIF) Interim Standards in July 1991. These standards were formulated by the EIA CDIF Technical Committee to facilitate movement of information between computer-aided software engineering (CASE) tools. The CDIF Interim Standards are a family of standards that provide the vendor-independent, method-independent definitions of CASE data concepts necessary for information exchange. This paper provides an overview of the CDIF Interim Standards, summarizes prototyping efforts in progress, and outlines future directions for CDIF.

  • Evaluating and selecting testing tools

    Publication Year: 1992 , Page(s): 55 - 64

    The authors argue that a data collection system that leads a company to successful tool selections must be carefully devised. They discuss the analysis of user needs and the establishment of tool selection criteria, suggest a number of standards, articles and surveys that will help in the search for tools, and provide a procedure for rating them.

  • Graph services for program understanding tools

    Publication Year: 1992 , Page(s): 238 - 251

    Many types of problems are more easily understood when represented with graphs. This paper presents an organization of graph theoretical functions for use by an application program. The paper covers a unified set of generalized routines and data structures for graph reductions, path analysis, and data flow problems. Some suggestions for user interfaces are provided. Examples of problems in which these functions might be used are program understanding, software testing, software design, scheduling, and network management.

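    A generalized path-analysis routine of the sort the paper organizes could look like the reachability helper below; the call-graph edges are invented for the example.

        # Sketch: a reusable reachability routine over an adjacency-list graph.
        # The call-graph edges are an invented example.
        from collections import deque

        def reachable(graph, start):
            """Return all nodes reachable from start (useful for program understanding,
            test planning, dead-code queries, etc.)."""
            seen, queue = {start}, deque([start])
            while queue:
                node = queue.popleft()
                for succ in graph.get(node, ()):
                    if succ not in seen:
                        seen.add(succ)
                        queue.append(succ)
            return seen

        call_graph = {"main": ["init", "run"], "run": ["step"], "init": [], "step": []}
        print(sorted(reachable(call_graph, "run")))  # ['run', 'step']
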
  • Architecture flow diagrams under Teamwork

    Publication Year: 1992 , Page(s): 262 - 270

    The Teamwork CASE tool allows data flow diagrams (DFDs) to be maintained for structured analysis. Fermilab has extended Teamwork under UNIX to permit Hatley and Pirbhai architecture flow diagrams (AFDs) to be associated with DFDs and subsequently maintained. This extension, called TWKAFD, allows a user to open an AFD, graphically edit it, and replace it into a TWKAFD-maintained library. Other aspects of Hatley and Pirbhai's methodology are supported. This paper presents a quick tutorial on architecture diagrams. It then describes the user's view of TWKAFD, the experience of incorporating it into Teamwork, and the successes of using the architecture diagram methodology along with the shortcomings of the Teamwork/TWKAFD tool.

  • A tool for software quality

    Publication Year: 1992 , Page(s): 144 - 150

    The authors present a theoretical study of a tool used to improve the quality of delivered software. Common approaches to software quality focus on the quantitative aspects of the code and neglect the qualitative sides of its use, though these characteristics are far from worthless for the end-user. Starting from the work of B.W. Boehm et al. (1978) and J.A. McCall et al. (1977) concerning factors, criteria and metrics, the authors have constructed a model that serves to bring together the user's needs and the programmer's possibilities. The model incorporates a quantitative assessment of both. The algorithms of the tool based on this model are also described.

  • A systematic approach to CASE selection

    Publication Year: 1992 , Page(s): 42 - 54

    Seven CASE tools were evaluated and compared by the author and a group of graduate students. From the results of the author's evaluations, a rather fundamental topic is presented as an example to highlight some of the problems that arise during CASE selection. The identified problems seem especially to affect potential CASE users who are as yet unfamiliar with current CASE technology. An approach is introduced and discussed that aims at a more systematic evaluation of tools without the need for CASE experts to perform it. Reuse of experience is suggested to reduce effort; the approach allows managers to express their preferences and constraints directly without being overly bothered by technological issues. Their individual decisions influence the process and the results of the evaluation. In introducing the approach step by step, the author shows how more and more of the hidden problems from the above example are mastered. This uncovers a number of crucial points which determine whether CASE selection will satisfy the potential user, i.e. whether it can be successful or whether it is bound to fail.

  • PISCES: a tool for predicting software testability

    Publication Year: 1992 , Page(s): 297 - 309
    Cited by:  Papers (2)

    Before a program can fail, a software fault must be executed, that execution must alter the data state, and the incorrect data state must propagate to a state that results directly in an incorrect output. This paper describes a tool called PISCES (developed by Reliable Software Technologies Corporation) for predicting the probability that faults in a particular program location will accomplish all three of these steps, causing program failure. PISCES is a tool that is used during software verification and validation to predict a program's testability.

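    The three conditions in the opening sentence (execution, infection, propagation) can be combined multiplicatively to estimate how likely a fault at a location is to cause a visible failure; the probabilities below are invented to show the arithmetic and are not output of PISCES. A small product suggests that faults at the location could hide from testing, i.e. low testability.

        # Sketch: a fault at a location causes failure only if it is executed,
        # the execution infects the data state, and the infection propagates to output.
        # The probability estimates are hypothetical.

        p_execution   = 0.30  # fraction of test inputs that reach the location
        p_infection   = 0.50  # fraction of those executions that corrupt the data state
        p_propagation = 0.20  # fraction of corrupted states that reach the output

        failure_probability = p_execution * p_infection * p_propagation
        print(f"estimated sensitivity of the location: {failure_probability:.3f}")  # 0.030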