
Proceedings of the 16th International Conference on Software Engineering (ICSE-16)

Date: 16-21 May 1994


Displaying Results 1 - 25 of 45
  • Proceedings of 16th International Conference on Software Engineering

    Publication Year: 1994
  • Workshop on the intersection between databases and software engineering

    Publication Year: 1994 , Page(s): 355
    Cited by:  Papers (1)

  • A process for hitting paydirt

    Publication Year: 1994 , Page(s): 369

  • Understanding “why” in software process modelling, analysis, and design

    Publication Year: 1994 , Page(s): 159 - 168
    Cited by:  Papers (61)

    In trying to understand and redesign software processes, it is often necessary to have an understanding of the “whys” that underlie the “whats” - the motivations, intents, and rationales behind the activities and input-output flows. This paper presents a model which captures the intentional structure of a software process and its embedding organization, in terms of dependency relationships among actors. Actors depend on each other for goals to be achieved, tasks to be performed, and resources to be furnished. The model is embedded in the conceptual modelling language Telos. We outline some analytical tools to be developed for the model, and illustrate how the model can help in the systematic design of software processes. The examples used are adaptations of the ISPW-6/7 benchmark example.
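
    As a rough illustration (not the paper's Telos-based formalism), the sketch below encodes such actor dependencies as plain records, using hypothetical actor, goal, task, and resource names, and answers a simple "why does this actor act" query:

        # Minimal sketch of an actor-dependency structure: actors depend on one
        # another for goals to be achieved, tasks to be performed, or resources
        # to be furnished. All names are hypothetical.
        from collections import namedtuple

        Dependency = namedtuple("Dependency", "depender dependee kind item")

        deps = [
            Dependency("Customer", "ProjectManager", "goal", "ChangeApproved"),
            Dependency("ProjectManager", "Engineer", "task", "ImplementChange"),
            Dependency("Engineer", "QAGroup", "resource", "TestPlan"),
        ]

        def depends_on(actor):
            """What this actor relies on others for - the 'why' behind its activities."""
            return [(d.dependee, d.kind, d.item) for d in deps if d.depender == actor]

        print(depends_on("ProjectManager"))  # [('Engineer', 'task', 'ImplementChange')]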

  • An integrated method for effective behaviour analysis of distributed systems

    Publication Year: 1994 , Page(s): 309 - 320
    Cited by:  Papers (5)  |  Patents (1)

    Behavioural analysis is a valuable aid for the design and maintenance of well-behaved distributed systems. Dataflow and reachability analyses are two orthogonal, but complementary, behavioural analysis techniques. Individually, each of these techniques may be inadequate for the analysis of large-scale distributed systems. On the one hand, dataflow analysis algorithms, while tractable, may not be sufficiently accurate to provide meaningful detection of errors. On the other hand, reachability analysis, while providing exhaustive analysis, may be computationally too expensive for complex systems. In this paper, we present a method which integrates dataflow and reachability analysis techniques to provide a flexible and effective means for analysing distributed systems at the preliminary and final design stages respectively. We also describe some effective measures taken to improve the adequacy of the individual analysis techniques using the concepts of action dependency and context constraints. A prototype supporting the method has been built, and its performance is described in this paper. A realistic example of a distributed track control system is used as a case study.
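
    As a much-simplified sketch of the reachability side of such an analysis (not the paper's integrated method), the code below explores the composed state space of two hypothetical communicating processes, synchronising on shared actions and flagging states with no enabled action:

        from collections import deque

        # Hypothetical processes as {state: [(action, next_state), ...]}; a shared
        # action can only occur when both processes take it together.
        P = {"idle": [("req", "wait")], "wait": [("ack", "idle")]}
        Q = {"ready": [("req", "busy")], "busy": [("ack", "ready")]}

        def reachable(p0, q0):
            """Breadth-first enumeration of reachable global states; returns
            the reachable set and any deadlocked (no successor) states."""
            seen, frontier, deadlocks = {(p0, q0)}, deque([(p0, q0)]), []
            while frontier:
                p, q = frontier.popleft()
                succs = [(np, nq) for a, np in P[p] for b, nq in Q[q] if a == b]
                if not succs:
                    deadlocks.append((p, q))
                for s in succs:
                    if s not in seen:
                        seen.add(s)
                        frontier.append(s)
            return seen, deadlocks

        print(reachable("idle", "ready"))  # two reachable global states, no deadlocks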

  • What small businesses and small organizations say about the CMM

    Publication Year: 1994 , Page(s): 331 - 340
    Cited by:  Papers (17)

    The US Air Force sponsored research within the Department of Defense software development community to determine the applicability of the Software Engineering Institute's capability maturity model (CMM) for software to small businesses and small software organizations. The research found that small businesses are faced not only with a lack of resources and funds required to implement many of the practices stated in the CMM, but also with the task of basing their process improvement initiatives on practices that do not apply to a small business and small software organization. This paper discusses, from industry's perspective, why small businesses and organizations are experiencing difficulties implementing CMM-based process improvement programs and how they are tailoring their approach to the CMM to meet their quality goals.

  • Software process improvement experience in the DP/MIS function

    Publication Year: 1994 , Page(s): 323 - 329
    Cited by:  Papers (4)

    This experience report outlines Corning Inc.'s experience in successfully using software process assessment as part of their Information Service Division's software process improvement program. Improvement actions executed as indicated and prioritized by our self-assessment findings resulted in largely eliminating the cost and schedule overruns on projects in ISD's Systems Engineering Group. This paper describes the ISD process improvement initiative; a summary of our observations and the key lessons we learned concludes the paper.

  • The SMART approach for software process engineering

    Publication Year: 1994 , Page(s): 341 - 350
    Cited by:  Papers (5)

    Describes a methodology for software process engineering and an environment, SMART, that supports it. SMART supports a process life-cycle that includes the modeling, analysis, and execution of software processes. SMART's process monitoring capabilities can be used to provide feedback from the process execution to the process model. SMART represents the integration of three separately developed process mechanisms, and it uses two modeling formalisms (object-oriented data representation and imperative-style programming language) to bridge the gap between process modeling, analysis, and execution. SMART demonstrates the meta-environment concept, using a process modeling formalism as input specification to a generator that produces process-centered software engineering environments (PSEEs). Furthermore, SMART supports a team-oriented approach for process modeling, analysis, and execution.

  • Software reuse - facts and myths

    Publication Year: 1994 , Page(s): 267 - 268
    Cited by:  Papers (3)

    The concept of systematic software reuse is simple: the idea of building and using “software preferred parts.” By building systems out of carefully designed, pre-tested components, one will save the cost of designing, writing and testing new code. The practice of reuse has not proven to be this simple, however, and there are many misconceptions about how to implement and gain benefit from software reuse.

  • TESTTUBE: a system for selective regression testing

    Publication Year: 1994 , Page(s): 211 - 220
    Cited by:  Papers (40)

    The paper describes a system called TESTTUBE that combines static and dynamic analysis to perform selective retesting of software systems written in C. TESTTUBE first identifies which functions, types, variables and macros are covered by each test unit in a test suite. Each time the system under test is modified, TESTTUBE identifies which entities were changed to create the new version. Using the coverage and change information, TESTTUBE selects only those test units that cover the changed entities for testing the new version. We have applied TESTTUBE to selective retesting of two software systems, an I/O library and a source code analyzer. Additionally, we are adapting TESTTUBE for selective retesting of nondeterministic systems, where the main drawback is the unsuitability of dynamic analysis for identification of covered entities. Our experience with TESTTUBE has been quite encouraging, with an observed reduction of 50% or more in the number of test cases needed to test typical software changes.
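
    The selection step described above amounts to intersecting per-test coverage with the set of changed entities; a minimal sketch with hypothetical test and entity names (TESTTUBE itself derives this information for C programs by static and dynamic analysis):

        # Hypothetical coverage map: the functions, types, variables and macros
        # each test unit exercised when run against the previous version.
        coverage = {
            "t1": {"parse_args", "config_t", "MAX_PATH"},
            "t2": {"read_file", "buffer_t"},
            "t3": {"parse_args", "read_file"},
        }

        def select_tests(changed_entities):
            """Rerun only the test units whose coverage touches a changed entity."""
            return sorted(t for t, covered in coverage.items()
                          if covered & changed_entities)

        print(select_tests({"read_file"}))  # ['t2', 't3']
        print(select_tests({"config_t"}))   # ['t1']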

  • Software reuse experience at Hewlett-Packard

    Publication Year: 1994
    Cited by:  Papers (2)

    At Hewlett-Packard, we have had visible divisional software reuse efforts since the mid-1980s. In 1990, we initiated a multi-faceted corporate reuse program to gather information about reuse from within HP and from other companies. As we studied the existing reuse programs, we discovered that certain issues were poorly understood, and as a consequence, mistakes were made in starting and running certain programs at HP and elsewhere. Our corporate reuse program focused on packaging best-practice information and guidelines to avoid common pitfalls. We also developed technology transfer and educational processes to spread this information and enhance reuse practice within the company. In 1992, we launched a multi-disciplinary research program to investigate and develop better methods for domain-specific, reuse-based software engineering. We have learned that for large-scale reuse to work, the problems to be overcome are mostly non-technical.

  • Exoskeletal software

    Publication Year: 1994
    Cited by:  Papers (3)

    The author advocates the use of a separate and explicit structural language to describe software architectures. The structural nature makes it amenable to both textual and graphical description. Since it is a language, it can be used to support general descriptions and to provide the framework for checking interconnections. In addition, it can be used to generate and manage the system itself. This approach, initially under the guise of simple “module interconnection languages” (MIL) and subsequently as “configuration languages”, provides generalised support for a wide variety of component and interaction types. Generic (skeleton) architectures provide the means for reusing structures with different constituent components. Dynamic constructs support explicit extension while constraining the potential structures of the system to those expressed as valid. Further, change can be supported at the architectural level, either offline on the design or code, or dynamically on the system itself. System structure (architecture), separately and explicitly described, should be recognised as the unifying framework upon which to hang specification, design, construction and evolution of systems.

  • On the inference of configuration structures from source code

    Publication Year: 1994 , Page(s): 49 - 57
    Cited by:  Papers (32)

    We apply mathematical concept analysis to the problem of inferring configuration structures from existing source code. Concept analysis has been developed by German mathematicians over recent years; it can be seen as a discrete analogue to Fourier analysis. Based on this theory, our tool will accept source code, where configuration-specific statements are controlled by the preprocessor. The algorithm will compute a so-called concept lattice, which - when visually displayed - allows remarkable insight into the structure and properties of possible configurations. The lattice not only displays fine-grained dependencies between configuration threads, but also visualizes the overall quality of configuration structures according to software engineering principles. The paper presents a short introduction to concept analysis, as well as experimental results on various programs.
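
    A brute-force sketch of the concept-formation step, assuming a toy context that maps code fragments to the preprocessor symbols guarding them (the paper's tool extracts this from real source and builds the lattice from such closed pairs):

        from itertools import combinations

        # Toy formal context: code fragment -> preprocessor symbols it depends on.
        context = {
            "frag1": {"UNIX"},
            "frag2": {"UNIX", "DEBUG"},
            "frag3": {"WIN32"},
            "frag4": {"WIN32", "DEBUG"},
        }

        def common_attrs(objs):
            """Symbols shared by every fragment in objs (all symbols if objs is empty)."""
            attrs = set.union(*context.values())
            for o in objs:
                attrs &= context[o]
            return attrs

        def objs_with(attrs):
            """Fragments that carry all of the given symbols."""
            return {o for o, a in context.items() if attrs <= a}

        # A concept is a pair (extent, intent) closed under the two derivations.
        concepts = set()
        for r in range(len(context) + 1):
            for objs in combinations(context, r):
                intent = common_attrs(set(objs))
                extent = objs_with(intent)
                concepts.add((frozenset(extent), frozenset(intent)))

        for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(extent), sorted(intent))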

  • On formal requirements modeling languages: RML revisited

    Publication Year: 1994 , Page(s): 135 - 147
    Cited by:  Papers (27)

    Research issues related to requirements modeling are introduced and discussed through a review of the requirements modeling language RML, its peers and its successors from the time it was first proposed at the Sixth International Conference on Software Engineering (ICSE-6) to the present - ten ICSEs later. We note that the central theme of “Capturing More World Knowledge” in the original RML proposal is becoming increasingly important in requirements engineering. The paper highlights key ideas and research issues that have driven RML and its peers, evaluates them retrospectively in the context of experience and more recent developments, and points out significant remaining problems and directions for requirements modeling research.

  • The role of testing methodologies in open systems standards

    Publication Year: 1994 , Page(s): 233 - 240
    Cited by:  Papers (2)  |  Patents (1)

    This paper describes the lifecycle role of a conformance testing research facility in the open systems standards environment. This facility, the Clemson Automated Testing System (CATS), has demonstrated the value of integrating formalized test methods within all phases of standards development. IEEE's effort to develop a standard for operating systems interfaces (POSIX) has provided a working environment to investigate and evaluate the capabilities of CATS. In this arena, CATS has proven valuable in exposing critical issues in the emerging standard and in formulating feasible solutions on multiple occasions. The role of CATS in the areas of automated testing, profile development and real-time extensions is described. A discussion of future directions for CATS and testing in open system standards concludes the paper.

  • Workshop on software engineering and computer-human interaction: joint research issues

    Publication Year: 1994 , Page(s): 356 - 357
    Cited by:  Papers (1)

    Software engineering and computer-human interaction have much to do with each other, but their respective research communities typically have little interaction. The purpose of the article is to explore the intersections of these areas, determining what each community has to offer the other, and to identify and address open problems of mutual interest. Topics of discussion were drawn from the following: cost drivers, current products, prototyping, requirements, formal methods and specifications, testing and evaluation, design and development, architectures, user interfaces and software environments, CHI and CSCW concerns and toolkits.

  • Lessons from using Basic LOTOS

    Publication Year: 1994 , Page(s): 5 - 14
    Cited by:  Papers (7)

    We describe three case studies in the use of Basic LOTOS for electronic switching systems software. The studies cover design recovery, requirements specification, and design activities. We also report lessons learned from the studies. Early lessons suggested changes to the syntax of the language used, and the need for some specific analysis tools. The last case study reports some of the results of these changes.

  • Formal specification techniques

    Publication Year: 1994 , Page(s): 223 - 227
    Cited by:  Papers (1)

    Formal approaches to software specification and development have been a topic of active research for a long time. There now exists an important corpus of knowledge and results in this domain. There is more and more interest in the industrial applications of these techniques, even if it is generally observed that transfer is difficult in this area. The article surveys formal specification techniques, but, as it is difficult (and probably meaningless) to speak of such techniques independently from the development process, some formal development methods are discussed, as well as the impact of formal specifications on the development activities.

  • Facts and myths affecting software reuse

    Publication Year: 1994
    Cited by:  Papers (3)

    Discusses the three most important facts or myths affecting reuse. There is a great deal of misunderstanding about reuse in the software domain and it is difficult to pick out only three: there has been too much emphasis on the reuse of code; software reuse implies some form of modification of the artifact being reused; and software development processes do not explicitly support reuse, and in fact implicitly inhibit it.

  • Experiments on the effectiveness of dataflow- and control-flow-based test adequacy criteria

    Publication Year: 1994 , Page(s): 191 - 200
    Cited by:  Papers (178)  |  Patents (4)

    This paper reports an experimental study investigating the effectiveness of two code-based test adequacy criteria for identifying sets of test cases that detect faults. The all-edges and all-DUs (modified all-uses) coverage criteria were applied to 130 faulty program versions derived from seven moderate size base programs by seeding realistic faults. We generated several thousand test sets for each faulty program and examined the relationship between fault detection and coverage. Within the limited domain of our experiments, test sets achieving coverage levels over 90% usually showed significantly better fault detection than randomly chosen test sets of the same size. In addition, significant improvements in the effectiveness of coverage-based tests usually occurred as coverage increased from 90% to 100%. However, the results also indicate that 100% code coverage alone is not a reliable indicator of the effectiveness of a test set. We also found that tests based respectively on control-flow and dataflow criteria are frequently complementary in their effectiveness.
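
    As a small illustration of the all-edges measure used above (with made-up edge identifiers and per-test traces, not the paper's subject programs), the coverage level of a test set is simply the fraction of program edges its tests exercise:

        # Hypothetical control-flow edges of the program under test and the
        # edges each test case was observed to execute.
        program_edges = {"e1", "e2", "e3", "e4", "e5"}
        executed = {
            "t1": {"e1", "e2"},
            "t2": {"e2", "e3", "e4"},
            "t3": {"e1", "e5"},
        }

        def edge_coverage(test_set):
            """Fraction of program edges exercised by the given set of tests."""
            covered = set().union(*(executed[t] for t in test_set)) if test_set else set()
            return len(covered) / len(program_edges)

        print(edge_coverage({"t1", "t2"}))        # 0.8
        print(edge_coverage({"t1", "t2", "t3"}))  # 1.0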

  • Formalizing architectural connection

    Publication Year: 1994 , Page(s): 71 - 80
    Cited by:  Papers (79)  |  Patents (2)

    As software systems become more complex the overall system structure - or software architecture - becomes a central design problem. An important step towards an engineering discipline of software is a formal basis for describing and analyzing these designs. We present a theory for one aspect of architectural description, the interactions between components. The key idea is to define architectural connectors as explicit semantic entities. These are specified as a collection of protocols that characterize each of the participant roles in an interaction and how these roles interact. We illustrate how this scheme can be used to define a variety of common architectural connectors. We provide a formal semantics and show how this leads to a sound deductive system in which architectural compatibility can be checked in a way analogous to type checking in programming languages.
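
    The paper specifies connector roles as protocols and checks compatibility much as a type checker would; the sketch below is a heavily simplified stand-in that reduces each role and port to an event set (hypothetical names) and tests inclusion, rather than comparing full protocols as the paper does:

        # Greatly simplified: a connector role lists the events a participant must
        # be able to engage in; a component port lists the events it offers.
        connector_roles = {
            "caller": {"open", "request", "reply", "close"},
            "callee": {"accept", "request", "reply", "close"},
        }
        component_ports = {
            "ClientPort": {"open", "request", "reply", "close", "abort"},
            "ServerPort": {"accept", "request", "reply"},
        }

        def compatible(port, role):
            """A port can fill a role if it offers every event the role requires."""
            return connector_roles[role] <= component_ports[port]

        print(compatible("ClientPort", "caller"))  # True
        print(compatible("ServerPort", "callee"))  # False: 'close' is not offered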

  • Reuse facts and myths

    Publication Year: 1994
    Cited by:  Papers (1)

    This paper on software reuse is the view of a practitioner rather than of a scientist. Myth 1: OO technology eats up reuse. Fact 1: OO does not automatically yield high reuse rates-both OO and reuse can complement each other. Myth 2: Incentives are key to reuse success. Fact 2: Incentives create awareness, are cheap but don't change much. Myth 3: Reuse is for free. Fact 3: Reuse is a mid-term investment impacting the entire software development process. It must be based on a product strategy which spans several releases or a family of products.

  • Nico Habermann's research: a brief retrospective

    Publication Year: 1994 , Page(s): 149 - 153

    The last decade and a half of Nico Habermann's research career focused on software engineering, and in particular on software development environments. His earlier work was oriented more towards operating systems and programming language research. We take this opportunity to look back at his research, putting it in a larger perspective, identifying some general themes that characterize his contributions to software engineering in particular, and to computer science in general.

  • Automated construction of testing and analysis tools

    Publication Year: 1994 , Page(s): 241 - 250
    Cited by:  Papers (4)  |  Patents (1)

    Many software testing and analysis tools manipulate graph representations of programs, such as abstract syntax trees or abstract semantics graphs. Hand-crafting such tools in conventional programming languages can be difficult, error prone, and time consuming. Our approach is to use application generators targeted for the domain of graph-representation-based testing and analysis tools. Moreover, we generate the generators themselves, so that the development of tools based on different languages and/or representations can also be supported better. In this paper we report on our experiences in developing a system called Aria that generates testing and analysis tools based on an abstract semantics graph representation for C and C++ called Reprise. Aria itself was generated by the Genoa system. We demonstrate the utility of Aria and, thereby, the power of our approach, by showing Aria's use in the development of a tool that derives control dependence graphs directly from Reprise abstract semantics graphs.

  • Software reuse myths revisited

    Publication Year: 1994 , Page(s): 271 - 272
    Cited by:  Papers (3)

    In ACM SIGSOFT Software Engineering Notes, vol. 13, no. 1, pp. 17-21 (1988), the author published the paper “Software reuse myths”. This paper comments on these “myths” in the light of recent technology advances: (1) software reuse is a technical problem; (2) special tools are needed for software reuse; (3) reusing code results in huge increases in productivity; (4) artificial intelligence will solve the reuse problem; (5) the Japanese have solved the reuse problem; (6) Ada has solved the reuse problem; (7) designing software from reusable parts is like designing hardware using integrated circuits; (8) reused software is the same as reusable software; and (9) software reuse will just happen.
