
Fourth IEEE International Conference on Software Engineering and Formal Methods (SEFM 2006)

Date: 11-15 Sept. 2006

  • Fourth IEEE International Conference on Software Engineering and Formal Methods [Cover]

    Page(s): c1
  • Fourth IEEE International Conference on Software Engineering and Formal Methods - Title

    Page(s): i - iii
  • Fourth IEEE International Conference on Software Engineering and Formal Methods - Copyright

    Page(s): iv
  • Fourth IEEE International Conference on Software Engineering and Formal Methods [Table of contents]

    Page(s): 263
  • Preface

    Page(s): viii
  • Committees

    Page(s): ix - x
  • Reviewers

    Page(s): xi
  • Modeling Heterogeneous Real-time Components in BIP

    Page(s): 3 - 12

    We present a methodology for modeling heterogeneous real-time components. Components are obtained as the superposition of three layers: behavior, specified as a set of transitions; interactions between transitions of the behavior; and priorities, used to choose among possible interactions. A parameterized binary composition operator is used to compose components layer by layer. We present the BIP language for the description and composition of layered components, as well as associated tools for executing and analyzing components on a dedicated platform. The language provides a powerful mechanism for structuring interactions involving rendezvous and broadcast. We show that synchronous and timed systems are particular classes of components. Finally, we provide examples and compare the BIP framework to existing frameworks for heterogeneous component-based modeling.

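The priority layer described in the abstract filters the set of enabled interactions, keeping only the maximal ones. A minimal sketch of that filtering step, in an illustrative encoding that is not BIP's actual syntax:

```python
# Sketch of the BIP priority layer: given the interactions enabled by the
# behavior and interaction layers, keep only those not dominated by another
# enabled interaction. The pair encoding of the priority relation is an
# illustrative assumption, not the BIP language.

def apply_priorities(enabled, priority):
    """priority: set of (lower, higher) pairs over interaction names."""
    return {i for i in enabled
            if not any((i, j) in priority and j in enabled for j in enabled)}
```

For example, with interactions `a` and `b` both enabled and `b` given priority over `a`, only `b` survives the filter.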
  • The Context of Object Computation (extended abstract)

    Page(s): 13 - 17

    A program, or in object-oriented programming a feature, is characterized not only by an implementation but by a contract specifying its intent and a proof obligation to ascertain that the implementation meets the contract. From these ideas it is possible to derive a general framework for discussing programs and program development.

  • Automatic Property Checking for Software: Past, Present and Future

    Page(s): 18 - 20

    Software validation is a very hard problem. Traditionally, most validation in our industry has been done by testing. Testing is the process of running software on representative inputs and checking if the software behaves as intended. There are various granularities in which testing is performed, ranging from unit tests that exercise small units of the system to system-wide tests. Over the past decade, automatic property checking tools that use static analysis have started providing a complementary approach to software validation. These tools are intended to augment, rather than replace, testing. These tools do not typically ensure that the software implements intended functionality correctly. Instead, they look for specific kinds of errors more thoroughly inside the program by analyzing how control and data flow through the program. This short paper surveys the state of the art in property checking tools and presents the author's personal perspective on future research in this area.

  • Harnessing Disruptive Innovation in Formal Verification

    Page(s): 21 - 30

    Technological innovations are sweeping through the field of formal verification. These changes are disruptive to tools based on interactive theorem proving, which need new ways to integrate the capabilities of novel technologies. I describe two approaches. One is the development and use of SMT solvers: these use techniques from theorem proving but apply them in ways that enable model checking, while also supporting highly automated theorem proving. The other is a proposal for an evidential tool bus: a loosely coupled architecture that allows many different verification components to collaborate to solve problems beyond the capability of any single component.

  • A Semi-Automatic Methodology for Repairing Faulty Web Sites

    Page(s): 31 - 40

    The development and maintenance of Web sites are difficult tasks. To maintain the consistency of ever-larger, complex Web sites, Web administrators need effective mechanisms that assist them in fixing every possible inconsistency. In this paper, we present a novel methodology for semi-automatically repairing faulty Web sites, which can be integrated on top of an existing rewriting-based verification technique developed in previous work. Starting from a categorization of the kinds of errors that can be found during Web verification activities, we formulate a stepwise transformation procedure that achieves correctness and completeness of the Web site w.r.t. its formal specification while respecting the structure of the document (e.g. the schema of an XML document). Finally, we briefly describe a prototype implementation of the repair tool, which we used for an experimental evaluation of our method.

  • On Bisimilarities Induced by Relations on Actions

    Page(s): 41 - 49

    In this paper, we give a straightforward generalization of bisimulations to "bisimulations induced by a pair of relations" on the underlying action set. We establish that many of the nice properties of bisimulations and bisimilarities may be thought of as actually being inherited from properties of the underlying relations on actions. We show that many bisimulation-based orderings (including strong and weak bisimilarity) defined in the literature are instances of this generalization. We also show by an example that there are instances where the equivalence of two systems (which intuitively have the same functionality) cannot be established directly by observational equivalence, but requires a more general notion. We finally give an adaptation of the "on-the-fly" algorithm of Fernandez and Mounier for computing generalized bisimilarities.

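The generalization in the abstract can be illustrated with a naive fixpoint computation: two states are related if every move of one is matched by a move of the other whose action is related under a relation R on the action set (R = identity recovers strong bisimulation). The labelled-transition-system encoding below is an illustrative assumption, not the paper's notation, and the algorithm is the naive refinement loop rather than the on-the-fly algorithm the paper adapts.

```python
# Naive check of bisimilarity induced by a relation R on actions.
# trans: set of (state, action, state) triples; R: set of (action, action) pairs.

def bisimilar(states, trans, R, p, q):
    # Start from the full relation and repeatedly discard pairs that
    # violate the (R-relaxed) transfer condition, until a fixpoint.
    rel = {(s, t) for s in states for t in states}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            # every s-move must be matched by an R-related t-move ...
            fwd = all(any((a, b) in R and (s2, t2) in rel
                          for (t1, b, t2) in trans if t1 == t)
                      for (s1, a, s2) in trans if s1 == s)
            # ... and every t-move by an R-related s-move
            bwd = all(any((a, b) in R and (s2, t2) in rel
                          for (s1, a, s2) in trans if s1 == s)
                      for (t1, b, t2) in trans if t1 == t)
            if not (fwd and bwd):
                rel.discard((s, t))
                changed = True
    return (p, q) in rel
```

With transitions 0 -a-> 1 and 2 -b-> 3, states 0 and 2 are bisimilar when R relates a and b, but not when R is the identity.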
  • Formalizing AspectJ Weaving for Static Pointcuts

    Page(s): 50 - 59

    Aspect-oriented programming is a programming paradigm that provides a means of encapsulating crosscutting concerns in software. This paper describes a formal semantics of advice weaving in AspectJ, an aspect-oriented programming language that extends Java. Advice weaving is performed on the bytecode in regions of the code that correspond to join points declared by pointcuts. AspectJ provides two kinds of pointcuts: static and dynamic. Static pointcuts quantify over static properties of join points and thus correspond directly to locations in the bytecode, whereas dynamic pointcuts quantify over dynamic properties of join points and cannot be definitively mapped to places in the bytecode. In this paper, we focus only on static pointcuts.

  • Filtering Retrenchments into Refinements

    Page(s): 60 - 69

    Retrenchment is a weakening of model based refinement that enables many development steps not expressible by refinement to be formally described nevertheless. The greater flexibility of retrenchment comes at the price of much feebler guarantees as compared with refinement, and so the interplay between retrenchment and refinement can hope to offer the best of both worlds. The paper explores the strategy of filtering the information in a retrenchment to yield a refinement under a suitable notion of observation. A general construction is given that enables a retrenchment, with its intrinsic notion of observability, to be filtered to produce a refinement with its intrinsic notion of observability. A simple running example illustrates the theory.

  • Computing Complete Test Graphs for Hierarchical Systems

    Page(s): 70 - 79

    Conformance testing focuses on checking whether an implementation under test (IUT) behaves according to its specification. Typically, testers are interested in performing targeted tests that exercise certain features of the IUT. This intention is formalized as a test purpose. The tester needs a "strategy" to reach the goal specified by the test purpose. Also, for a particular test case, the strategy should tell the tester whether the IUT has passed, failed, or deviated from the test purpose. Jéron and Morel (1999) show how to compute, for a given finite state machine specification and a test purpose automaton, a complete test graph (CTG) which represents all test strategies. In this paper, we consider the case when the specification is a hierarchical state machine and show how to compute a hierarchical CTG which preserves the hierarchical structure of the specification. We also propose an algorithm for an online test oracle which avoids the space overhead associated with the CTG.

  • Jose: Aspects for Design by Contract

    Page(s): 80 - 89

    Design by contract is a practical methodology for evolving code together with its specification. The contract has important methodological implications on the design of the program. In addition, tools that instrument the code to check for contract violations help the development process by catching errors close to their sources. This is complicated by several factors, such as the need to collect preconditions from supertypes. There are two issues involved in the implementation of such a tool: the correct enforcement of the theoretical principles, and the instrumentation of the code. Most previous tools tackle both issues, but have subtle failures in one or the other. This paper describes Jose, a tool for design by contract in Java, which uses AspectJ, an aspect-oriented extension of Java, to instrument the program. This allows us to leverage the expertise of the AspectJ developers in instrumenting Java programs, and to concentrate on the correct implementation of the design-by-contract principles. This approach has the added benefit that it can be generalized to other object-oriented languages that have aspect-oriented extensions. We describe the design decisions made in the implementation of Jose, and the features of AspectJ that helped or hindered this implementation.

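The runtime contract checking that Jose weaves into Java code with AspectJ can be sketched in miniature: intercept a call, check the precondition on entry and the postcondition (which may refer to the result) on exit. The decorator below is an illustrative Python analogue, not Jose's API or AspectJ code.

```python
import functools

# Illustrative analogue of contract instrumentation: a decorator plays the
# role of the woven advice, checking pre- and postconditions around each call.
# All names here are hypothetical.

def contract(pre=None, post=None):
    def wrap(f):
        @functools.wraps(f)
        def checked(*args, **kwargs):
            if pre is not None:
                # contract violation on entry is reported at the call site
                assert pre(*args, **kwargs), f"precondition of {f.__name__} violated"
            result = f(*args, **kwargs)
            if post is not None:
                # postcondition may relate the result to the arguments
                assert post(result, *args, **kwargs), f"postcondition of {f.__name__} violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0, post=lambda r, x: abs(r * r - x) < 1e-9)
def isqrt_like(x):
    return x ** 0.5
```

Calling `isqrt_like(-1)` fails the precondition at the boundary, before the body runs, which is exactly the "errors caught close to their sources" benefit the abstract describes.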
  • Discern: Towards the Automatic Discovery of Software Contracts

    Page(s): 90 - 99

    Design by contract is a practical methodology for evolving code together with its specification; it helps prevent many errors, and catch others close to their sources. Unfortunately, writing (and maintaining) contracts requires a non-trivial investment of time and effort. We are developing a tool, called Discern, to statically analyze existing programs and discover draft contracts for them. Discern works by propagating weakest preconditions and strongest postconditions through the code. Known pre- and postconditions of operations used in the code help refine the contract; conversely, new assertions discovered can be propagated to clients of the method being analyzed. As usual, loops make the analysis difficult; heuristics are used to recognize the most common loop forms and extract useful information about them. Discern uses a library containing specifications of the most-used language libraries. With the manual addition of one postcondition and the tagging of three assertions as invariants, Discern computed the correct preconditions for all but one method of Java's Vector class, and similar results were obtained for StringBuffer.

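The weakest-precondition propagation at the core of Discern can be sketched semantically: treating a predicate as a function of the program state, the weakest precondition of an assignment `x := e` w.r.t. postcondition Q is Q evaluated in the state where x has been replaced by e. This sketch covers only straight-line assignments; the loop heuristics and library specifications from the paper are omitted, and all names are illustrative.

```python
# Semantic sketch of backward weakest-precondition propagation over a
# sequence of assignments. A predicate is a function from state (a dict)
# to bool; an assignment is a (variable, expression-function) pair.

def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr(state)` w.r.t. `post`."""
    return lambda state: post({**state, var: expr(state)})

def wp_seq(statements, post):
    """Propagate a postcondition backwards through a list of assignments."""
    for var, expr in reversed(statements):
        post = wp_assign(var, expr, post)
    return post

# Program: y := x + 1; z := y * 2   with postcondition z > 4
prog = [("y", lambda s: s["x"] + 1), ("z", lambda s: s["y"] * 2)]
pre = wp_seq(prog, lambda s: s["z"] > 4)
# pre holds exactly when (x + 1) * 2 > 4, i.e. x > 1
```

A tool like Discern works syntactically (producing a readable formula such as `x > 1`) rather than semantically as here, but the backward propagation step is the same.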
  • A Model for Temporal Relations between Object Roles

    Page(s): 100 - 107

    The concept of roles has been advocated to model application domain objects which evolve dynamically during their lifespan. These objects may acquire new roles and drop old ones. Several research efforts have focused on formalizing roles as a conceptual unit and on their mappings to classes and objects of class-based languages. This paper presents a formal notation for modelling temporal relationships between roles using the notion of semi-intervals rather than intervals. A semi-interval is a partially ordered set of time instances for which the endpoints are either not known or not relevant. Each role and its instances are associated with a lifespan, which is a set of semi-intervals. The temporal relations are defined in terms of relationships between the lifespans of roles. An algorithm for computing the transitive closure of temporal relations is presented for inferring implicit relations. Both explicit and implicit relations define constraints which must be honored when acquiring and dropping roles. A simple framework has been implemented in Java to demonstrate the usability of these concepts.

  • A PVS Based Framework for Validating Compiler Optimizations

    Page(s): 108 - 117

    An optimization can be specified as a sequential composition of predefined transformation primitives. For each primitive, we can define soundness conditions which guarantee that the transformation is semantics preserving. An optimization of a program preserves semantics if all applications of the primitives in the optimization satisfy their respective soundness conditions on the versions of the input program on which they are applied. This scheme does not directly check semantic equivalence of the input and the optimized programs and is therefore amenable to automation. Automating this scheme however requires a trusted framework for simulating transformation primitives and checking their soundness conditions. In this paper, we present the design of such a framework based on PVS. We have used it for specifying and validating several optimizations, viz. common subexpression elimination, optimal code placement, lazy code motion, loop invariant code motion, and full and partial dead code elimination.

  • Formal Modelling and Verification of an Asynchronous DLX Pipeline

    Page(s): 118 - 127

    A five-stage pipeline of an asynchronous DLX processor is modelled and its control flow verified. The model is built as an asynchronous pipeline of latches separated by processing logic. We model two versions of this pipeline: one using four-phase semi-decoupled latch controllers and another using a fully-decoupled protocol. All the processing units are modelled as processes in the PROMELA language of the Spin tool. The model is verified in Spin by means of assertions, LTL properties and progress labels. A useful observation from the study is that although the semi-decoupled protocol has the potential to hold a data item in every latch, in the presence of processing logic at most alternate stages can execute at a given time; the implication is that no pipeline stalls are necessary in the case of control and data hazards. In the fully-decoupled version, all stages can execute valid instructions at the same time. All the models were verified to be free from deadlock.

  • Product Automata and Process Algebra

    Page(s): 128 - 136

    We define a model of labelled product systems of automata and explore its connections with process calculi and trace languages. Bisimilarity of labelled product systems is defined using a new definition of bisimulation with renaming. Concurrent mu-expressions are defined to describe labelled product systems. This leads to complete axiomatizations and algorithms for bisimulation and failure equivalence over labelled product systems, and for equality over recognizable trace languages.

  • Verification of JAVA CARD Applets Behavior with Respect to Transactions and Card Tears

    Page(s): 137 - 146

    The Java Card transaction mechanism makes it possible to protect sensitive operations on smart cards against problems due to card tears or power losses. Statements within a transaction are viewed as a single atomic operation, so that either they are all performed or none of them is. KRAKATOA is a tool for static verification of Java programs annotated in JML (Java Modeling Language), a behavioral specification language tailored to Java and based on first-order predicate logic. First, we show how we modeled transactions within KRAKATOA, by generating on-the-fly (i.e. for each applet) specifications of the API methods for transactions. Second, we consider security problems that can be caused by a card tear. We propose new JML constructs for expressing properties that must be satisfied when a method is interrupted by a card tear, also taking non-atomic methods into account. We present a modeling of these constructs in KRAKATOA, and show that it is practical for detecting potential security holes or proving the absence of risk.

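The transaction semantics the paper verifies (all updates inside a transaction take effect, or none do) can be modelled in a few lines. The snapshot-based store below is an illustrative Python sketch, not the Java Card API; an aborted transaction stands in for a card tear rolling back uncommitted writes.

```python
# Toy model of Java Card transaction atomicity: writes between begin and
# commit either all persist or are all rolled back on abort (card tear).

class TransactionalStore:
    def __init__(self):
        self.data = {}
        self._backup = None

    def begin_transaction(self):
        assert self._backup is None, "nested transactions not allowed"
        self._backup = dict(self.data)   # snapshot for possible rollback

    def write(self, key, value):
        self.data[key] = value

    def commit_transaction(self):
        self._backup = None              # discard snapshot; updates persist

    def abort_transaction(self):
        self.data = self._backup         # card tear: restore the snapshot
        self._backup = None
```

Properties such as "a value written inside an interrupted transaction is never observable afterwards" are the kind the proposed JML constructs let one state and check for real applets.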
  • Specifying Data-Flow Requirements for the Automated Composition of Web Services

    Page(s): 147 - 156

    One of the fundamental ideas of Web services and service-oriented architecture is the possibility of developing new applications by composing existing services that are available on the Web. Several approaches have been proposed to tackle the problem of Web service composition, but little effort has been devoted so far to the problem of modeling the requirements of the composition. However, it is clear that the possibility to express requirements specifying complex interaction patterns among the component services is an essential step to cope with a wide range of composition problems. In this paper we present a new model which addresses one of the key aspects of composition requirements, namely the data flow among the component services. We develop graphical notations and a formal theory for the new model, and we integrate it within an existing automated composition framework.

  • Requirements Modeling -- Experience from an Insurance Project

    Page(s): 157 - 166

    Most errors in a software development life cycle are introduced in the requirements phase. Rigorous specifications and automatic analysis can address this problem. However, there are almost no tools based on formal analysis that can scale up to business systems, and there is very little literature reporting on the usefulness and scalability of formal analysis of industrial-scale business specifications. The requirements modeling tool developed at TRDDC is a rare exception: a visual, intuitive yet formal notation with analysis support that finds gaps and inconsistencies in functional requirements. This paper is an experience report on the tool's usage for capturing functional requirements for a real-life insurance project. Four use cases were modeled and analysed formally, one very complex and the others moderately complex. The results are promising: around 150 queries were raised, pointing to gaps, inconsistencies or ambiguities in the requirements.
