Automatic performance tools must, of course, themselves be tested to ensure that they perform their task correctly. Because performance tools are meta-programs, tool testing is more complex than ordinary program testing and comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Second, it must be verified that the tools collect the correct performance data, as required by their specification. Finally, it must be checked that the tools indeed perform their intended tasks and detect relevant performance problems. Focusing on this last (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties and/or real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools that assist the user in assembling the pieces into executable test programs. Such a test suite would clearly be highly useful to builders of performance analysis tools, yet surprisingly no systematic effort has so far been undertaken to provide one. In this paper we discuss the initial design of a test suite for checking the correctness (in the above sense) of automatic performance analysis tools. In particular, we describe a collection of synthetic test functions that makes it easy to construct both simple and more complex test programs with desired performance properties.
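To illustrate the idea of a synthetic test function with a controllable performance property (this is only a sketch, not the paper's actual test-suite code; the names `do_work` and `test_program` are hypothetical), consider a routine that busy-loops for a caller-specified duration, so that the share of execution time a profiler should attribute to each call site is known in advance:

```python
import time

def do_work(seconds):
    """Hypothetical synthetic test function: busy-loop for a
    caller-specified wall-clock duration, creating a hot spot
    of known, controllable size."""
    end = time.perf_counter() + seconds
    n = 0
    while time.perf_counter() < end:
        n += 1  # burn CPU rather than sleep, so the time is attributable
    return n

def test_program():
    # A correct profiler run on this program should attribute
    # roughly 80% of the execution time to the first call
    # and roughly 20% to the second.
    do_work(0.4)
    do_work(0.1)

if __name__ == "__main__":
    test_program()
```

Because the intended time distribution is fixed by construction, the tool's measured profile can be compared against it to check that the tool collects correct data and reports the dominant hot spot.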