A Comparative Study of Three Program Exploration Tools

Authors:
Brian de Alwis (University of British Columbia); Gail C. Murphy; Martin P. Robillard

Abstract:

Programmers need tools to help them explore large software systems when performing software evolution tasks. A variety of tools have been created to improve the effectiveness of such exploration. The usefulness of these tools has been argued largely on the basis of case studies, small narrowly-focussed experiments, or non-human-based experiments. In this paper, we report on a more rigorously controlled study of three specialized software exploration tools, in which professional programmers used the tools to plan complex change tasks to a medium-sized code base. We found that the tools had little apparent effect; the effects observed instead appear to be dominated by the individual styles and strategies of the programmers and by characteristics of the tasks. In addition to presenting the results of the study, this paper introduces the use of two experimental evaluation aids: the NASA task load index (TLX) for assessing task difficulty, and distance profiles for assessing the extent to which programmers remain on track.

Published in:

15th IEEE International Conference on Program Comprehension (ICPC '07)

Date of Conference:

26-29 June 2007