Ph.D. Defense of Dissertation Announcement
--------------------------------------------------------------
Raul Santelices
School of Computer Science, College of Computing
Georgia Institute of Technology
raul@cc.gatech.edu
Title: Change-effects Analysis for Effective Testing and Validation of Evolving Software
Date: Wednesday, April 25, 2012
Time: 11:15am - 1:45pm EDT
Location: KACB 3100
Committee:
Abstract:
The constant modification of software during its life cycle poses many challenges for developers and testers because changes might not behave as expected or may introduce erroneous side effects. For those reasons, it is of critical importance to analyze, test, and validate software every time it changes.
The most common method for validating modified software is regression testing, which identifies differences in the behavior of the software caused by changes and determines the correctness of those differences. Most research to date has focused on efficiently re-running the existing test cases affected by changes. However, little attention has been given to determining whether the test suite adequately exercises the effects of changes (i.e., the behavior differences in the modified software) and which of those effects are missed during testing. In practice, it is necessary not only to re-run the test suite, but also to augment it to exercise the untested effects.
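To make the distinction concrete, here is a minimal sketch (a hypothetical example, not taken from the dissertation) of how re-running an existing test suite can pass on a modified program while still missing the behavioral effect of a change, and how an augmented test exercises that effect:

```python
def price_v1(quantity):
    """Original version: flat per-unit pricing."""
    return quantity * 10

def price_v2(quantity):
    """Modified version: a bulk discount was added for large orders."""
    if quantity > 100:
        return quantity * 9  # the changed behavior
    return quantity * 10

def existing_test():
    # The old suite only covers small orders, so it passes on both
    # versions and never observes the behavior difference.
    assert price_v1(5) == 50
    assert price_v2(5) == 50

def augmented_test():
    # An augmented test targets an input where the change takes effect,
    # exposing the behavioral difference between the two versions.
    assert price_v2(200) == 1800
    assert price_v1(200) != price_v2(200)

existing_test()
augmented_test()
print("ok")
```

The point of the sketch: a green regression run says nothing about inputs (here, quantities over 100) whose behavior the change actually altered; test-suite augmentation is about finding and covering exactly those inputs.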
The thesis of this research is that the effects of changes on software behavior can be computed with unprecedented precision to help testers analyze the consequences of changes and augment test suites effectively. To demonstrate this thesis, this dissertation introduces a fundamental understanding of how changes affect the behavior of software. Building on these foundations, it presents and studies new techniques that detect and use these effects in cost-effective ways. These techniques support test-suite augmentation by identifying the effects of individual changes that should be analyzed and tested, identifying the combined effects of multiple changes, and optimizing the computation of these effects.
This dissertation makes the following contributions to the fields of software engineering and program analysis: