Publication

Mining Performance Regression Inducing Code Changes in Evolving Software

Luo, Qi
Poshyvanyk, Denys
Grechanik, Mark
Abstract
During software evolution, the source code of a system frequently changes due to bug fixes or new feature requests. Some of these changes may accidentally degrade the performance of a newly released software version. A notable problem in regression testing is how to find the problematic changes (out of a large number of committed changes) that may be responsible for performance regressions under certain test inputs. We propose a novel recommendation system, coined PerfImpact, for automatically identifying code changes that may be responsible for performance regressions, using a combination of search-based input profiling and change impact analysis techniques. PerfImpact independently sends the same input values to two releases of the application under test, and uses a genetic algorithm to mine execution traces and explore a large space of input value combinations to find specific inputs that take longer to execute in the new release. Since these input values are likely to expose performance regressions, PerfImpact automatically mines the corresponding execution traces to evaluate the impact of each code change on performance and ranks the changes by their estimated contribution to the regressions. We implemented PerfImpact and evaluated it on different releases of two open-source web applications. The results demonstrate that PerfImpact effectively detects input value combinations that expose performance regressions and mines the code changes that are likely to be responsible for them.
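The search step described in the abstract can be illustrated with a minimal sketch: a genetic algorithm whose fitness for an input is the slowdown of the new release relative to the old one, so evolution is driven toward regression-exposing inputs. Everything below is hypothetical and not from the paper — the timing functions simulate a regression for inputs above a threshold (real measurements would come from execution traces of the two releases), and the single-integer input encoding, selection, crossover, and mutation operators are illustrative choices only.

```python
import random

# Hypothetical stand-ins for measuring execution time of the old and
# new release on one input value; a simulated regression appears for
# inputs above 50 in the new release.
def time_old_release(x):
    return 1.0 + 0.01 * x

def time_new_release(x):
    return 1.0 + 0.01 * x + (0.5 * x if x > 50 else 0.0)

def fitness(x):
    # Larger slowdown in the new release => fitter individual, so the
    # search is pushed toward inputs that expose the regression.
    return time_new_release(x) - time_old_release(x)

def evolve(pop_size=20, generations=30, lo=0, hi=100, seed=1):
    rng = random.Random(seed)
    pop = [rng.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # selection: keep the fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2            # crossover: average two parents
            if rng.random() < 0.2:          # mutation: small random perturbation
                child = min(hi, max(lo, child + rng.randint(-10, 10)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In this toy setup the search converges on inputs above the simulated threshold, where the version-to-version slowdown is largest; PerfImpact then mines the traces of such inputs to rank individual code changes.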
Date
2016-01-01
Department
Computer Science
DOI
https://doi.org/10.1145/2901739.2901765