13th Working Conference on Mining Software Repositories (MSR 2016)
During software evolution, the source code of a system frequently changes due to bug fixes or new feature requests. Some of these changes may accidentally degrade the performance of a newly released software version. A notable problem in regression testing is finding the problematic changes (out of a large number of committed changes) that may be responsible for performance regressions under certain test inputs. We propose a novel recommendation system, coined PerfImpact, for automatically identifying code changes that may be responsible for performance regressions, using a combination of search-based input profiling and change impact analysis techniques. PerfImpact independently sends the same input values to two releases of the application under test, and uses a genetic algorithm to mine execution traces and explore a large space of input value combinations to find specific inputs that take longer to execute in the new release. Since these input values are likely to expose performance regressions, PerfImpact automatically mines the corresponding execution traces to estimate the impact of each code change on performance, and ranks the changes by their estimated contribution to performance regressions. We implemented PerfImpact and evaluated it on different releases of two open-source web applications. The results demonstrate that PerfImpact effectively detects input value combinations that expose performance regressions and mines the code changes that are likely responsible for these regressions.
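The abstract does not give PerfImpact's internals; purely as an illustration, the search step can be sketched as a genetic algorithm whose fitness rewards input combinations that widen the execution-time gap between the two releases. The timing functions, gene ranges, and GA parameters below are hypothetical placeholders, not the paper's implementation:

```python
import random

# Hypothetical stand-ins for timing one input vector on each release;
# in practice these would be measured from instrumented executions.
def time_old(inputs):
    return sum(inputs)              # placeholder cost model, old release

def time_new(inputs):
    # Placeholder regression: the new release slows down as the
    # first input value grows.
    return sum(inputs) + 2 * inputs[0]

def fitness(inputs):
    # Inputs that widen the new-minus-old execution-time gap are fitter,
    # since they are more likely to expose a performance regression.
    return time_new(inputs) - time_old(inputs)

def evolve(pop_size=20, genes=3, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 100) for _ in range(genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(genes)] = rng.randint(0, 100)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Under this toy cost model the search converges toward inputs with a large first value, mirroring how PerfImpact steers input generation toward value combinations that run disproportionately slower in the new release.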
Luo, Qi; Poshyvanyk, Denys; and Grechanik, Mark, Mining Performance Regression Inducing Code Changes in Evolving Software (2016). 13th Working Conference on Mining Software Repositories (MSR 2016).