An exploratory study of the state of practice of performance testing in Java-based open source projects
A peer-reviewed article of this preprint also exists.
Author and article information
Abstract
The use of open source (OS) software is nowadays widespread across many industries and domains. While the functional quality of OS projects is considered to be on par with that of closed-source software, much is unknown about their quality in terms of non-functional attributes, such as performance. One challenge for OS developers is that, unlike for functional testing, there is a lack of accepted best practices for performance testing. To reveal the state of practice of performance testing in OS projects, we conduct an exploratory study on 111 Java-based OS projects from GitHub. We study the performance tests of these projects from five perspectives: (1) the developers, (2) the size, (3) the organization, (4) the types of performance tests, and (5) the tooling used for performance testing. First, in a quantitative study we show that writing performance tests is not a popular task in OS projects: performance tests form only a small portion of the test suite, are rarely updated, and are usually maintained by a small group of core project developers. Second, we show through a qualitative study that even though many projects are aware that they need performance tests, developers appear to struggle to implement them. We argue that future performance testing frameworks should provide better support for low-friction testing, for instance via non-parameterized methods or performance test generation, as well as focus on tight integration with standard continuous integration tooling.
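To illustrate what a low-friction, non-parameterized performance test can look like in a Java project, the sketch below shows a minimal microbenchmark written against JMH (the Java Microbenchmark Harness), one of the tools commonly used for this purpose. The class, method, and workload are purely illustrative assumptions and are not taken from any of the studied projects or from the paper itself.

```java
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

/**
 * Illustrative example only (hypothetical class and workload): a minimal,
 * non-parameterized JMH microbenchmark that measures the average time of a
 * simple string-concatenation operation.
 */
@State(Scope.Benchmark)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
public class StringConcatBenchmark {

    private String left = "hello, ";
    private String right = "world";

    @Benchmark
    public String concat() {
        // Returning the result lets JMH consume it, preventing dead-code elimination.
        return left + right;
    }

    public static void main(String[] args) throws RunnerException {
        // Run the benchmark programmatically; in a real project this is typically
        // driven by a Maven or Gradle JMH plugin instead.
        Options options = new OptionsBuilder()
                .include(StringConcatBenchmark.class.getSimpleName())
                .forks(1)
                .warmupIterations(3)
                .measurementIterations(5)
                .build();
        new Runner(options).run();
    }
}
```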
Cite this as
Leitner P, Bezemer C-P. 2016. An exploratory study of the state of practice of performance testing in Java-based open source projects. PeerJ Preprints 4:e2496v2 https://doi.org/10.7287/peerj.preprints.2496v2
Author comment
This is a slight update to the previous version. No conclusions have been changed.
Additional Information
Competing Interests
Philipp Leitner is an academic editor of PeerJ Computer Science.
Author Contributions
Philipp Leitner conceived and designed the experiments, performed the experiments, analyzed the data, wrote the paper, prepared figures and/or tables, reviewed drafts of the paper.
Cor-Paul Bezemer conceived and designed the experiments, performed the experiments, analyzed the data, wrote the paper, prepared figures and/or tables, reviewed drafts of the paper.
Data Deposition
The following information was supplied regarding data availability:
An online appendix to this publication is available on GitHub:
https://xleitix.github.io/appendix_perf_tests/
Funding
Philipp Leitner has received funding from the Swiss National Science Foundation (SNF) under project MINCA (Models to Increase the Cost Awareness of Cloud Developers). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.