Instance Space Analysis of Search-Based Software Testing

Citation Author(s):
Neelofar
Monash University
Submitted by:
Neelofar
Last updated:
Wed, 02/16/2022 - 20:05
DOI:
10.21227/e2pa-4p40

Abstract 

Search-based software testing (SBST) is now a mature area, with numerous techniques developed to tackle the increasingly challenging task of software testing. SBST techniques have shown promising results and have been successfully applied in industry to automatically generate test cases for large and complex software systems. Their effectiveness, however, has been shown to be problem dependent. In this paper, we revisit the problem of objective performance evaluation of SBST techniques in light of recent methodological advances – in the form of Instance Space Analysis (ISA) – enabling the strengths and weaknesses of SBST techniques to be visualised and assessed across the broadest possible space of problem instances (software classes) from common benchmark datasets. We identify features of SBST problems that explain why a particular instance is hard for an SBST technique, reveal areas of hard and easy problems in the instance space of existing benchmark datasets, and identify the strengths and weaknesses of state-of-the-art SBST techniques. In addition, we examine the diversity and quality of common benchmark datasets used in experimental evaluations.

Instructions: 

The file metadata.csv contains the data used in our paper, titled "Instance Space Analysis of Search-Based Software Testing".

The columns are described below:

  • The column "instances" contains the instance IDs of all problem instances.
  • Columns starting with "feature_" contain the feature values of each instance.
  • Columns starting with "algo_" contain the performance of each portfolio algorithm on each instance.
  • The column "source" records the benchmark dataset from which each problem instance originates.
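Given the column-naming convention above, a minimal Python sketch for splitting the file into feature and algorithm-performance columns might look as follows. The specific column names and values here (e.g. "feature_branches", "algo_A", "SF110") are illustrative assumptions, not taken from the actual dataset; in practice you would read metadata.csv directly.

```python
import csv
import io

# Hypothetical sample mimicking the metadata.csv schema described above;
# the real feature and algorithm column names may differ.
sample = """instances,feature_branches,feature_lines,algo_A,algo_B,source
c1,12,340,0.91,0.85,SF110
c2,3,57,1.00,0.97,benchmark_B
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Split columns by the documented prefixes.
feature_cols = [c for c in rows[0] if c.startswith("feature_")]
algo_cols = [c for c in rows[0] if c.startswith("algo_")]

for row in rows:
    features = {c: float(row[c]) for c in feature_cols}
    performance = {c: float(row[c]) for c in algo_cols}
    print(row["instances"], row["source"], features, performance)
```

To work with the actual dataset, replace the in-memory sample with `open("metadata.csv")`; the prefix-based split is the only assumption the sketch relies on.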