Abstract
Software effort estimation (SEE) usually suffers from inherent uncertainty arising from predictive model limitations and data noise. Relying on point estimates alone may ignore these uncertain factors and lead project managers (PMs) to poor decisions. Prediction intervals (PIs) with confidence levels (CLs) present a more reasonable representation of reality, potentially helping PMs to make better-informed decisions and allowing more flexibility in those decisions. However, existing methods for constructing PIs either have strong limitations or are unable to provide informative PIs. To develop a ‘better’ effort predictor, we propose a novel PI estimator called Synthetic Bootstrap ensemble of Relevance Vector Machines (SynB-RVM) that adopts Bootstrap resampling to produce multiple RVM models from modified training bags in which replicated data projects are replaced by their synthetic counterparts. We then provide three ways to ensemble these RVM models into a final probabilistic effort predictor, from which PIs with CLs can be generated. When used as a point estimator, SynB-RVM either significantly outperforms or performs similarly to the other investigated methods. When used as an uncertain predictor, SynB-RVM achieves significantly narrower PIs than its base learner RVM, and its hit rates and relative widths are no worse than those of the other compared methods that provide uncertain estimates.
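The abstract outlines the core pipeline: bootstrap bags whose duplicated projects are swapped for synthetic counterparts, probabilistic base learners trained on each bag, and an ensemble from which PIs at a chosen CL are derived. The sketch below illustrates that pipeline only in broad strokes and is not the paper's algorithm: scikit-learn's `ARDRegression` stands in for an RVM, Gaussian jitter of duplicated rows stands in for the paper's synthetic-replacement step, and simple mixture-moment pooling replaces the three ensembling schemes described in the article. The function name and parameters (`synb_like_predict`, `n_bags`, `cl`) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import ARDRegression


def synb_like_predict(X_train, y_train, X_test, n_bags=25, cl=0.90, seed=0):
    """Hypothetical bootstrap-ensemble PI estimator loosely inspired by SynB-RVM."""
    rng = np.random.default_rng(seed)
    n, d = X_train.shape
    means, stds = [], []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)            # bootstrap bag (sampled with replacement)
        Xb, yb = X_train[idx].astype(float), y_train[idx]
        # Keep the first copy of each sampled project; treat further copies as replicas.
        _, first_pos = np.unique(idx, return_index=True)
        dup = np.ones(n, dtype=bool)
        dup[first_pos] = False
        # Illustrative "synthetic replacement": small Gaussian jitter on the replicas.
        scale = 0.05 * X_train.std(axis=0) + 1e-12
        Xb[dup] += rng.normal(0.0, scale, size=(dup.sum(), d))
        model = ARDRegression().fit(Xb, yb)         # probabilistic stand-in for an RVM
        m, s = model.predict(X_test, return_std=True)
        means.append(m)
        stds.append(s)
    means, stds = np.asarray(means), np.asarray(stds)
    # Pool the per-model Gaussians via mixture moments into one predictive Gaussian.
    mu = means.mean(axis=0)
    var = (stds ** 2 + means ** 2).mean(axis=0) - mu ** 2
    half = norm.ppf(0.5 + cl / 2.0) * np.sqrt(var)  # half-width at confidence level cl
    return mu, mu - half, mu + half                 # point estimate, lower PI bound, upper PI bound
```

For example, `mu, lo, hi = synb_like_predict(X, y, X_new, cl=0.90)` would yield a point estimate and a nominal 90% PI for each new project, mirroring the kind of uncertain effort estimate the paper evaluates.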
| Original language | English |
|---|---|
| Article number | 5 |
| Number of pages | 43 |
| Journal | ACM Transactions on Software Engineering and Methodology |
| Volume | 28 |
| Issue number | 1 |
| Early online date | 23 Feb 2019 |
| DOIs | |
| Publication status | E-pub ahead of print - 23 Feb 2019 |
Keywords
- Bootstrap resampling
- Ensemble learning
- Prediction intervals with confidence levels
- Relevance vector machine
- Software effort estimation
- Software risk management
- Synthetic replacement
- Uncertain effort estimation
ASJC Scopus subject areas
- Software