Abstract
The combination of forecasts resulting from an ensemble of neural networks has been shown to outperform the use of a single "best" network model. This is supported by an extensive body of literature, which shows that combining generally leads to improvements in forecasting accuracy and robustness, and that using the mean operator often outperforms more complex methods of combining forecasts. This paper proposes a mode ensemble operator based on kernel density estimation, which unlike the mean operator is insensitive to outliers and deviations from normality, and unlike the median operator does not require symmetric distributions. The three operators are compared empirically and the proposed mode ensemble operator is found to produce the most accurate forecasts, followed by the median, while the mean has relatively poor performance. The findings suggest that the mode operator should be considered as an alternative to the mean and median operators in forecasting applications. Experiments indicate that mode ensembles are useful in automating neural network models across a large number of time series, overcoming issues of uncertainty associated with data sampling, the stochasticity of neural network training, and the distribution of the forecasts.
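A minimal sketch of the three ensemble operators the abstract compares, assuming the mode is estimated as the peak of a Gaussian kernel density fitted over the ensemble members at each forecast horizon (the paper's exact kernel and bandwidth choices are not given here; `scipy`'s default Scott's-rule bandwidth is used as a stand-in, and all names below are illustrative):

```python
import numpy as np
from scipy.stats import gaussian_kde

def combine_forecasts(forecasts, operator="mode", grid_points=512):
    """Combine an ensemble of forecasts (shape: n_models x horizon)
    with the mean, median, or a KDE-based mode operator."""
    forecasts = np.asarray(forecasts, dtype=float)
    if operator == "mean":
        return forecasts.mean(axis=0)
    if operator == "median":
        return np.median(forecasts, axis=0)
    if operator == "mode":
        combined = np.empty(forecasts.shape[1])
        for h in range(forecasts.shape[1]):
            sample = forecasts[:, h]
            # Fit a kernel density estimate over the ensemble members at
            # horizon h and take the mode as the point of maximum density.
            kde = gaussian_kde(sample)
            grid = np.linspace(sample.min(), sample.max(), grid_points)
            combined[h] = grid[np.argmax(kde(grid))]
        return combined
    raise ValueError(f"unknown operator: {operator}")

# Usage: 30 hypothetical network forecasts over a 12-step horizon
rng = np.random.default_rng(0)
ensemble = rng.normal(loc=100.0, scale=5.0, size=(30, 12))
print(combine_forecasts(ensemble, operator="mode"))
```

As the abstract notes, the KDE-based mode is insensitive to outlying ensemble members and does not assume a symmetric forecast distribution, unlike the mean and median respectively.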
| Original language | English |
| --- | --- |
| Pages (from-to) | 4235-4244 |
| Number of pages | 10 |
| Journal | Expert Systems with Applications |
| Volume | 41 |
| Issue number | 9 |
| Early online date | 12 Jan 2014 |
| DOIs | |
| Publication status | Published - 1 Jul 2014 |
Keywords
- Combination
- Ensembles
- Forecasting
- Kernel density estimation
- Mean
- Median
- Mode estimation
- Neural networks
- Time series
ASJC Scopus subject areas
- General Engineering
- Computer Science Applications
- Artificial Intelligence