Abstract
Most studies of Estimation of Distribution Algorithms (EDAs) are restricted to low-dimensional problems because EDAs are susceptible to the curse of dimensionality. Among the methods that try to scale EDAs up to high-dimensional problems, EDA-MCC was recently proposed. It controls the complexity of the search distribution by thresholding correlation estimates, as a means to approximate the prominent dependency structure among the search variables and discard irrelevant detail. However, the correlation coefficient is only guaranteed to detect statistical dependence when the data distribution is Gaussian. In this paper, we develop a new variant of EDA-MCC, called EDA-MCC-MI, which replaces linear correlation with mutual information (MI) estimates to determine dependencies between the search variables. Our method is better placed to recover the correct dependency structure than EDA-MCC, simply because MI is zero if and only if the variables are independent, whereas zero correlation does not imply independence in general. Empirical comparisons show that EDA-MCC-MI is never worse than EDA-MCC, even when the search distribution is Gaussian. Our implementation employs a nonparametric MI estimator, so it extends easily to any other, non-Gaussian search distribution.
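The abstract's key point, that zero correlation does not imply independence while zero MI does, can be illustrated with a minimal sketch. The histogram-based MI estimator below is a generic nonparametric plug-in estimate chosen for illustration; it is not the specific estimator used in the paper. The example pairs a variable with its square, a dependence that is invisible to linear correlation but clearly visible to MI:

```python
import numpy as np

def hist_mutual_information(x, y, bins=20):
    """Plug-in (histogram) estimate of mutual information, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)
y = x ** 2 + rng.normal(0.0, 0.05, x.size)     # dependent on x, yet uncorrelated

corr = np.corrcoef(x, y)[0, 1]
mi = hist_mutual_information(x, y)
print(f"correlation ~ {corr:.3f}, MI estimate ~ {mi:.3f} nats")
```

With a threshold on |correlation|, as in EDA-MCC, this pair would be treated as independent; a threshold on the MI estimate, as in EDA-MCC-MI, would correctly mark it as dependent.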
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the IEEE Congress on Evolutionary Computation |
| Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
| Number of pages | 8 |
| DOIs | |
| Publication status | E-pub ahead of print - 21 Nov 2016 |
| Event | IEEE Congress on Evolutionary Computation 2016, Vancouver, Canada, 25 Jul 2016 → 29 Jul 2016 |
Conference

| Conference | IEEE Congress on Evolutionary Computation 2016 |
| --- | --- |
| Country/Territory | Canada |
| Period | 25/07/16 → 29/07/16 |