Large scale continuous EDA using mutual information

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

External organisations

  • University of Birmingham


Most studies of Estimation of Distribution Algorithms (EDA) are restricted to low-dimensional problems because EDA is susceptible to the curse of dimensionality. Among the methods that try to scale EDA up to high-dimensional problems, EDA-MCC was recently proposed. It controls the complexity of the search distribution by thresholding correlation estimates, as a means to approximate the prominent dependency structure among the search variables and discard irrelevant detail. However, it is known that the correlation coefficient can fully characterise statistical dependence only when the data distribution is Gaussian. In this paper, we develop a new variant of EDA-MCC, called EDA-MCC-MI, which uses mutual information (MI) estimates in place of linear correlation to determine dependencies between the search variables. Our method is better placed than EDA-MCC to recover the correct dependency structure, simply because MI is zero if and only if the variables are independent, whereas zero correlation does not imply independence in general. Empirical comparisons show that EDA-MCC-MI is never worse than EDA-MCC, even when the search distribution is Gaussian. Our implementation employs a nonparametric MI estimator, so it extends easily to any other, non-Gaussian search distribution.
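The core idea described above — estimating pairwise dependence nonparametrically and thresholding it to keep only the prominent structure — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function names (`mutual_information`, `dependency_structure`), the histogram-based MI estimator, the bin count, and the threshold value are all assumptions made here for demonstration; the paper's actual nonparametric estimator may differ.

```python
import math
import random

def mutual_information(x, y, bins=8):
    """Histogram-based (nonparametric) MI estimate, in nats, between two samples.
    Estimator choice and bin count are illustrative assumptions."""
    n = len(x)
    def bin_index(v, lo, hi):
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    lx, hx = min(x), max(x)
    ly, hy = min(y), max(y)
    joint = {}                 # counts of (x-bin, y-bin) pairs
    px = [0] * bins            # marginal counts for x
    py = [0] * bins            # marginal counts for y
    for xi, yi in zip(x, y):
        bx, by = bin_index(xi, lx, hx), bin_index(yi, ly, hy)
        joint[(bx, by)] = joint.get((bx, by), 0) + 1
        px[bx] += 1
        py[by] += 1
    mi = 0.0
    for (bx, by), c in joint.items():
        pxy = c / n
        mi += pxy * math.log(pxy / ((px[bx] / n) * (py[by] / n)))
    return mi

def dependency_structure(samples, threshold=0.1):
    """Treat variable pairs (i, j) whose estimated MI exceeds the threshold
    as dependent -- analogous to EDA-MCC's thresholding of correlation
    estimates, but able to detect nonlinear dependence."""
    d = len(samples[0])
    cols = [[s[k] for s in samples] for k in range(d)]
    return [(i, j) for i in range(d) for j in range(i + 1, d)
            if mutual_information(cols[i], cols[j]) > threshold]

# Toy data: variable 1 depends on variable 0 nonlinearly (near-zero linear
# correlation, since x0 is symmetric about 0); variable 2 is independent.
random.seed(0)
samples = [[x0, x0 * x0 + 0.05 * random.gauss(0, 1), random.uniform(-1, 1)]
           for x0 in (random.uniform(-1, 1) for _ in range(2000))]
edges = dependency_structure(samples)
```

The toy data illustrates the paper's motivating point: correlation thresholding would miss the quadratic link between variables 0 and 1, while the MI estimate flags it.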


Original language: English
Title of host publication: Proceedings of the IEEE Congress on Evolutionary Computation
Publication status: E-pub ahead of print - 21 Nov 2016
Event: IEEE Congress on Evolutionary Computation 2016 - Vancouver, Canada
Duration: 25 Jul 2016 - 29 Jul 2016


Conference: IEEE Congress on Evolutionary Computation 2016