We perform rigorous runtime analyses of the univariate marginal distribution algorithm (UMDA) and the population-based incremental learning (PBIL) algorithm on LEADINGONES. For the UMDA, the best known expected runtime on this function is O(nλ log λ + n^2) for an offspring population size λ = Ω(log n) and a parent population size μ ≤ λ/(e(1+δ)) for any constant δ > 0 (Dang and Lehre, GECCO 2015). No lower bound on the expected runtime is known under the same parameter settings, and it also remains open whether the algorithm can still optimise the LEADINGONES function in polynomial time when μ ≥ λ/(e(1+δ)). For the PBIL, an expected runtime of O(n^{2+c}) holds for some constant c ∈ (0,1) (Wu, Kolonko and Möhring, IEEE TEVC 2017). Although the PBIL is a generalisation of the UMDA, this upper bound is asymptotically much looser than the O(n^2) bound for the UMDA with λ = Ω(log n) ∩ O(n/log n); furthermore, the required population size is very large, namely λ = Ω(n^{1+c}). Our contributions are threefold: (1) we show that the UMDA with μ = Ω(log n) and λ ≤ μe^{1-ε}/(1+δ), for any constants ε ∈ (0,1) and 0 < δ ≤ e^{1-ε} - 1, requires an expected runtime of e^{Ω(μ)} on LEADINGONES; (2) we prove an upper bound of O(nλ log λ + n^2) for the PBIL, improving the previous bound of O(n^{2+c}) by a significant factor of Θ(n^c); and (3) we analyse, for the first time, the two algorithms on the LEADINGONES function in a noisy environment and obtain an expected runtime of O(n^2) under appropriate parameter settings. Our results emphasise that, despite the independence assumption in their probabilistic models, the UMDA and the PBIL with fine-tuned parameter choices can still cope very well with variable interactions.
- estimation of distribution algorithms
- running time analysis
- level-based analysis
- noisy optimisation
- theory of randomised search heuristics
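To make the setting concrete, the following is a minimal Python sketch of the UMDA on LEADINGONES under the parametrisation discussed above (offspring population size λ, parent population size μ, and marginal borders at 1/n and 1 − 1/n). The function and parameter names are illustrative, not taken from the paper's pseudocode.

```python
import random

def leading_ones(x):
    """LEADINGONES(x): number of consecutive 1-bits from the left."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def umda(n, lam, mu, max_evals=100_000, seed=0):
    """Sketch of the UMDA on LEADINGONES.

    Each generation samples lam offspring from a product distribution
    (one independent marginal per bit -- the independence assumption),
    selects the mu fittest, and sets each marginal to the frequency of
    1-bits among the selected individuals, clamped to [1/n, 1 - 1/n].
    Returns the number of evaluations used to sample the optimum,
    or None if the budget is exhausted first.
    """
    rng = rng_instance = random.Random(seed)
    p = [0.5] * n  # marginal probabilities, one per bit position
    evals = 0
    while evals < max_evals:
        # Sample lam offspring from the current product distribution.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        evals += lam
        pop.sort(key=leading_ones, reverse=True)
        if leading_ones(pop[0]) == n:
            return evals  # optimum (all-ones string) sampled
        # Truncation selection: keep the mu fittest individuals.
        selected = pop[:mu]
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            # Borders 1/n and 1 - 1/n prevent marginals from fixating.
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)
    return None
```

With, say, n = 10, λ = 40, and μ = 8 (so λ/μ = 5 > e, within the high-selection-pressure regime of the upper-bound result), the sketch typically samples the optimum well within the budget; this is only meant to illustrate the algorithm's structure, not to reproduce the analysed runtimes.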