## Abstract

Recent works showed that simple success-based rules for self-adjusting parameters in evolutionary algorithms (EAs) can match or outperform the best fixed parameters on discrete problems. Non-elitism in a (1, λ) EA combined with a self-adjusting offspring population size λ outperforms common EAs on the multimodal Cliff problem. However, it was shown that this only holds if the success rate s that governs self-adjustment is small enough. Otherwise, even on OneMax, the self-adjusting (1, λ) EA stagnates on an easy slope, where frequent successes drive down the offspring population size.

We show that self-adjustment works as intended in the absence of easy slopes. We define everywhere hard functions, for which successes are never easy to find, and show that the self-adjusting (1, λ) EA is robust with respect to the choice of the success rate s. We give a general fitness-level upper bound on the number of evaluations and show that the expected number of generations is at most [EQUATION] where d is the number of non-optimal fitness values and [EQUATION] is the smallest probability of finding an improvement from a non-optimal search point. We discuss implications for the everywhere hard function LeadingOnes and a new class OneMaxBlocks of everywhere hard functions with tunable difficulty.
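The success-based rule described above can be sketched as follows: after each generation, shrink the offspring population size λ on a success (a strict fitness improvement) and grow it on a failure, while comma selection always replaces the parent with the best offspring. This is a minimal illustration, not the paper's exact algorithm; the update factors `F` and the success-rate parameter `s` follow the common one-(s+1)-th success-rule convention and are assumptions here.

```python
import random

def self_adjusting_one_comma_lambda_ea(fitness, n, s=0.5, F=1.5,
                                       max_evals=500_000, seed=0):
    """Sketch of a self-adjusting (1, lambda) EA with a success-based rule.

    On a success (best offspring strictly fitter than the parent),
    lambda is divided by F; on a failure it is multiplied by F**(1/s).
    A small success rate s makes lambda grow quickly after failures.
    """
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    f_parent = fitness(parent)
    lam, evals = 1.0, 0
    while f_parent < n and evals < max_evals:
        # Create round(lambda) offspring by standard bit mutation (rate 1/n).
        offspring = []
        for _ in range(max(1, round(lam))):
            child = [bit ^ (rng.random() < 1.0 / n) for bit in parent]
            offspring.append((fitness(child), child))
            evals += 1
        f_best, best = max(offspring, key=lambda t: t[0])
        if f_best > f_parent:              # success: decrease lambda
            lam = max(1.0, lam / F)
        else:                              # failure: increase lambda
            lam = lam * F ** (1.0 / s)
        parent, f_parent = best, f_best    # comma selection: always accept

    return parent, evals

# OneMax (number of one-bits) as the fitness function:
best, evals = self_adjusting_one_comma_lambda_ea(sum, 30)
```

On OneMax the easy slope means frequent successes keep λ small; with a sufficiently small success rate s the algorithm still reaches the optimum, which is the regime the paper's robustness result generalises to everywhere hard functions.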

Original language | English
---|---
Title of host publication | GECCO '22
Subtitle of host publication | Proceedings of the Genetic and Evolutionary Computation Conference
Editors | Jonathan E. Fieldsend
Place of Publication | New York
Publisher | Association for Computing Machinery
Pages | 796–804
Number of pages | 9
ISBN (Electronic) | 9781450392372
DOIs |
Publication status | Published - 8 Jul 2022
Event | GECCO '22: Genetic and Evolutionary Computation Conference, Boston, United States. Duration: 9 Jul 2022 → 13 Jul 2022

### Publication series

Name | GECCO: Genetic and Evolutionary Computation Conference
---|---

### Conference

Conference | GECCO '22: Genetic and Evolutionary Computation Conference
---|---
Abbreviated title | GECCO 2022
Country/Territory | United States
City | Boston
Period | 9/07/22 → 13/07/22