Abstract
Measuring the performance of an algorithm for solving multi-objective optimization problems has always been challenging, simply due to two conflicting goals, i.e., convergence and diversity of the obtained trade-off solutions. A number of metrics exist for evaluating the performance of a multi-objective optimizer that approximates the whole Pareto-optimal front. However, for evaluating the quality of a preferred subset of the whole front, the existing metrics are inadequate. In this paper, we suggest a systematic way to adapt the existing metrics to quantitatively evaluate the performance of a preference-based evolutionary multi-objective optimization algorithm that uses reference points. The basic idea is to pre-process the preferred solution set according to a multi-criterion decision-making approach before applying a regular metric for performance assessment. Extensive experiments on several artificial scenarios and benchmark problems demonstrate its effectiveness in evaluating the quality of different preferred solution sets with regard to various reference points supplied by a decision maker.
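The pre-processing step described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's exact procedure: the function name `r_metric_preprocess`, the box-shaped region of interest of half-width `delta`, and the ASF weighting by the reference-to-worst range are all assumptions made for the sketch. The idea it illustrates is: pick a pivot solution closest to the decision maker's reference point under an achievement scalarizing function (ASF), trim solutions outside a region of interest around that pivot, and translate the surviving set onto the line through the reference point before handing it to a regular metric such as IGD or hypervolume.

```python
import numpy as np

def r_metric_preprocess(points, ref_point, worst_point, delta=0.3):
    """Illustrative sketch of R-metric-style pre-processing (assumed form).

    1. Pivot: the solution minimizing an ASF w.r.t. the reference point.
    2. Trim: keep solutions within a box of half-width `delta` around
       the pivot (a simple stand-in for the region of interest).
    3. Transfer: shift the kept set so the pivot lies on the line from
       the reference point towards the worst point.
    """
    points = np.asarray(points, dtype=float)
    ref_point = np.asarray(ref_point, dtype=float)
    worst_point = np.asarray(worst_point, dtype=float)

    # ASF with weights from the reference-to-worst range (illustrative choice)
    w = np.abs(worst_point - ref_point)
    w = np.where(w == 0, 1e-12, w)  # guard against division by zero
    asf = np.max((points - ref_point) / w, axis=1)
    pivot = points[np.argmin(asf)]

    # Region of interest: axis-aligned box around the pivot
    keep = np.all(np.abs(points - pivot) <= delta, axis=1)
    trimmed = points[keep]

    # Project the pivot onto the reference line and shift the whole set
    direction = worst_point - ref_point
    direction = direction / np.linalg.norm(direction)
    t = np.dot(pivot - ref_point, direction)
    target = ref_point + t * direction
    return trimmed + (target - pivot)

# Usage: a 2-objective preferred set, then feed the result to IGD/HV as usual
preferred = [[0.1, 0.9], [0.3, 0.7], [0.5, 0.5], [0.9, 0.1]]
processed = r_metric_preprocess(preferred, ref_point=[0.3, 0.3],
                                worst_point=[1.1, 1.1], delta=0.3)
```

After this step, any standard metric can be applied to `processed` against the sampled Pareto-optimal front restricted to the same region, which is what lets an off-the-shelf indicator judge a preference-based optimizer.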
Original language | English |
---|---|
Pages (from-to) | 821 - 835 |
Number of pages | 23 |
Journal | IEEE Transactions on Evolutionary Computation |
Volume | 22 |
Issue number | 6 |
Early online date | 26 Sept 2017 |
DOIs | |
Publication status | Published - Dec 2018 |
Keywords
- user preference
- performance assessment
- reference point
- multi-criterion decision making
- evolutionary multi-objective optimization
Fingerprint
Dive into the research topics of 'R-Metric: Evaluating the Performance of Preference-Based Evolutionary Multi-Objective Optimization Using Reference Points'.
Projects
- 1 Finished
Evolutionary Computation for Dynamic Optimisation in Network Environments
Engineering & Physical Science Research Council
25/02/13 → 17/08/17
Project: Research Councils