Multiobjective Neural Network Ensembles based on Regularized Negative Correlation Learning

Huanhuan Chen, Xin Yao

Research output: Contribution to journal › Article

90 Citations (Scopus)


Negative Correlation Learning (NCL) [1], [2] is a neural network ensemble learning algorithm that introduces a correlation penalty term into the cost function of each individual network, so that each network minimizes its mean-square error (MSE) together with its correlation with the rest of the ensemble. This paper describes NCL in detail and observes that NCL corresponds to training the entire ensemble as a single learning machine that minimizes the MSE without any regularization. This insight explains why NCL is prone to overfitting noise in the training set. The paper analyzes this problem and proposes the multiobjective regularized negative correlation learning (MRNCL) algorithm, which incorporates an additional regularization term for the ensemble and uses an evolutionary multiobjective algorithm to design ensembles. In MRNCL, we define the crossover and mutation operators and adopt a nondominated sorting algorithm with fitness sharing and rank-based fitness assignment. Experiments on synthetic and real-world data sets demonstrate that MRNCL achieves better performance than NCL, especially when the noise level in the data set is nontrivial. In the experimental discussion, we give three reasons why our algorithm outperforms others.
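The correlation penalty at the heart of NCL can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code: the function names and the array layout (M networks by N samples) are assumptions. It uses the standard NCL penalty p_i = (f_i - F) Σ_{j≠i} (f_j - F), where F is the ensemble (mean) output, which simplifies to -(f_i - F)² because the deviations from the mean sum to zero.

```python
import numpy as np

def ncl_penalties(preds):
    """Correlation penalty p_i = (f_i - F) * sum_{j != i} (f_j - F)
    for each network, where F is the ensemble mean output.
    preds has shape (M networks, N samples)."""
    F = preds.mean(axis=0)
    dev = preds - F                      # f_i - F for each network
    # deviations sum to zero across networks, so
    # sum_{j != i} (f_j - F) = -(f_i - F), giving p_i = -(f_i - F)^2
    return dev * (dev.sum(axis=0) - dev)

def ncl_cost(preds, y, lam):
    """Per-network NCL cost: the network's own MSE plus a
    lambda-weighted correlation penalty (lam is the trade-off weight)."""
    mse = ((preds - y) ** 2).mean(axis=1)
    pen = ncl_penalties(preds).mean(axis=1)
    return mse + lam * pen
```

Because the penalty is the negative squared deviation from the ensemble output, minimizing it pushes each network away from the ensemble mean, encouraging diversity; with no regularization term on the ensemble itself, the combined cost reduces to plain ensemble MSE, which is the overfitting issue MRNCL addresses.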
Original language: English
Pages (from-to): 1738-1751
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Issue number: 12
Publication status: Published - 1 Dec 2010


  • multiobjective algorithm
  • neural network ensembles
  • neural networks
  • negative correlation learning
  • regularization
  • multiobjective learning


