Bagging and Boosting Negatively Correlated Neural Networks

M Monirul Islam, Xin Yao, SM Shahriar Nirjon, M Asiful Islam, K Murase

Research output: Contribution to journal › Article

69 Citations (Scopus)


In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train different individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting algorithms are used in NegBagg and NegBoost, respectively, to create different training sets for different NNs in the ensemble. The idea behind using negative correlation learning in conjunction with the bagging/boosting algorithm is to facilitate interaction and cooperation among NNs during their training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for NNs. NegBoost also uses the constructive approach to automatically determine the number of NNs for the ensemble. The two algorithms have been tested on a number of benchmark problems in machine learning and NNs, including Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.
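The cooperation the abstract describes can be sketched in code: each network is trained on its own bootstrap sample (the bagging step), while the negative correlation penalty couples its error to the ensemble mean so the networks learn jointly rather than independently. The sketch below is a minimal, assumed illustration only: plain linear models stand in for the paper's constructively grown feed-forward NNs, and the penalty strength `lam`, learning rate, and epoch count are illustrative choices, not values from the paper.

```python
import numpy as np

def negbagg_sketch(X, y, n_nets=5, lam=0.5, lr=0.05, epochs=500, seed=0):
    """Minimal NegBagg-style sketch (illustrative, not the authors' code):
    linear 'networks' trained jointly with negative correlation learning,
    each on its own bootstrap sample of the training data."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # One fixed bootstrap index set per network -- the bagging step.
    boots = [rng.integers(0, n, n) for _ in range(n_nets)]
    W = rng.normal(scale=0.1, size=(n_nets, d))
    b = np.zeros(n_nets)
    for _ in range(epochs):
        # Outputs of all networks on the full data, needed for the ensemble mean.
        F = X @ W.T + b                # shape (n, n_nets)
        Fbar = F.mean(axis=1)          # ensemble average output per example
        for i in range(n_nets):
            idx = boots[i]
            # Negative-correlation-learning gradient w.r.t. network i's output:
            # (F_i - y) - lam * (F_i - Fbar). The second term rewards outputs
            # that are negatively correlated with the rest of the ensemble.
            err = (F[idx, i] - y[idx]) - lam * (F[idx, i] - Fbar[idx])
            W[i] -= lr * (err @ X[idx]) / len(idx)
            b[i] -= lr * err.mean()
    return W, b

def predict(W, b, X):
    # Simple-average combination of the ensemble members.
    return (X @ W.T + b).mean(axis=1)
```

Because the penalty vanishes when all members agree, `lam` (typically in [0, 1]) trades off individual accuracy against ensemble diversity; `lam=0` reduces the sketch to ordinary bagging of independently trained models.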
Original language: English
Pages (from-to): 771-784
Number of pages: 14
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
Issue number: 3
Publication status: Published - 1 Jun 2008


Keywords:
  • constructive approach
  • diversity
  • bagging
  • generalization
  • boosting
  • neural network (NN)
  • negative correlation learning
  • ensemble design


