Bagging and Boosting Negatively Correlated Neural Networks

Research output: Contribution to journal › Article

Standard

Bagging and Boosting Negatively Correlated Neural Networks. / Monirul Islam, M; Yao, Xin; Shahriar Nirjon, SM; Asiful Islam, M; Murase, K.

In: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Vol. 38, No. 3, 01.06.2008, pp. 771-784.
Author

Monirul Islam, M; Yao, Xin; Shahriar Nirjon, SM; Asiful Islam, M; Murase, K. / Bagging and Boosting Negatively Correlated Neural Networks. In: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics). 2008; Vol. 38, No. 3. pp. 771-784.

Bibtex

@article{9c659928b037480a9e9d23c22b0a374d,
title = "Bagging and Boosting Negatively Correlated Neural Networks",
abstract = "In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train different individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting algorithms are used in NegBagg and NegBoost, respectively, to create different training sets for different NNs in the ensemble. The idea behind using negative correlation learning in conjunction with the bagging/boosting algorithm is to facilitate interaction and cooperation among NNs during their training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for NNs. NegBoost also uses the constructive approach to automatically determine the number of NNs for the ensemble. The two algorithms have been tested on a number of benchmark problems in machine learning and NNs, including Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.",
keywords = "constructive approach, diversity, bagging, generalization, boosting, neural network (NN), negative correlation learning, ensemble design",
author = "{Monirul Islam}, M and Xin Yao and {Shahriar Nirjon}, SM and {Asiful Islam}, M and K Murase",
year = "2008",
month = jun,
day = "1",
doi = "10.1109/TSMCB.2008.922055",
language = "English",
volume = "38",
pages = "771--784",
journal = "IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)",
issn = "1083-4419",
publisher = "Institute of Electrical and Electronics Engineers (IEEE)",
number = "3",
}

RIS

TY  - JOUR
T1  - Bagging and Boosting Negatively Correlated Neural Networks
AU  - Monirul Islam, M
AU  - Yao, Xin
AU  - Shahriar Nirjon, SM
AU  - Asiful Islam, M
AU  - Murase, K
PY  - 2008/6/1
Y1  - 2008/6/1
N2  - In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train different individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting algorithms are used in NegBagg and NegBoost, respectively, to create different training sets for different NNs in the ensemble. The idea behind using negative correlation learning in conjunction with the bagging/boosting algorithm is to facilitate interaction and cooperation among NNs during their training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for NNs. NegBoost also uses the constructive approach to automatically determine the number of NNs for the ensemble. The two algorithms have been tested on a number of benchmark problems in machine learning and NNs, including Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.
AB  - In this paper, we propose two cooperative ensemble learning algorithms, i.e., NegBagg and NegBoost, for designing neural network (NN) ensembles. The proposed algorithms incrementally train different individual NNs in an ensemble using the negative correlation learning algorithm. Bagging and boosting algorithms are used in NegBagg and NegBoost, respectively, to create different training sets for different NNs in the ensemble. The idea behind using negative correlation learning in conjunction with the bagging/boosting algorithm is to facilitate interaction and cooperation among NNs during their training. Both NegBagg and NegBoost use a constructive approach to automatically determine the number of hidden neurons for NNs. NegBoost also uses the constructive approach to automatically determine the number of NNs for the ensemble. The two algorithms have been tested on a number of benchmark problems in machine learning and NNs, including Australian credit card assessment, breast cancer, diabetes, glass, heart disease, letter recognition, satellite, soybean, and waveform problems. The experimental results show that NegBagg and NegBoost require a small number of training epochs to produce compact NN ensembles with good generalization.
KW  - constructive approach
KW  - diversity
KW  - bagging
KW  - generalization
KW  - boosting
KW  - neural network (NN)
KW  - negative correlation learning
KW  - ensemble design
U2  - 10.1109/TSMCB.2008.922055
DO  - 10.1109/TSMCB.2008.922055
M3  - Article
C2  - 18558541
VL  - 38
SP  - 771
EP  - 784
JO  - IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
JF  - IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
SN  - 1083-4419
IS  - 3
ER  -