Parallel design of sparse deep belief network with multi-objective optimization

Research output: Contribution to journal › Article › peer-review

Standard

Parallel design of sparse deep belief network with multi-objective optimization. / Li, Yangyang; Fang, Shuangkang; Bai, Xiaoyu; Jiao, Licheng; Marturi, Naresh.

In: Information Sciences, Vol. 533, 09.2020, p. 24-42.


Harvard

Li, Y, Fang, S, Bai, X, Jiao, L & Marturi, N 2020, 'Parallel design of sparse deep belief network with multi-objective optimization', Information Sciences, vol. 533, pp. 24-42. https://doi.org/10.1016/j.ins.2020.03.084

APA

Li, Y., Fang, S., Bai, X., Jiao, L., & Marturi, N. (2020). Parallel design of sparse deep belief network with multi-objective optimization. Information Sciences, 533, 24-42. https://doi.org/10.1016/j.ins.2020.03.084

Vancouver

Li Y, Fang S, Bai X, Jiao L, Marturi N. Parallel design of sparse deep belief network with multi-objective optimization. Information Sciences. 2020 Sep;533:24-42. doi: 10.1016/j.ins.2020.03.084

Author

Li, Yangyang ; Fang, Shuangkang ; Bai, Xiaoyu ; Jiao, Licheng ; Marturi, Naresh. / Parallel design of sparse deep belief network with multi-objective optimization. In: Information Sciences. 2020 ; Vol. 533. pp. 24-42.

Bibtex

@article{ad07b9f647584d0ab5f38ed0dce38829,
title = "Parallel design of sparse deep belief network with multi-objective optimization",
abstract = "Deep belief network (DBN) is an import deep learning model and restricted Boltzmann machine (RBM) is one of its basic models. The traditional DBN and RBM have numerous redundant features. Hence an improved strategy is required to perform sparse operations on them. Previously, we have proposed our own sparse DBN (SDBN): using a multi-objective optimization (MOP) algorithm to learn sparse features, which solves the contradiction between the reconstruction error and network sparsity of RBM. Due to the optimization algorithm and millions of parameters of the network itself, the training process is difficult. Therefore, in this paper, we propose an efficient parallel strategy to speed up the training of SDBN networks. Self-adaptive Quantum Multi-objectives Evolutionary algorithm based on Decomposition (SA-QMOEA/D) that we have proposed as the multi-objective optimization algorithm has the hidden parallelism of populations. Based on this, we not only parallelize the DBN network but also realize the parallelism of the multi-objective optimization algorithm. In order to further verify the advantages of our approach, we apply it to the problem of facial expression recognition (FER). The obtained experimental results demonstrate that our parallel algorithm achieves a significant speedup performance and a higher accuracy rate over previous CPU implementations and other conventional methods.",
keywords = "Restricted Boltzmann machine, Deep belief network, Multi-objective optimization, Parallel acceleration, Facial expression recognition, GPU",
author = "Yangyang Li and Shuangkang Fang and Xiaoyu Bai and Licheng Jiao and Naresh Marturi",
year = "2020",
month = sep,
doi = "10.1016/j.ins.2020.03.084",
language = "English",
volume = "533",
pages = "24--42",
journal = "Information Sciences",
issn = "0020-0255",
publisher = "Elsevier",

}

RIS

TY - JOUR

T1 - Parallel design of sparse deep belief network with multi-objective optimization

AU - Li, Yangyang

AU - Fang, Shuangkang

AU - Bai, Xiaoyu

AU - Jiao, Licheng

AU - Marturi, Naresh

PY - 2020/9

Y1 - 2020/9

N2 - Deep belief network (DBN) is an important deep learning model, and the restricted Boltzmann machine (RBM) is one of its basic building blocks. Traditional DBNs and RBMs learn numerous redundant features, so an improved strategy is required to perform sparse operations on them. Previously, we proposed our own sparse DBN (SDBN), which uses a multi-objective optimization (MOP) algorithm to learn sparse features and resolves the conflict between the reconstruction error and the network sparsity of the RBM. Because of the optimization algorithm and the millions of parameters in the network itself, the training process is computationally demanding. Therefore, in this paper, we propose an efficient parallel strategy to speed up the training of SDBN networks. The Self-adaptive Quantum Multi-objectives Evolutionary algorithm based on Decomposition (SA-QMOEA/D) that we proposed as the multi-objective optimizer exhibits inherent population-level parallelism. Building on this, we parallelize not only the DBN network but also the multi-objective optimization algorithm. To further verify the advantages of our approach, we apply it to facial expression recognition (FER). The experimental results demonstrate that our parallel algorithm achieves a significant speedup and a higher accuracy rate over previous CPU implementations and other conventional methods.

AB - Deep belief network (DBN) is an important deep learning model, and the restricted Boltzmann machine (RBM) is one of its basic building blocks. Traditional DBNs and RBMs learn numerous redundant features, so an improved strategy is required to perform sparse operations on them. Previously, we proposed our own sparse DBN (SDBN), which uses a multi-objective optimization (MOP) algorithm to learn sparse features and resolves the conflict between the reconstruction error and the network sparsity of the RBM. Because of the optimization algorithm and the millions of parameters in the network itself, the training process is computationally demanding. Therefore, in this paper, we propose an efficient parallel strategy to speed up the training of SDBN networks. The Self-adaptive Quantum Multi-objectives Evolutionary algorithm based on Decomposition (SA-QMOEA/D) that we proposed as the multi-objective optimizer exhibits inherent population-level parallelism. Building on this, we parallelize not only the DBN network but also the multi-objective optimization algorithm. To further verify the advantages of our approach, we apply it to facial expression recognition (FER). The experimental results demonstrate that our parallel algorithm achieves a significant speedup and a higher accuracy rate over previous CPU implementations and other conventional methods.

KW - Restricted Boltzmann machine

KW - Deep belief network

KW - Multi-objective optimization

KW - Parallel acceleration

KW - Facial expression recognition

KW - GPU

UR - http://www.scopus.com/inward/record.url?scp=85084958544&partnerID=8YFLogxK

U2 - 10.1016/j.ins.2020.03.084

DO - 10.1016/j.ins.2020.03.084

M3 - Article

VL - 533

SP - 24

EP - 42

JO - Information Sciences

JF - Information Sciences

SN - 0020-0255

ER -
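
The abstract frames sparse RBM training as a trade-off between reconstruction error and hidden-layer sparsity. The sketch below is only an illustration of that bi-objective evaluation under assumed definitions; it is not the authors' SA-QMOEA/D or their GPU-parallel SDBN code, and the function name rbm_objectives and the target_sparsity value are hypothetical.

# Illustrative sketch only: shows the two competing objectives the abstract
# describes for an RBM (reconstruction error vs. hidden-layer sparsity).
# Not the paper's implementation; names and the sparsity target are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_objectives(v, W, b_vis, b_hid, target_sparsity=0.05):
    """Return (reconstruction_error, sparsity_penalty) for one data batch.

    v              : (batch, n_visible) input batch
    W              : (n_visible, n_hidden) weight matrix
    b_vis, b_hid   : visible / hidden biases
    target_sparsity: assumed desired mean hidden activation (hypothetical value)
    """
    # One deterministic up-down pass (mean-field reconstruction).
    h = sigmoid(v @ W + b_hid)            # hidden activation probabilities
    v_recon = sigmoid(h @ W.T + b_vis)    # reconstructed visible units

    reconstruction_error = np.mean((v - v_recon) ** 2)
    # Penalize mean hidden activity that drifts from the sparsity target.
    sparsity_penalty = np.mean((h.mean(axis=0) - target_sparsity) ** 2)
    return reconstruction_error, sparsity_penalty

# Toy usage: evaluate both objectives for a small random RBM.
n_visible, n_hidden, batch = 64, 32, 16
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_vis = np.zeros(n_visible)
b_hid = np.zeros(n_hidden)
v = (rng.random((batch, n_visible)) < 0.3).astype(float)

err, sparse = rbm_objectives(v, W, b_vis, b_hid)
print(f"reconstruction error = {err:.4f}, sparsity penalty = {sparse:.4f}")
# A multi-objective optimizer would search for weights trading off these two terms.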