Dealing with Multiple Classes in Online Class Imbalance Learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

Dealing with Multiple Classes in Online Class Imbalance Learning. / WANG, S.; MINKU, L.L.; YAO, X. Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI). New York City : AAAI Press, 2016. p. 2118-2124.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Harvard

WANG, S, MINKU, LL & YAO, X 2016, Dealing with Multiple Classes in Online Class Imbalance Learning. in Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI). AAAI Press, New York City, pp. 2118-2124. <https://www.ijcai.org/Abstract/16/302>

APA

WANG, S., MINKU, L. L., & YAO, X. (2016). Dealing with Multiple Classes in Online Class Imbalance Learning. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI) (pp. 2118-2124). AAAI Press. https://www.ijcai.org/Abstract/16/302

Vancouver

WANG S, MINKU LL, YAO X. Dealing with Multiple Classes in Online Class Imbalance Learning. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI). New York City: AAAI Press. 2016. p. 2118-2124

Author

WANG, S. ; MINKU, L.L. ; YAO, X. / Dealing with Multiple Classes in Online Class Imbalance Learning. Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI). New York City : AAAI Press, 2016. pp. 2118-2124

Bibtex

@inproceedings{9194e9c34c044b2986f98e78af0c7c81,
title = "Dealing with Multiple Classes in Online Class Imbalance Learning",
abstract = "Online class imbalance learning deals with data streams having very skewed class distributions in a timely fashion. Although a few methods have been proposed to handle such problems, most of them focus on two-class cases. Multi-class imbalance imposes additional challenges in learning. This paper studies the combined challenges posed by multi-class imbalance and online learning, and aims at a more effective and adaptive solution. First, we introduce two resampling-based ensemble methods, called MOOB and MUOB, which can process multi-class data directly and strictly online with an adaptive sampling rate. Then, we look into the impact of multi-minority and multi-majority cases on MOOB and MUOB in comparison to other methods under stationary and dynamic scenarios. Both multi-minority and multi-majority make a negative impact. MOOB shows the best and most stable G-mean in most stationary and dynamic cases.",
author = "S. WANG and L.L. MINKU and X. YAO",
year = "2016",
month = jul,
day = "15",
language = "English",
pages = "2118--2124",
booktitle = "Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI)",
publisher = "AAAI Press",

}

RIS

TY - GEN
T1 - Dealing with Multiple Classes in Online Class Imbalance Learning
AU - WANG, S.
AU - MINKU, L.L.
AU - YAO, X.
PY - 2016/7/15
Y1 - 2016/7/15
N2 - Online class imbalance learning deals with data streams having very skewed class distributions in a timely fashion. Although a few methods have been proposed to handle such problems, most of them focus on two-class cases. Multi-class imbalance imposes additional challenges in learning. This paper studies the combined challenges posed by multi-class imbalance and online learning, and aims at a more effective and adaptive solution. First, we introduce two resampling-based ensemble methods, called MOOB and MUOB, which can process multi-class data directly and strictly online with an adaptive sampling rate. Then, we look into the impact of multi-minority and multi-majority cases on MOOB and MUOB in comparison to other methods under stationary and dynamic scenarios. Both multi-minority and multi-majority make a negative impact. MOOB shows the best and most stable G-mean in most stationary and dynamic cases.
AB - Online class imbalance learning deals with data streams having very skewed class distributions in a timely fashion. Although a few methods have been proposed to handle such problems, most of them focus on two-class cases. Multi-class imbalance imposes additional challenges in learning. This paper studies the combined challenges posed by multi-class imbalance and online learning, and aims at a more effective and adaptive solution. First, we introduce two resampling-based ensemble methods, called MOOB and MUOB, which can process multi-class data directly and strictly online with an adaptive sampling rate. Then, we look into the impact of multi-minority and multi-majority cases on MOOB and MUOB in comparison to other methods under stationary and dynamic scenarios. Both multi-minority and multi-majority make a negative impact. MOOB shows the best and most stable G-mean in most stationary and dynamic cases.
M3 - Conference contribution
SP - 2118
EP - 2124
BT - Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI)
PB - AAAI Press
CY - New York City
UR - https://www.ijcai.org/Abstract/16/302
ER -
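
Illustrative sketch

The abstract describes MOOB and MUOB only at a high level: resampling-based online-bagging ensembles that process multi-class streams with an adaptive sampling rate. The Python sketch below is a rough, non-authoritative illustration of that idea, not the authors' implementation; the class and helper names, the decay constant, and the exact sampling-rate formula are assumptions made purely for illustration.

import numpy as np
from sklearn.naive_bayes import GaussianNB

class AdaptiveOversamplingOnlineBagging:
    """Online bagging with a per-class adaptive oversampling rate (illustrative only)."""

    def __init__(self, classes, n_estimators=10, decay=0.9, seed=0):
        self.classes = np.array(classes)
        self.decay = decay                                   # time decay for class-size estimates (assumed value)
        self.class_size = {c: 1.0 / len(classes) for c in classes}
        self.models = [GaussianNB() for _ in range(n_estimators)]
        self.rng = np.random.default_rng(seed)

    def learn_one(self, x, y):
        x = np.asarray(x, dtype=float).reshape(1, -1)
        # Track time-decayed class proportions so the sampling rate can follow drift.
        for c in self.classes:
            self.class_size[c] = self.decay * self.class_size[c] + (1.0 - self.decay) * float(c == y)
        # Assumed rule: oversample class y in proportion to how under-represented it
        # currently is relative to the largest class.
        lam = max(self.class_size.values()) / max(self.class_size[y], 1e-12)
        for model in self.models:
            k = self.rng.poisson(lam)                        # Online-Bagging-style Poisson resampling
            if k > 0:
                model.partial_fit(x, [y], classes=self.classes, sample_weight=[k])

    def predict_one(self, x):
        x = np.asarray(x, dtype=float).reshape(1, -1)
        votes = [m.predict(x)[0] for m in self.models if hasattr(m, "classes_")]
        return max(set(votes), key=votes.count) if votes else self.classes[0]

# Usage sketch on a stream of (x, y) pairs with classes {0, 1, 2}:
# model = AdaptiveOversamplingOnlineBagging(classes=[0, 1, 2])
# for x, y in stream:
#     y_hat = model.predict_one(x)
#     model.learn_one(x, y)

An undersampling variant in the spirit of MUOB would presumably use the smallest class proportion in the numerator instead, so that majority-class examples are presented to the base learners less often; consult the paper itself for the actual sampling rules and the G-mean evaluation.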