Abstract
We consider the problem of incrementally learning models from relational data. Most existing learning methods for statistical relational models use batch learning, which becomes computationally expensive and eventually infeasible for large datasets. Most previous work on relational incremental learning assumes that the model's structure is given and that only its parameters need to be learned. In this paper, we propose algorithms that incrementally learn the model's parameters and structure simultaneously. These algorithms build on the relational functional gradient boosting (RFGB) framework and extend classical propositional ensemble methods to relational learning over evolving data streams.
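The record contains no algorithmic detail beyond this abstract, so the following is only a rough, hypothetical illustration of the functional-gradient-boosting idea the abstract builds on: one small regression tree is fit per incoming mini-batch to the functional gradient (y - p) of a Bernoulli log-likelihood. It is purely propositional; scikit-learn's DecisionTreeRegressor stands in for the relational regression trees of RFGB, and the class name StreamingGradientBoosting and its partial_fit method are invented for this sketch.

```python
# Minimal propositional sketch of functional gradient boosting on a stream.
# NOTE: hypothetical illustration only; the paper's RFGB learns *relational*
# regression trees, which scikit-learn does not provide.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

class StreamingGradientBoosting:
    """Add one shallow regression tree per mini-batch, each fit to the
    functional gradient (y - p) of the Bernoulli log-likelihood."""

    def __init__(self, learning_rate=0.1, max_depth=2):
        self.learning_rate = learning_rate
        self.max_depth = max_depth
        self.trees = []

    def _raw_score(self, X):
        # Sum of scaled tree outputs; zero before any data has arrived.
        score = np.zeros(len(X))
        for tree in self.trees:
            score += self.learning_rate * tree.predict(X)
        return score

    def predict_proba(self, X):
        # Sigmoid of the additive score gives P(y = 1 | x).
        return 1.0 / (1.0 + np.exp(-self._raw_score(X)))

    def partial_fit(self, X, y):
        # Functional gradient of the log-likelihood at the current model.
        residual = y - self.predict_proba(X)
        tree = DecisionTreeRegressor(max_depth=self.max_depth)
        tree.fit(X, residual)
        self.trees.append(tree)

# Usage: feed mini-batches as they arrive from the stream.
rng = np.random.default_rng(0)
model = StreamingGradientBoosting()
for _ in range(20):  # 20 synthetic mini-batches
    X = rng.normal(size=(64, 3))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    model.partial_fit(X, y)
print(model.predict_proba(np.array([[1.0, 1.0, 0.0]]))[0])
```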
| Original language | English |
|---|---|
| Pages (from-to) | 22-26 |
| Number of pages | 5 |
| Journal | CEUR Workshop Proceedings |
| Volume | 2085 |
| Publication status | Published - 2018 |
| Event | Late Breaking Papers of the 27th International Conference on Inductive Logic Programming, LBP-ILP 2017, Orleans, France, 4 Sept 2017 → 6 Sept 2017 |
Bibliographical note
Publisher Copyright: © by the paper's authors.
Keywords
- Concept drift
- Ensemble methods
- Gradient-based boosting
- Hoeffding bound
- Incremental learning
- Relational regression tree
- Statistical relational learning
ASJC Scopus subject areas
- General Computer Science