Incremental learning with the minimum description length principle
Publication Type
Conference Paper
Date Issued
2017-05
Language
English
Start Page
1908
End Page
1915
Article Number
7966084
Citation
International Joint Conference on Neural Networks (IJCNN 2017)
Whereas a large number of machine learning methods focus on offline learning over a single batch of data, called the training data set, the increasing amount of automatically generated data leads to new issues that offline learning cannot cope with. Incremental learning designates the online learning of a model from streaming data. In non-stationary environments, the process generating these data may change over time, so that the learned concept becomes invalid. Adaptation to this non-stationary behavior, called concept drift, is an intensively studied topic and can be achieved algorithmically by two opposite approaches: active and passive ones. We propose a formal framework to deal with concept drift, both in active and passive ways. Our framework is derived from the Minimum Description Length principle and exploits algorithmic information theory to quantify the model adaptation. We show that this approach is consistent with state-of-the-art techniques and has a valid probabilistic counterpart. We propose two simple algorithms to use our framework in practice and test both of them on real and simulated data.
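To make the MDL decision rule described in the abstract concrete, the following is a minimal sketch, not the authors' algorithm: per window of a stream, the current model is kept unless refitting yields a shorter two-part description length L(model) + L(data | model). The Gaussian model, the window size, and the (k/2) log n parameter-cost term are illustrative assumptions.

```python
import math
import random


def code_length_gaussian(window, mu, sigma):
    """Data code length in nats: negative log-likelihood under N(mu, sigma^2)."""
    sigma = max(sigma, 1e-6)
    return sum(0.5 * math.log(2 * math.pi * sigma ** 2)
               + (x - mu) ** 2 / (2 * sigma ** 2) for x in window)


def fit_gaussian(window):
    """Maximum-likelihood Gaussian fit to one window of the stream."""
    mu = sum(window) / len(window)
    var = sum((x - mu) ** 2 for x in window) / len(window)
    return mu, math.sqrt(var)


def mdl_adapt(stream, window_size=50):
    """Yield (window_index, adapted) decisions over a stream of floats."""
    model, buf, w = None, [], 0
    for x in stream:
        buf.append(x)
        if len(buf) < window_size:
            continue
        candidate = fit_gaussian(buf)
        # Two-part MDL cost of switching models: (k/2) * log(n) nats
        # for encoding the k = 2 new parameters (illustrative choice).
        param_cost = (2 / 2) * math.log(len(buf))
        if model is None:
            model, adapted = candidate, True
        else:
            keep = code_length_gaussian(buf, *model)  # old model: no parameter cost
            switch = code_length_gaussian(buf, *candidate) + param_cost
            adapted = switch < keep
            if adapted:
                model = candidate
        yield w, adapted
        buf, w = [], w + 1


if __name__ == "__main__":
    random.seed(0)
    # Simulated concept drift: the mean of the stream jumps from 0 to 3.
    stream = ([random.gauss(0, 1) for _ in range(500)]
              + [random.gauss(3, 1) for _ in range(500)])
    for w, adapted in mdl_adapt(stream):
        if adapted:
            print(f"window {w}: shorter description length after refit -> adapt")
```

On a stationary stretch the refitted model does not compress the window enough to pay its parameter cost, so the model is kept; after the simulated drift the old model's code length inflates and adaptation wins, which is the intuition behind quantifying adaptation by description length.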