Authors: Murena, Pierre Alexandre; Cornuéjols, Antoine; Dessalles, Jean Louis
Date available: 2023-04-27
Date issued: 2017-05
Venue: International Joint Conference on Neural Networks (IJCNN 2017)
Handle: http://hdl.handle.net/11420/15257
Title: Incremental learning with the minimum description length principle
Type: Conference Paper
DOI: 10.1109/IJCNN.2017.7966084
Language: en

Abstract: Whereas many machine learning methods focus on offline learning over a single batch of data, the training set, the growing volume of automatically generated data raises new issues that offline learning cannot cope with. Incremental learning designates the online learning of a model from streaming data. In non-stationary environments, the process generating these data may change over time, so the learned concept can become invalid. Adaptation to this non-stationarity, called concept drift, is an intensively studied topic and can be handled algorithmically by two opposing strategies: active and passive approaches. We propose a formal framework to deal with concept drift, in both active and passive ways. Our framework is derived from the Minimum Description Length principle and exploits algorithmic information theory to quantify model adaptation. We show that this approach is consistent with state-of-the-art techniques and has a valid probabilistic counterpart. We propose two simple algorithms that apply our framework in practice and test both on real and simulated data.
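The abstract's core idea, adapting to concept drift by comparing description lengths, can be illustrated with a minimal sketch. This is not the paper's algorithm, only an assumed toy instance of the MDL principle it invokes: over a sliding window of the stream, we keep the current model unless re-encoding the window with a refit model, plus a fixed cost for describing the new model, yields a shorter total code. All names (`nll_gaussian`, `drift_detected`), the Gaussian model choice, and the `model_cost` value are illustrative assumptions.

```python
import math

def nll_gaussian(xs, mu, sigma=1.0):
    # Negative log-likelihood in nats = code length of xs under N(mu, sigma).
    return sum(
        0.5 * math.log(2 * math.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)
        for x in xs
    )

def drift_detected(window, model_mu, model_cost=5.0):
    """MDL-style drift test (toy sketch, not the paper's method).

    Declare drift if encoding the window with a refit mean, plus the
    cost of transmitting the new model, beats keeping the old mean.
    Returns (drift_flag, candidate_new_mean).
    """
    refit_mu = sum(window) / len(window)
    keep_cost = nll_gaussian(window, model_mu)
    switch_cost = nll_gaussian(window, refit_mu) + model_cost
    return switch_cost < keep_cost, refit_mu

# Usage: a stream whose mean jumps from 0 to 5 midway.
if __name__ == "__main__":
    stable = [0.0] * 10
    shifted = [5.0] * 10
    print(drift_detected(stable, model_mu=0.0)[0])   # no drift on stable data
    print(drift_detected(shifted, model_mu=0.0)[0])  # drift after the jump
```

In this toy setting the `model_cost` term plays the role of the MDL model-complexity penalty: it prevents the learner from switching models on every noisy window, which is the trade-off the paper's framework formalizes.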