TUHH Open Research

Minimum Description Length Principle applied to structure adaptation for classification under concept drift

Publication Type
Conference Paper
Date Issued
2016-07
Language
English
Author(s)
Murena, Pierre Alexandre  
Cornuéjols, Antoine
TORE-URI
http://hdl.handle.net/11420/15266
Start Page
2842
End Page
2849
Article Number
7727558
Citation
International Joint Conference on Neural Networks (IJCNN 2016)
Contribution to Conference
International Joint Conference on Neural Networks, IJCNN 2016  
Publisher DOI
10.1109/IJCNN.2016.7727558
Scopus ID
2-s2.0-85007236249
Abstract
Traditional supervised machine learning evaluates learned classifiers on data drawn from the same distribution as the data used for learning. In practice, this assumption does not always hold, and the learned classifier has to be transferred from the space of learning data (also called source data) to the space of test data (also called target data), where it is not directly applicable. To perform this transfer, several methods aim at extracting structural features common to the source and target. Our approach employs a neural model to encode the structure of the data: such a model is shown to compress the information in the sense of Kolmogorov's theory of information. To transfer from source to target, we adapt a result from analogical reasoning: the structures of the source and target models are learned by applying the Minimum Description Length Principle, which assumes that the chosen transformation has the shortest symbolic description on a universal Turing machine. This yields a minimization problem over the source and target models. To describe the transfer, we develop a multi-level description of the model transformation, which is used directly in the minimization of the description length. Our approach has been tested on toy examples whose difficulty can be controlled by a single one-dimensional parameter, and it is shown to work efficiently on a wide range of problems.
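The core idea behind the Minimum Description Length Principle invoked in the abstract is that, among candidate models, the preferred one minimizes the total code length: the bits needed to describe the model plus the bits needed to describe the data given the model. The sketch below is an illustrative two-part MDL computation for a Bernoulli model of a binary sequence; it is not the paper's neural encoding, and the `desc_length_bernoulli` helper is a name introduced here for illustration only.

```python
import math

def desc_length_bernoulli(bits):
    """Two-part MDL code length (in bits) for a binary sequence:
    a parameter cost (log2(n+1) to encode the count of ones) plus
    a data cost n * H(p), the Shannon entropy of the fitted
    Bernoulli rate p. Shorter total length = more structure found."""
    n = len(bits)
    k = sum(bits)
    p = k / n
    if p in (0.0, 1.0):
        data_cost = 0.0  # sequence fully determined by the parameter
    else:
        data_cost = n * (-p * math.log2(p) - (1 - p) * math.log2(1 - p))
    param_cost = math.log2(n + 1)  # encode which of n+1 counts occurred
    return param_cost + data_cost

structured = [1, 1, 1, 1, 1, 1, 1, 0]  # highly regular sequence
balanced = [1, 0, 0, 1, 1, 0, 1, 0]    # near-uniform sequence

# MDL prefers the model/data pair with the shorter total code:
assert desc_length_bernoulli(structured) < desc_length_bernoulli(balanced)
```

In the paper's setting the same principle is applied at the level of model transformations: among candidate source-to-target transformations, the one with the shortest description on a universal Turing machine is selected, which the abstract casts as a minimization over the source and target models.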