TUHH Open Research
Neural network surgery: combining training with topology optimization

Citation Link: https://doi.org/10.15480/882.3800
Publication Type
Journal Article
Date Issued
2021-09-07
Language
English
Author(s)
Schiessler, Elisabeth J.  
Aydin, Roland C.  
Linka, Kevin  
Cyron, Christian J.  
Institute
Kontinuums- und Werkstoffmechanik M-15  
TORE-DOI
10.15480/882.3800
TORE-URI
http://hdl.handle.net/11420/10419
Journal
Neural Networks
Volume
144
Start Page
384
End Page
393
Citation
Neural Networks 144: 384-393 (2021-12)
Publisher DOI
10.1016/j.neunet.2021.08.034
Scopus ID
2-s2.0-85115303158
Publisher
Elsevier
Abstract
With ever-increasing computational capacities, neural networks become more and more proficient at solving complex tasks. However, picking a sufficiently good network topology usually relies on expert human knowledge. Neural architecture search aims to reduce the extent of expertise that is needed. Modern architecture search techniques often rely on immense computational power or apply trained meta-controllers for decision making. We develop a framework for a genetic algorithm that is computationally cheap and makes decisions based on mathematical criteria rather than trained parameters. It is a hybrid approach that fuses training and topology optimization into a single process. The structural modifications performed include adding or removing layers of neurons, with some re-training applied to compensate for any incurred change in input–output behaviour. Our approach is tested on several benchmark datasets with limited computational overhead compared to training only the baseline. The algorithm can significantly increase accuracy compared to a fully trained baseline, rescue insufficient topologies that in their current state can only learn to a limited extent, and dynamically reduce network size without loss of accuracy. On standard ML datasets, accuracy improvements over baseline performance range from 20% for well-performing starting topologies to more than 40% for insufficient baselines; alternatively, network size can be reduced by almost 15%.
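The abstract describes the method only in prose, so here is a minimal sketch of the loop it outlines: brief training, a structural "surgery" step that adds or removes a hidden layer, re-training to compensate, and fitness-based selection. This is an illustrative assumption, not the authors' implementation: PyTorch, the svd_score redundancy measure, the candidate layer widths, and all hyperparameters are stand-ins, and the child network is re-initialized here rather than inheriting the parent's weights as the paper's re-training step implies.

import random
import torch
import torch.nn as nn

def make_mlp(hidden_sizes, in_dim=20, out_dim=2):
    # Build a feed-forward network from a list of hidden-layer widths.
    layers, prev = [], in_dim
    for h in hidden_sizes:
        layers += [nn.Linear(prev, h), nn.ReLU()]
        prev = h
    layers.append(nn.Linear(prev, out_dim))
    return nn.Sequential(*layers)

def train(model, X, y, epochs=5, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(X), y)
        loss.backward()
        opt.step()

def svd_score(linear):
    # Fraction of singular values carrying significant energy; a low score
    # marks a nearly redundant layer. A simple stand-in for the paper's
    # SVD-based criteria, not the authors' exact measure.
    s = torch.linalg.svdvals(linear.weight)
    return float((s / s.max() > 0.05).float().mean())

def mutate(model, sizes):
    # "Surgery": remove the most redundant hidden layer, or insert a new one.
    sizes = list(sizes)
    hidden = [m for m in model if isinstance(m, nn.Linear)][:-1]  # skip output layer
    if len(sizes) > 1 and random.random() < 0.5:
        idx = min(range(len(hidden)), key=lambda i: svd_score(hidden[i]))
        sizes.pop(idx)
    else:
        sizes.insert(random.randrange(len(sizes) + 1), random.choice([8, 16, 32]))
    return sizes

# Synthetic data standing in for a benchmark dataset.
torch.manual_seed(0)
X, y = torch.randn(256, 20), torch.randint(0, 2, (256,))
Xv, yv = torch.randn(64, 20), torch.randint(0, 2, (64,))

# Genetic loop: train, operate on the topology, re-train, keep the fittest.
population = [[16], [32, 16]]
for gen in range(3):
    scored = []
    for sizes in population:
        parent = make_mlp(sizes)
        train(parent, X, y)                  # ordinary training phase
        child_sizes = mutate(parent, sizes)
        child = make_mlp(child_sizes)        # fresh weights: a simplification;
        train(child, X, y, epochs=3)         # the paper re-trains after surgery
        for s, m in ((sizes, parent), (child_sizes, child)):
            with torch.no_grad():
                scored.append((nn.functional.cross_entropy(m(Xv), yv).item(), s))
    scored.sort(key=lambda t: t[0])
    population = [s for _, s in scored[:2]]  # survivors of this generation
    print(f"generation {gen}: best val loss {scored[0][0]:.3f}")

The SVD-based removal criterion mirrors the paper's stated preference for mathematical criteria over trained meta-controllers: the layer whose weight matrix has the fewest significant singular values is treated as the most redundant and is removed first.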
Subjects
Genetic algorithm
Neural architecture search
Singular value decomposition
Topology optimization
MLE@TUHH
DDC Class
600: Technology
Publication version
publishedVersion
License
https://creativecommons.org/licenses/by/4.0/
Name
1-s2.0-S0893608021003476-main.pdf
Size
673.47 KB
Format
Adobe PDF