TUHH Open Research

From the Euclidean to the Natural

Publication Type
Book Part
Date Issued
2024-12-19
Language
English
Author(s)
Ay, Nihat  
Data Science Foundations E-21  
TORE-URI
https://hdl.handle.net/11420/55752
Journal
Foundational Papers in Complexity Science: Volume 4
Citation
in: Foundational Papers in Complexity Science, Volume 4 (2024)
Publisher DOI
10.37911/9781947864559.85
Publisher
SFI Press
ISBN
9781947864559
Abstract
The idea of learning as an optimization process can be traced back to the early years of artificial neural networks. This idea has been very fruitful, ultimately leading to the recent successes of deep neural networks as learning machines. Although consistent with optimization, however, the first learning algorithms for neurons were inspired by neurophysiological and neuropsychological paradigms, most notably by the celebrated work of Donald Hebb (1949). Building on such paradigms, Frank Rosenblatt (1957) proposed an algorithm for training a simple neuronal model that Warren McCulloch and Walter Pitts had introduced in their seminal 1943 article. The convergence of this algorithm can be proved formally with elementary arguments from linear algebra (the perceptron convergence theorem; see Novikoff 1962). The idea of learning as an optimization process, however, not only offers a unified conceptual foundation of learning; it also allows us to study learning from a rich mathematical perspective. In this context, the stochastic gradient descent method plays a fundamentally important role (Widrow 1963; Amari 1967; Rumelhart, Hinton, and Williams 1986). Nowadays it is the main instrument for training artificial neural networks, which brings us to Shun-Ichi Amari’s article “Natural Gradient Works Efficiently in Learning.” Let us unfold this title and thereby reveal the main insights of Amari’s work.
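The contrast in the title, from the Euclidean to the natural, can be made concrete. The ordinary gradient is the steepest-descent direction with respect to the Euclidean metric on parameter space; Amari’s natural gradient is the steepest-descent direction with respect to the Fisher information metric, obtained by preconditioning the ordinary gradient with the inverse Fisher matrix. The following minimal Python sketch contrasts the two updates for logistic regression, a model whose Fisher matrix has a closed form. It is an illustration under stated assumptions, not code from the article; the data and all names are hypothetical.

# Illustrative sketch (not from the paper): plain gradient descent vs.
# natural gradient descent for logistic regression.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy binary-classification data.
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -1.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ true_w)))).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w):
    # Gradient of the average negative log-likelihood.
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y)

def fisher(w):
    # Fisher information matrix of the logistic model:
    # G(w) = (1/n) * X^T diag(p * (1 - p)) X.
    p = sigmoid(X @ w)
    return (X * (p * (1.0 - p))[:, None]).T @ X / len(y)

w_gd = np.zeros(2)   # plain (Euclidean) gradient descent
w_ng = np.zeros(2)   # natural gradient descent
eta = 0.5
for _ in range(100):
    w_gd -= eta * grad(w_gd)
    # Natural gradient: precondition by the inverse Fisher matrix,
    # i.e., follow the steepest-descent direction in the Fisher metric.
    w_ng -= eta * np.linalg.solve(fisher(w_ng), grad(w_ng))

print("plain gradient:  ", w_gd)
print("natural gradient:", w_ng)

In this toy setting the Fisher-preconditioned update typically approaches the maximum-likelihood estimate in far fewer iterations than the plain gradient step, which is one way to read the “works efficiently” in Amari’s title.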