Abstract
In this chapter we associate a manifold with each neural network by regarding the weights and biases of the network as a coordinate system on the manifold.
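The idea can be illustrated numerically. The sketch below is not taken from the chapter; it assumes a hypothetical one-neuron network \(f_w(x) = \tanh(w_1 x + w_0)\) and a fixed grid of inputs, so that each parameter vector \((w_0, w_1)\) determines a point of the output manifold inside \(\mathbb{R}^m\), with the weights and biases serving as its coordinates.

```python
import numpy as np

# Hypothetical one-neuron network: f_w(x) = tanh(w1 * x + w0).
# Fixing inputs x_1, ..., x_m, the parameter vector (w0, w1) is mapped
# to the point (f_w(x_1), ..., f_w(x_m)) in R^m; the weights and biases
# thus act as coordinates on a 2-dimensional output manifold in R^m.

def network_output(w, xs):
    w0, w1 = w
    return np.tanh(w1 * xs + w0)

xs = np.linspace(-1.0, 1.0, 5)      # fixed inputs (m = 5)
w = np.array([0.1, 0.8])            # a coordinate point (bias, weight)
p = network_output(w, xs)           # the corresponding manifold point

# Tangent vectors at p: partial derivatives of the output with respect
# to each coordinate, approximated here by forward finite differences.
eps = 1e-6
tangents = np.stack([
    (network_output(w + eps * e, xs) - p) / eps
    for e in np.eye(2)
])
print(p.shape, tangents.shape)      # a point in R^5 and two tangent vectors
```

The two tangent vectors span the tangent plane of the output manifold at \(p\); for this network they agree with the analytic derivatives \(\partial_{w_0} f = 1 - f^2\) and \(\partial_{w_1} f = x\,(1 - f^2)\).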
Notes
- 1.
For instance, a system of parameters for a continuous function defined on [0, 1] is the set of rational numbers \(Q\cap [0,1]\), since a continuous function is determined by its values on this dense set.
- 2.
This is also called the arc length parameter, since it is proportional to the arc length measured along the curve c(s).
- 3.
The Riemannian curvature of a manifold is described by the tensor
$$R_{ijk}^r = \partial _{\theta _i} \Gamma _{jk}^r - \partial _{\theta _j} \Gamma _{ik}^r + \Gamma _{ih}^r \Gamma _{jk}^h - \Gamma _{jh}^r \Gamma _{ik}^h, $$with summation over the repeated indices.
- 4.
This can be easily understood, for instance, if you try to recite the alphabet in the reverse order. The brain builds coadaptations when learning the alphabet in chronological order from A to Z. The difficulty faced when trying to recite the alphabet in the reverse order shows the existence of certain coadaptations formed among neurons during the learning process.
- 5.
The codimension of a submanifold \({\mathcal S}\) of a manifold \({\mathcal M}\) is the difference of their dimensions, \(k = \dim {\mathcal M} - \dim {\mathcal S}\).
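As an illustration not taken from the chapter, the curvature formula in Note 3 can be checked on the unit 2-sphere with the round metric \(ds^2 = d\theta^2 + \sin^2\theta\, d\varphi^2\), whose only nonzero Christoffel symbols are \(\Gamma^{\theta}_{\varphi\varphi} = -\sin\theta\cos\theta\) and \(\Gamma^{\varphi}_{\theta\varphi} = \Gamma^{\varphi}_{\varphi\theta} = \cot\theta\):

```latex
% Unit 2-sphere, coordinates (\theta, \varphi), metric
% g_{\theta\theta} = 1, g_{\varphi\varphi} = \sin^2\theta.
\[
R^{\theta}_{\theta\varphi\varphi}
  = \partial_{\theta}\Gamma^{\theta}_{\varphi\varphi}
  - \partial_{\varphi}\Gamma^{\theta}_{\theta\varphi}
  + \Gamma^{\theta}_{\theta h}\Gamma^{h}_{\varphi\varphi}
  - \Gamma^{\theta}_{\varphi h}\Gamma^{h}_{\theta\varphi}
  = (\sin^2\theta - \cos^2\theta) - 0 + 0
    - (-\sin\theta\cos\theta)\cot\theta
  = \sin^2\theta .
\]
```

Dividing by \(g_{\varphi\varphi} = \sin^2\theta\) recovers the constant sectional curvature \(K = 1\) of the unit sphere.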
Copyright information
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Calin, O. (2020). Output Manifolds. In: Deep Learning Architectures. Springer Series in the Data Sciences. Springer, Cham. https://doi.org/10.1007/978-3-030-36721-3_13
DOI: https://doi.org/10.1007/978-3-030-36721-3_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-36720-6
Online ISBN: 978-3-030-36721-3
eBook Packages: Mathematics and Statistics (R0)