Deep Learning Architectures

A Mathematical Approach

  • Ovidiu Calin
Textbook

Part of the Springer Series in the Data Sciences book series (SSDS)

Table of contents

  1. Front Matter
    Pages i-xxx
  2. Introduction to Neural Networks

    1. Front Matter
      Pages 1-1
    2. Ovidiu Calin
      Pages 3-19
    3. Ovidiu Calin
      Pages 21-39
    4. Ovidiu Calin
      Pages 41-68
    5. Ovidiu Calin
      Pages 69-131
    6. Ovidiu Calin
      Pages 133-165
    7. Ovidiu Calin
      Pages 167-198
  3. Analytic Theory

    1. Front Matter
      Pages 199-199
    2. Ovidiu Calin
      Pages 201-225
    3. Ovidiu Calin
      Pages 227-250
    4. Ovidiu Calin
      Pages 251-284
    5. Ovidiu Calin
      Pages 285-313
  4. Information Processing

    1. Front Matter
      Pages 315-315
    2. Ovidiu Calin
      Pages 317-349
    3. Ovidiu Calin
      Pages 351-413
  5. Geometric Theory

    1. Front Matter
      Pages 415-415
    2. Ovidiu Calin
      Pages 417-464
    3. Ovidiu Calin
      Pages 465-504
  6. Other Architectures

    1. Front Matter
      Pages 505-505
    2. Ovidiu Calin
      Pages 507-516
    3. Ovidiu Calin
      Pages 517-542
    4. Ovidiu Calin
      Pages 543-559
    5. Ovidiu Calin
      Pages 561-590
    6. Ovidiu Calin
      Pages 591-609
    7. Ovidiu Calin
      Pages 611-635
  7. Back Matter
    Pages 637-760

About this book

Introduction

This book describes how neural networks operate from a mathematical point of view. In this framework, neural networks can be interpreted both as universal function approximators and as information processors. The book bridges the gap between the ideas and concepts of neural networks, which are nowadays used mostly at an intuitive level, and the precise language of modern mathematics, combining the practical insight of the former with the rigor and elegance of the latter.
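As a minimal illustration of the universal-approximator view (not taken from the book), a one-hidden-layer network with just two ReLU units already represents the absolute-value function exactly, since |x| = relu(x) + relu(-x):

```python
import numpy as np

def relu(x):
    """Rectified linear unit activation."""
    return np.maximum(x, 0.0)

def two_neuron_net(x):
    # A one-hidden-layer network: hidden weights (1, -1), output weights (1, 1).
    # Because |x| = relu(x) + relu(-x), this tiny network is exact, not approximate.
    return relu(x) + relu(-x)

xs = np.linspace(-2.0, 2.0, 9)
assert np.allclose(two_neuron_net(xs), np.abs(xs))
```

More general targets require more hidden units, but the universal approximation theorems studied in the analytic part of such a theory guarantee that enough units suffice for any continuous function on a compact set.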

This book can be used in a graduate course in deep learning, with the first few parts being accessible to senior undergraduates.  In addition, the book will be of wide interest to machine learning researchers who are interested in a theoretical understanding of the subject.

Keywords

neural networks · deep learning · machine learning · Kullback-Leibler divergence · entropy · Fisher information metric · Boltzmann machine

Authors and affiliations

  • Ovidiu Calin
  1. Department of Mathematics & Statistics, Eastern Michigan University, Ypsilanti, USA

Bibliographic information

  • DOI https://doi.org/10.1007/978-3-030-36721-3
  • Copyright Information Springer Nature Switzerland AG 2020
  • Publisher Name Springer, Cham
  • eBook Packages Mathematics and Statistics
  • Print ISBN 978-3-030-36720-6
  • Online ISBN 978-3-030-36721-3
  • Series Print ISSN 2365-5674
  • Series Online ISSN 2365-5682