Cost Functions and Style Transfer


Abstract

In this chapter we will look in more depth at the role of the cost function in neural network models. In particular, we will discuss the MSE (mean squared error) and the cross-entropy, examining their origins and interpretations: why we can use them to solve problems, how the MSE can be interpreted in a statistical sense, and how the cross-entropy is related to information theory. Then, to give you an example of a much more advanced use of special loss functions, we will learn how to do neural style transfer, training a neural network to paint in the style of famous painters.
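The two loss functions named in the abstract, and the Gram matrix that appears in the style loss of neural style transfer, can be sketched in a few lines of NumPy. This is a minimal illustration under my own naming conventions (the `eps` stabilizer and function names are assumptions, not the chapter's actual implementation):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: the average of the squared residuals."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy between a true distribution p and a predicted
    distribution q; eps avoids log(0) for zero predicted probabilities."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return -np.sum(p * np.log(q + eps))

def gram_matrix(features):
    """Gram matrix of a (channels, height*width) feature map: the
    channel-by-channel correlations used in the style loss."""
    f = np.asarray(features, float)
    return f @ f.T
```

For a perfect prediction the MSE is zero; the cross-entropy of a one-hot label against a uniform two-class prediction is log 2 ≈ 0.693, which is why it penalizes uncertain predictions of the true class.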


Notes

  1. https://en.wikipedia.org/wiki/Loss_function

  2. This example is discussed in detail in Michelucci, Umberto, 2018. Applied Deep Learning: A Case-Based Approach to Understanding Deep Neural Networks. 1st edition. New York: Apress. ISBN 978-1-4842-3789-2. Available from: https://doi.org/10.1007/978-1-4842-3790-8

  3. Remember that the order of the dimensions depends on how you structure your network, and you may need to change it. The dimensions here are for illustrative purposes only.

  4. https://en.wikipedia.org/wiki/Taylor_series

  5. https://en.m.wikipedia.org/wiki/Skewness. In the case of E[ΔY] = 0.

  6. https://en.m.wikipedia.org/wiki/Kurtosis. In the case of E[ΔY] = 0.

  7. In probability and statistics, a probability mass function (PMF) is a function that gives the probability that a discrete random variable is exactly equal to some value [Stewart, William J. (2011). Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling. Princeton University Press. p. 105. ISBN 978-1-4008-3281-1.]

  8. https://en.wikipedia.org/wiki/Claude_Shannon

  9. https://en.wikipedia.org/wiki/Neural_Style_Transfer

  10. Gatys, Leon A.; Ecker, Alexander S.; Bethge, Matthias (26 August 2015). “A Neural Algorithm of Artistic Style”. https://arxiv.org/abs/1508.06576

  11. “Very Deep Convolutional Networks for Large-Scale Visual Recognition”. Robots.ox.ac.uk. 2014. Retrieved 13 February 2019. http://www.robots.ox.ac.uk/~vgg/research/very_deep/

  12. This part of the chapter was inspired by the Medium post https://becominghuman.ai/creating-intricate-art-with-neural-style-transfer-e5fee5f89481

  13. https://en.wikipedia.org/wiki/Darth_Vader

  14. Note that all the images used in this chapter are free of copyright and free to use. If you use images in your papers or blog, make sure you can use them freely, or you will need to pay royalties.


Copyright information

© 2019 Umberto Michelucci

About this chapter

Cite this chapter

Michelucci, U. (2019). Cost Functions and Style Transfer. In: Advanced Applied Deep Learning. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-4976-5_5
