Adaptive Noise Schedule for Denoising Autoencoder

  • Conference paper

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 8834)

Abstract

The paper proposes an Adaptive Stacked Denoising Autoencoder (ASDA) to overcome a limitation of the Stacked Denoising Autoencoder (SDA) [6], in which the noise level is kept fixed throughout the training phase of the autoencoder. In ASDA, an annealing schedule is applied to the noise: the average noise level of the input neurons is kept high during the initial training phase and is slowly reduced as training proceeds. The noise level of each input neuron is computed from the weights connecting that neuron to the hidden layer, while the average noise level of the input layer is kept equal to the value given by the annealing schedule. This enables the denoising autoencoder to learn the input manifold in greater detail. As the results show, ASDA gives better classification accuracy than SDA on variants of the MNIST dataset [3].
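
To make the mechanism concrete, the sketch below (plain NumPy) illustrates the two ideas from the abstract: a schedule that anneals the average corruption level from high to low over training, and a redistribution of that average across input neurons according to the weights leaving each neuron. The linear schedule, the L1 weight-norm heuristic, and all function names are illustrative assumptions; the paper's exact formulas are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    def annealed_mean_noise(epoch, n_epochs, p_start=0.7, p_end=0.1):
        # Assumed linear annealing: high average noise early, low noise late.
        t = epoch / max(n_epochs - 1, 1)
        return p_start + t * (p_end - p_start)

    def per_neuron_noise(W, p_mean):
        # Hypothetical weighting: give each input neuron a noise level
        # proportional to the L1 norm of its outgoing weights, then rescale
        # so the layer-wide average equals the annealed value p_mean.
        s = np.abs(W).sum(axis=1)      # one score per input neuron
        p = p_mean * s / s.mean()      # mean over neurons equals p_mean
        return np.clip(p, 0.0, 1.0)   # clipping may perturb the mean slightly

    def corrupt(x, p):
        # Masking noise: zero each input with its own probability p[i].
        return x * (rng.random(x.shape) > p)

    # Toy usage on MNIST-sized inputs: 784 inputs, 256 hidden units.
    W = rng.normal(scale=0.05, size=(784, 256))
    x = rng.random(784)
    for epoch in range(10):
        p_mean = annealed_mean_noise(epoch, n_epochs=10)
        p = per_neuron_noise(W, p_mean)
        x_tilde = corrupt(x, p)
        # ... train the denoising autoencoder on (x_tilde, x) here
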

Keywords

  • Deep Learning
  • Denoising
  • Encoder
  • Decoder


References

  1. Bengio, Y., et al.: Greedy layer-wise training of deep networks. In: Advances in Neural Information Processing Systems, vol. 19, p. 153 (2007)

  2. Hinton, G.E., Osindero, S., Teh, Y.-W.: A fast learning algorithm for deep belief nets. Neural Computation 18(7), 1527–1554 (2006)

  3. Machine Learning Laboratory, University of Montreal, http://www.iro.umontreal.ca/~lisa/twiki/bin/view.cgi/Public/MnistVariations

  4. Rifai, S., et al.: Contractive auto-encoders: Explicit invariance during feature extraction. In: Proceedings of the 28th International Conference on Machine Learning, ICML 2011 (2011)

  5. Rifai, S., Mesnil, G., Vincent, P., Muller, X., Bengio, Y., Dauphin, Y., Glorot, X.: Higher order contractive auto-encoder. In: Gunopulos, D., Hofmann, T., Malerba, D., Vazirgiannis, M. (eds.) ECML PKDD 2011, Part II. LNCS (LNAI), vol. 6912, pp. 645–660. Springer, Heidelberg (2011)

  6. Vincent, P., et al.: Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion. Journal of Machine Learning Research 11, 3371–3408 (2010)

  7. Zeiler, M.D.: ADADELTA: An adaptive learning rate method. arXiv preprint arXiv:1212.5701 (2012)

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Chandra, B., Sharma, R.K. (2014). Adaptive Noise Schedule for Denoising Autoencoder. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8834. Springer, Cham. https://doi.org/10.1007/978-3-319-12637-1_67

  • DOI: https://doi.org/10.1007/978-3-319-12637-1_67

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12636-4

  • Online ISBN: 978-3-319-12637-1

  • eBook Packages: Computer Science, Computer Science (R0)