Neural Networks with Multidimensional Cross-Entropy Loss Functions

  • Alexander Semenov
  • Vladimir Boginski
  • Eduardo L. Pasiliao
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11917)

Abstract

Deep neural networks have emerged as an effective machine learning tool that has been successfully applied to many tasks, such as misinformation detection, natural language processing, image recognition, and machine translation. Neural networks are often applied to binary or multi-class classification problems. In these settings, cross-entropy is typically used as the loss function for neural network training. In this short note, we propose an extension of the concept of cross-entropy, referred to as multidimensional cross-entropy, and its application as a loss function for classification using neural networks. The presented computational experiments on a benchmark dataset suggest that the proposed approaches may have the potential to increase the classification accuracy of neural-network-based algorithms.
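
For reference, the standard multi-class cross-entropy loss mentioned in the abstract is H(y, ŷ) = -Σ_c y_c log ŷ_c, where y is the one-hot encoding of the true class and ŷ is the predicted class distribution (e.g., a softmax output). The sketch below is a minimal NumPy illustration of this standard loss only; the multidimensional extension proposed in the paper is not defined in the abstract, so it is not reproduced here, and the function names are purely illustrative.

    import numpy as np

    def softmax(logits):
        # Numerically stable softmax: subtract the max logit before exponentiating
        z = np.exp(logits - np.max(logits))
        return z / z.sum()

    def cross_entropy(probs, true_class):
        # Standard multi-class cross-entropy for a single example:
        # the negative log of the probability assigned to the true class
        return -np.log(probs[true_class])

    # Toy 3-class example; the true class is index 2
    probs = softmax(np.array([1.0, 2.0, 3.5]))
    print(cross_entropy(probs, true_class=2))  # loss is approximately 0.27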

Keywords

Neural networks · Cross-entropy · Loss functions

Acknowledgements

The work of A. Semenov was funded in part by the AFRL European Office of Aerospace Research and Development (grant no. FA9550-17-1-0030). This material is based upon work supported by the AFRL Mathematical Modeling and Optimization Institute.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Alexander Semenov (1)
  • Vladimir Boginski (2)
  • Eduardo L. Pasiliao (3)
  1. University of Jyväskylä, Jyväskylä, Finland
  2. University of Central Florida, Orlando, USA
  3. Air Force Research Laboratory, Eglin AFB, USA