Neural Networks with Multidimensional Cross-Entropy Loss Functions
Deep neural networks have emerged as an effective machine learning tool, successfully applied to many tasks such as misinformation detection, natural language processing, image recognition, and machine translation. Neural networks are often applied to binary or multi-class classification problems, where cross-entropy is used as the loss function for training. In this short note, we propose an extension of the concept of cross-entropy, referred to as multidimensional cross-entropy, and its application as a loss function for classification with neural networks. Computational experiments on a benchmark dataset suggest that the proposed approach has the potential to increase the classification accuracy of neural network-based algorithms.
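As background for the proposed extension, the following sketch shows the standard multi-class cross-entropy loss that the note builds on; the multidimensional variant itself is not defined in this abstract, so only the conventional formulation is illustrated here (function and variable names are our own, not the authors').

```python
import numpy as np

def cross_entropy(probs, labels):
    """Standard multi-class cross-entropy loss.

    probs:  (n_samples, n_classes) predicted class probabilities (rows sum to 1)
    labels: (n_samples,) integer indices of the true classes
    """
    n = probs.shape[0]
    eps = 1e-12  # numerical guard against log(0)
    # Average negative log-likelihood of the true class over the batch.
    return -np.mean(np.log(probs[np.arange(n), labels] + eps))

# Toy example: two samples, three classes
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
loss = cross_entropy(probs, labels)  # -(ln 0.7 + ln 0.8) / 2
```

In practice this computation is delegated to framework primitives (e.g., a softmax layer followed by a cross-entropy criterion), which combine the two operations for numerical stability.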
Keywords: Neural networks · Cross-entropy · Loss functions
The work of A. Semenov was funded in part by the AFRL European Office of Aerospace Research and Development (grant no. FA9550-17-1-0030). This material is based on work supported by the AFRL Mathematical Modeling and Optimization Institute.