
TBNN: totally-binary neural network for image classification

Published in: Optoelectronics Letters

Abstract

Most binary neural networks retain full-precision convolution in the first layer, because binarizing that layer causes a significant loss of accuracy. In this paper, we propose a new approach to this problem: widening the data channels so that less information is lost when the first convolutional layer's input passes through the sign function. Since widening the channels increases the computational cost of the first convolution layer, we offset this cost with group convolution. Experimental results show that applying our method to state-of-the-art (SOTA) binarization methods significantly improves their accuracy, demonstrating that the method is effective and feasible.
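The two ideas in the abstract can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the paper's actual implementation: `widen_and_binarize` replicates each input channel k times with shifted thresholds (a thermometer-style encoding is assumed here) so that the sign function discards less magnitude information, and `group_conv` is a naive grouped 1×1 convolution showing how grouping keeps the first layer's cost roughly constant despite the k-fold wider input.

```python
import numpy as np

def widen_and_binarize(x, k=4):
    """Widen a (C, H, W) input in [0, 1] to (k*C, H, W) binary channels.

    Each channel is replicated k times with evenly spaced thresholds
    (a hypothetical thermometer-style widening; the paper's exact
    scheme may differ), then binarized to {-1, +1} by the sign rule.
    """
    thresholds = (np.arange(k) + 0.5) / k  # k evenly spaced thresholds
    widened = np.concatenate(
        [np.where(x > t, 1.0, -1.0) for t in thresholds], axis=0
    )
    return widened

def group_conv(x, w, groups):
    """Naive grouped 1x1 convolution.

    x: (Cin, H, W) input; w: (Cout, Cin // groups) filter bank.
    Each group of output channels only sees its own slice of input
    channels, so the multiply count per pixel is Cin*Cout/groups
    instead of Cin*Cout for a dense 1x1 convolution.
    """
    Cin, H, W = x.shape
    Cout = w.shape[0]
    cin_g, cout_g = Cin // groups, Cout // groups
    out = np.zeros((Cout, H, W))
    for g in range(groups):
        xs = x[g * cin_g:(g + 1) * cin_g]      # this group's input slice
        ws = w[g * cout_g:(g + 1) * cout_g]    # this group's filters
        out[g * cout_g:(g + 1) * cout_g] = np.tensordot(ws, xs, axes=([1], [0]))
    return out
```

With k thresholds a pixel maps to one of k+1 distinct binary codes rather than the 2 values a plain sign function produces, so more of the input's dynamic range survives binarization; setting `groups = k` in the first convolution then cancels the k-fold increase in input channels, keeping its multiply count at roughly the original level.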



Author information


Corresponding author

Correspondence to Guowei Yang.

Ethics declarations

The authors declare that there are no conflicts of interest related to this article.

Additional information

This work has been supported by the National Natural Science Foundation of China (No.62172229).


About this article


Cite this article

Zhang, Q., Sun, L., Yang, G. et al. TBNN: totally-binary neural network for image classification. Optoelectron. Lett. 19, 117–122 (2023). https://doi.org/10.1007/s11801-023-2113-2

