
Edge-priority-extraction network using re-parameterization for real-time super-resolution

  • Original article, The Visual Computer

Abstract

Recently, super-resolution (SR) has achieved superior performance with the development of deep learning. However, previous methods usually require considerable computational resources and large model sizes, which hinders practical applications. To achieve real-time inference with high quality, this paper presents an edge-priority-extraction network constructed from our proposed edge-priority blocks (EPBs). The EPB uses multiple branches with edge information to improve the representational capacity of the network, and it can be re-parameterized for efficient inference. For more effective use of edge information, this paper also proposes a mix-priority filter that extracts edges with horizontal and vertical priorities to further improve performance. These filters adaptively extract edge information through multi-directional derivatives. Experimental results show that our models meet real-time requirements at a lower computational cost and deliver better SR performance than recent real-time SR models.
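
The abstract describes two mechanisms that lend themselves to a short illustration: a multi-branch block that carries fixed edge-derivative filters during training, and a re-parameterization step that folds the branches into a single plain convolution for inference. The PyTorch sketch below is only a minimal illustration of that general idea, not the authors' EPB: the branch layout, the Sobel-style horizontal/vertical kernels, and all names (EPBSketch, reparameterize, edge_mix) are assumptions made for this example.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EPBSketch(nn.Module):
        """Training-time multi-branch block: learnable 3x3 conv + fixed edge-derivative branch."""

        def __init__(self, channels: int):
            super().__init__()
            self.channels = channels
            # Learnable 3x3 branch.
            self.conv3x3 = nn.Conv2d(channels, channels, 3, padding=1)
            # Edge branch: fixed horizontal/vertical (Sobel-style) derivative kernels applied
            # per channel, followed by a learnable 1x1 mixing convolution.
            sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
            sobel_y = sobel_x.t()
            edge = torch.stack([sobel_x, sobel_y]).unsqueeze(1)                   # (2, 1, 3, 3)
            self.register_buffer("edge_kernel", edge.repeat(channels, 1, 1, 1))   # (2C, 1, 3, 3)
            self.edge_mix = nn.Conv2d(2 * channels, channels, 1, bias=False)

        def forward(self, x):
            # Multi-branch forward used during training.
            edge_feat = F.conv2d(x, self.edge_kernel, padding=1, groups=self.channels)
            return self.conv3x3(x) + self.edge_mix(edge_feat)

        @torch.no_grad()
        def reparameterize(self) -> nn.Conv2d:
            """Fold both branches into a single 3x3 convolution for inference."""
            c = self.channels
            # The edge branch is a fixed depthwise 3x3 conv followed by a 1x1 conv, so its
            # equivalent 3x3 kernel for output o, input i is sum_d mix[o, 2*i + d] * sobel_d.
            mix = self.edge_mix.weight.view(c, c, 2)        # (C_out, C_in, 2 directions)
            sobel = self.edge_kernel.view(c, 2, 3, 3)[0]    # (2, 3, 3), identical per channel
            edge_w = torch.einsum("oid,dhw->oihw", mix, sobel)

            fused = nn.Conv2d(c, c, 3, padding=1)
            fused.weight.copy_(self.conv3x3.weight + edge_w)
            fused.bias.copy_(self.conv3x3.bias)
            return fused

    # Sanity check: the fused convolution reproduces the multi-branch output.
    block = EPBSketch(16).eval()
    x = torch.randn(1, 16, 32, 32)
    assert torch.allclose(block(x), block.reparameterize()(x), atol=1e-5)

After training, calling reparameterize() yields one 3x3 convolution whose output matches the multi-branch forward pass up to floating-point error, which is what allows real-time inference without giving up the richer training-time structure.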


Data availability

The datasets used in this paper can be downloaded from https://github.com/sanghyun-son/EDSR-PyTorch.


Acknowledgements

This research was supported by the National Natural Science Foundation of China under Grant No. 62072405 and the Zhejiang Provincial Natural Science Foundation of China under Grant No. LGF20F020017.

Author information


Corresponding author

Correspondence to Tian-yang Dong.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ying, Wy., Dong, Ty. & Fan, J. Edge-priority-extraction network using re-parameterization for real-time super-resolution. Vis Comput (2023). https://doi.org/10.1007/s00371-023-03197-y
