Conclusion
In this study, we propose a series of lightweight detectors named TinyDet. TinyDet achieves a good performance-computation trade-off (30.3 mAP with only 991 MFLOPs) and is applicable to resource-constrained mobile and edge devices. Moreover, TinyDet outperforms other lightweight detectors on small object detection.
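For readers who want to sanity-check compute budgets of this kind, the sketch below shows one way to profile a model's per-image cost in MFLOPs. It is a minimal, hypothetical example, not the authors' measurement pipeline: it uses fvcore's FlopCountAnalysis and a torchvision MobileNetV3 backbone as a stand-in, since TinyDet itself is not instantiated here.

```python
import torch
from torchvision.models import mobilenet_v3_small
from fvcore.nn import FlopCountAnalysis

# Hypothetical stand-in backbone; TinyDet itself is not included here.
model = mobilenet_v3_small().eval()

# Low-resolution input, as commonly used by lightweight detectors.
dummy_input = torch.randn(1, 3, 320, 320)

with torch.no_grad():
    flops = FlopCountAnalysis(model, dummy_input)
    # fvcore counts multiply-accumulate operations; divide by 1e6 for mega-ops.
    print(f"{flops.total() / 1e6:.1f} M ops per image")
```

The reported number depends on the input resolution and on whether the profiler counts MACs or FLOPs, so figures from different papers and tools are not always directly comparable.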
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China (Grant Nos. 61876212, 61733007), Zhejiang Laboratory (Grant No. 2019NB0AB02), and the HUST-Horizon Computer Vision Research Center.
Supporting information
Appendixes A and B. The supporting information is available online at info.scichina.com and link.springer.com. The supporting materials are published as submitted, without typesetting or editing. The responsibility for scientific accuracy and content remains entirely with the authors.
Cite this article
Chen, S., Cheng, T., Fang, J. et al. TinyDet: accurately detecting small objects within 1 GFLOPs. Sci. China Inf. Sci. 66, 119102 (2023). https://doi.org/10.1007/s11432-021-3504-4