Image Key Point Matching by Phase Congruency

Published in Computational Mathematics and Modeling.

A phase congruency measure computed in the neighborhood of image key points is proposed for key point matching. An algorithm for constructing and matching key point descriptors is presented. The proposed method matches key points of images that differ in scale, rotation angle, and illumination conditions. A modification of the method can also be used to compare key points of iris images.
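
The abstract describes the pipeline only at a high level, so the following is a minimal sketch of one plausible realization, not the authors' implementation: a simplified multi-scale log-Gabor phase congruency map (without Kovesi's noise compensation or frequency-spread weighting), patch descriptors taken from that map around each key point, and nearest-neighbour matching with a ratio test. All function names and parameters (`phase_congruency`, `patch_size`, `ratio`, ...) are illustrative assumptions.

```python
import numpy as np


def phase_congruency(img, n_orient=4, n_scale=3, min_wavelength=6, mult=2.0,
                     sigma_f=0.55, eps=1e-4):
    """Simplified phase congruency map built from log-Gabor filter responses."""
    rows, cols = img.shape
    F = np.fft.fft2(img.astype(np.float64))
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    radius[0, 0] = 1.0                      # avoid log(0) at the DC term
    theta = np.arctan2(-fy, fx)

    pc_num = np.zeros((rows, cols))
    pc_den = np.full((rows, cols), eps)
    for o in range(n_orient):
        angle = o * np.pi / n_orient
        # angular spread selecting one orientation (and one half-plane)
        d_theta = np.abs(np.arctan2(np.sin(theta - angle), np.cos(theta - angle)))
        spread = np.exp(-d_theta ** 2 / (2 * (np.pi / n_orient) ** 2))
        sum_even = np.zeros((rows, cols))
        sum_odd = np.zeros((rows, cols))
        sum_amp = np.zeros((rows, cols))
        for s in range(n_scale):
            wavelength = min_wavelength * mult ** s
            log_gabor = np.exp(-np.log(radius * wavelength) ** 2
                               / (2 * np.log(sigma_f) ** 2))
            log_gabor[0, 0] = 0.0
            response = np.fft.ifft2(F * log_gabor * spread)
            sum_even += response.real       # even-symmetric component
            sum_odd += response.imag        # odd-symmetric component
            sum_amp += np.abs(response)     # local amplitude
        # local energy over the sum of amplitudes: high where phases agree
        pc_num += np.sqrt(sum_even ** 2 + sum_odd ** 2)
        pc_den += sum_amp
    return pc_num / pc_den                  # values in [0, 1]


def keypoint_descriptors(pc_map, keypoints, patch_size=16):
    """Descriptor = L2-normalized phase congruency patch around a key point."""
    half = patch_size // 2
    descriptors, kept = [], []
    for (y, x) in keypoints:
        patch = pc_map[y - half:y + half, x - half:x + half]
        if patch.shape != (patch_size, patch_size):
            continue                        # skip points too close to the border
        v = patch.ravel()
        descriptors.append(v / (np.linalg.norm(v) + 1e-12))
        kept.append((y, x))
    return np.array(descriptors), kept


def match_descriptors(d1, d2, ratio=0.8):
    """Nearest-neighbour matching with a Lowe-style ratio test."""
    matches = []
    for i, v in enumerate(d1):
        dists = np.linalg.norm(d2 - v, axis=1)
        j1, j2 = np.argsort(dists)[:2]
        if dists[j1] < ratio * dists[j2]:
            matches.append((i, int(j1)))
    return matches
```

Key points can come from any detector (for example, local maxima of the phase congruency map itself), and the resulting matches would normally be filtered with a geometric consistency check such as RANSAC.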

Author information

Corresponding author

Correspondence to M. A. Protsenko.

Additional information

Translated from Prikladnaya Matematika i Informatika, No. 67, 2021, pp. 40–49.

About this article

Cite this article

Protsenko, M.A., Pavelyeva, E.A. Image Key Point Matching by Phase Congruency. Comput Math Model 32, 297–304 (2021). https://doi.org/10.1007/s10598-021-09532-z
