Abstract
Feature-based methods for image registration frequently encounter the correspondence problem. In this paper, we formulate feature-based image registration as a manifold alignment problem and present a novel matching method for finding correspondences among different images containing the same object. Unlike semi-supervised manifold alignment, our method maps the data sets to the underlying common manifold without using correspondence information. An iterative multiplicative updating algorithm is proposed to optimize the objective, and its convergence is guaranteed theoretically. The proposed approach has been tested for matching accuracy and robustness to outliers, and its performance on synthetic and real images is compared with state-of-the-art reference algorithms.
References
Deriche, R., Zhang, Z., Luong, Q.T., Faugeras, O.: Robust recovery of the epipolar geometry for an uncalibrated stereo rig. In: Proceedings of the Third European Conference on Computer Vision, pp. 567–576. Springer-Verlag, Stockholm (1994)
Besl, P.J., McKay, N.D.: A method for registration of 3D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 14(2), 239–254 (1992)
Chui, H., Rangarajan, A.: A new point matching algorithm for non-rigid registration. Comput. Vis. Imag. Underst. 89(2), 114–141 (2003)
Fatih Demirci, M.: Graph-based shape indexing. Mach. Vis. Appl. 23(3), 541–555 (2012)
Jiang, H., Ngo, C.: Graph based image matching. In: Proceedings of the 17th International Conference on Pattern Recognition, pp. 658–661 (2004)
Scott, G., Longuet-Higgins, H.: An algorithm for associating the features of two patterns. Proc. R. Soc. Lond. B 244, 21–26 (1991)
Shapiro, L.S., Brady, J.M.: Feature-based correspondence: an eigenvector approach. Imag. Vis. Comput. 10(5), 283–288 (1992)
Ge, S.S., Guan, F., Pan, Y., Lo, A.P.: Neighborhood linear embedding for intrinsic structure discovery. Mach. Vis. Appl. 21(3), 391–401 (2010)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15, 1373–1396 (2003)
Tenenbaum, J., De Silva, V., Langford, J.: A global geometric framework for nonlinear dimensionality reduction. Science 290, 2319–2323 (2000)
He, X., Niyogi, P.: Locality preserving projections. In: Advances of Neural Information Processing Systems (NIPS), vol. 16, pp. 1–8 (2003)
He, X., Cai, D., Yan, S., Zhang, H.-J.: Neighborhood preserving embedding. In: Proceedings of the 10th IEEE International Conference on Computer Vision, vol. 2, pp. 1208–1213 (2005)
Ham, J., Lee, D., Saul, L.: Learning high dimensional correspondence from low dimensional manifolds. In: Workshop on the Continuum from Labeled to Unlabeled Data in Machine Learning and Data Mining, pp. 34–41 (2003)
Ham, J., Lee, D., Saul, L.: Semisupervised alignment of manifolds. In: Proceedings of the 10th International Workshop on Artificial Intelligence and Statistics, pp. 1–8 (2005)
Verbeek, J., Roweis, S., Vlassis, N.: Non-linear CCA and PCA by alignment of local models. In: Advances in Neural Information Processing Systems, vol. 16, pp. 1–8 (2004)
Verbeek, J., Vlassis, N.: Gaussian fields for semi-supervised regression and correspondence learning. Pattern Recogn. 39(10), 1864–1875 (2006)
Lafon, S., Keller, Y., Coifman, R.: Data fusion and multicue data matching by diffusion maps. IEEE Trans. Pattern Anal. Mach. Intell. 28(11), 1784–1797 (2006)
Zhai, D., Li, B., Chang, H., Shan, S., Chen, X., Gao, W.: Manifold alignment via corresponding projections. In: Proceedings of the British Machine Vision Conference, vol. 3, pp. 1–11 (2010)
Wang, C., Mahadevan, S.: Manifold alignment without correspondence. In: Proceedings of the 21st International Joint Conference on Artificial Intelligence, pp. 1273–1278 (2009)
Xiong, L., Wang, F., Zhang, C.: Semi-definite manifold alignment. In: Proceedings of the 18th European Conference on Machine Learning, pp. 773–781 (2007)
Zhang, L., Qiao, L., Chen, S.: Graph-optimized locality preserving projections. Pattern Recogn. 43(6), 1993–2002 (2010)
Zhang, H., Deng, W., Guo, J., Yang, J.: Locality preserving and global discriminant projection with prior information. Mach. Vis. Appl. 21(4), 577–585 (2010)
Papadimitriou, C., Steiglitz, K.: Combinatorial optimization: algorithms and complexity. Dover Publications, Mineola (1998)
Wang, H.F., Hancock, E.R.: A kernel view of spectral point pattern matching. Struct. Syntactic Stat. Pattern Recogn. Proc. 3138, 361–369 (2004)
Harris, C., Stephens, M.J.: A combined corner and edge detector. In: Proceedings of 4th Alvey Vision Conference, pp. 147–151 (1988)
Szeliski, R.: Computer vision: algorithms and applications. Springer, New York (2010)
Lowe, D.G.: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60(2), 91–110 (2004)
Lee, D., Seung, H.S.: Algorithms for non-negative matrix factorization. In: Advances in Neural Information Processing Systems, vol. 13. MIT Press (2001)
Cai, D., He, X., Han, J., Huang, T.S.: Graph regularized nonnegative matrix factorization for data representation. IEEE Trans. Pattern Anal. Mach. Intell. 33(8), 1548–1560 (2011)
Acknowledgments
This work was supported by the National Natural Science Foundation of China (No. 61201323, 60972150), Northwestern Polytechnical University Foundation of Fundamental Research (No. JC20110277), and Northwestern Polytechnical University Doctoral Dissertation Innovation Foundation (No. CX200819).
Appendix (Proof of Theorem 1)
To prove Theorem 1, we must show that \(J\) is non-increasing under the update steps in Eqs. (19) and (20). We first prove that \(J\) is non-increasing under the update step in Eq. (19), following a procedure similar to that described in [29]. The convergence proof relies on an auxiliary function, which we now define.
Definition
\(G(u,u^{{\prime }})\) is an auxiliary function for \(F(u)\) if the conditions
\(G(u,u^{{\prime }})\ge F(u)\) and \(G(u,u)=F(u)\)
are satisfied.
Lemma 1
If \(G\) is an auxiliary function of \(F\), then \(F\) is non-increasing under the update \(u^{(t+1)}=\mathop {\arg \min }\limits _u G(u,u^{(t)})\).
Proof
\(F(u^{(t+1)})\le G(u^{(t+1)},u^{(t)})\le G(u^{(t)},u^{(t)})=F(u^{(t)})\). The first inequality holds because \(G(u,u^{{\prime }})\ge F(u)\), the second because \(u^{(t+1)}\) minimizes \(G(\cdot ,u^{(t)})\), and the final equality because \(G(u,u)=F(u)\). \(\square \)
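The mechanism behind Lemma 1 can be made concrete with a toy scalar example (not from the paper): take the hypothetical objective \(F(u)=u^2-2u\), minimized at \(u=1\), and the auxiliary function \(G(u,u^{\prime })=F(u)+(u-u^{\prime })^2\), which satisfies both conditions of the definition. Minimizing \(G\) over \(u\) at each step yields a closed-form update, and \(F\) is non-increasing along the iterates:

```python
def F(u):
    # Toy objective: F(u) = u^2 - 2u, minimized at u = 1.
    return u * u - 2 * u

def G(u, u_prev):
    # Auxiliary function: G(u, u') = F(u) + (u - u')^2.
    # It satisfies G(u, u') >= F(u) for all u, and G(u, u) = F(u).
    return F(u) + (u - u_prev) ** 2

def update(u_prev):
    # argmin_u G(u, u_prev): solve dG/du = 2u - 2 + 2(u - u_prev) = 0.
    return (1.0 + u_prev) / 2.0

u = 5.0
values = [F(u)]
for _ in range(20):
    u = update(u)
    values.append(F(u))

# Lemma 1 guarantees F(u^{t+1}) <= F(u^t) at every step,
# and the iterates approach the minimizer u = 1.
assert all(b <= a for a, b in zip(values, values[1:]))
assert abs(u - 1.0) < 1e-4
```

Note that the iteration never needs \(F\) to be minimized directly; each step only minimizes the (easier) surrogate \(G(\cdot ,u^{(t)})\), which is exactly the role played by Eqs. (19) and (20) in the main text.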
Considering any element \(a_{kl} \) of \(A\), we use \(F\) to denote the part of \(J\) that is relevant only to \(a_{kl} \). It is easy to check that
Lemma 2
Function
is an auxiliary function for \(F(a)\).
Proof
Since \(G(a,a)=F(a)\) is obvious, we need only show that \(G(a,a_{kl}^{(t)} )\ge F(a)\). To do this, we compare the Taylor series expansion of \(F(a)\)
with Eq. (34), which shows that \(G(a,a_{kl}^{(t)} )\ge F(a)\) is equivalent to [30]
Since
and
Thus, \(G(a,a_{kl}^{(t)} )\ge F(a)\). \(\square \)
We can now prove the convergence stated in Theorem 1.
Proof of Theorem 1
Let \(a^{(t+1)}=\mathop {\arg \min }\limits _a G(a,a^{(t)})\). The minimum of \(G(a,a^{(t)})\) with respect to \(a\) is obtained by setting the gradient to zero, \(\frac{\partial G(a,a^{(t)})}{\partial a}=0\):
So, we get
Since \(G(a,a^{(t)})\) is an auxiliary function, \(F\) is non-increasing under this update rule by Lemma 1. \(J\) can similarly be shown to be non-increasing under the update rule for \(B\) in Eq. (20). \(\square \)
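The paper's specific updates (19) and (20) are not reproduced on this page, so as a stand-in illustration of the same auxiliary-function guarantee, the sketch below runs the classic multiplicative updates of Lee and Seung [29] for NMF, whose monotonicity is proved with exactly this technique. The factor sizes and data are arbitrary, and a small constant guards against division by zero:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 20, 15, 4
V = rng.random((m, n))      # non-negative data matrix
W = rng.random((m, k))      # non-negative factors, random init
H = rng.random((k, n))
eps = 1e-12                 # guard against division by zero

losses = []
for _ in range(50):
    # Lee-Seung multiplicative updates for min ||V - WH||_F^2:
    # each update minimizes an auxiliary function, so the
    # objective is non-increasing at every step.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    losses.append(np.linalg.norm(V - W @ H, "fro") ** 2)

# Monotone non-increase, as the auxiliary-function argument guarantees.
assert all(b <= a + 1e-9 for a, b in zip(losses, losses[1:]))
```

The same pattern applies to the updates for \(A\) and \(B\) in Theorem 1: since each multiplicative step is the minimizer of an auxiliary function for the corresponding part of \(J\), the objective value can never increase, which is what makes the iteration safe to run to convergence.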
Yan, W., Tian, Z., Duan, X. et al. Feature matching based on unsupervised manifold alignment. Machine Vision and Applications 24, 983–994 (2013). https://doi.org/10.1007/s00138-012-0479-4