Learning the Geometric Structure of Manifolds with Singularities Using the Tensor Voting Graph

Abstract

We present a general framework that addresses manifolds with singularities and multiple intersecting manifolds, and that is also robust against a large number of outliers. We suggest a hybrid local–global method that leverages the algorithmic capabilities of the tensor voting framework and, unlike tensor voting, is capable of reliably inferring the global structure of complex manifolds by using a unique graph construction, called the tensor voting graph (TVG). Moreover, we propose to explicitly and directly resolve the ambiguities near the intersections with a novel algorithm, which uses the TVG and the positions of the points near the manifold intersections. Experimental results on estimating geodesic distances and on clustering demonstrate that our framework outperforms the state of the art, especially in geometrically complex settings, such as when the tangent spaces at the intersection points are not orthogonal, and in the presence of a large number of outliers.


Notes

  1. http://www.vision.jhu.edu/data/hopkins155/.

References

  1. Tenenbaum, J., de Silva, V., Langford, J.: A global geometric framework for nonlinear dimensionality reduction. Science 290(5500), 2319–2323 (2000)

  2. Roweis, S., Saul, L.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)

  3. Belkin, M., Que, Q., Wang, Y., Zhou, X.: Graph Laplacians on singular manifolds: toward understanding complex spaces: graph Laplacians on manifolds with singularities and boundaries. CoRR, abs/1211.6727 (2012)

  4. Mordohai, P., Medioni, G.: Tensor Voting: A Perceptual Organization Approach to Computer Vision and Machine Learning. Morgan & Claypool Publishers, San Rafael (2006)

  5. Wang, Y., Jiang, Y., Wu, Y., Zhou, Z.: Spectral clustering on multiple manifolds. IEEE Trans. Neural Netw. 22(7), 1149–1161 (2011)

  6. Gong, D., Zhao, X., Medioni, G.: Robust multiple manifold structure learning. In: ICML (2012)

  7. Goldberg, A.B., Zhu, X., Singh, A., Xu, Z., Nowak, R.: Multi-manifold semi-supervised learning. In: AISTATS, pp. 169–176 (2009)

  8. Arias-Castro, E., Lerman, G., Zhang, T.: Spectral clustering based on local PCA. In review (2013)

  9. Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15(6), 1373–1396 (2003)

  10. Donoho, D.L., Grimes, C.: Hessian eigenmaps: locally linear embedding techniques for high-dimensional data. Proc. Natl. Acad. Sci. USA 100(10), 5591–5596 (2003)

  11. Coifman, R.R., Lafon, S., Lee, A.B., Maggioni, M., Warner, F., Zucker, S.: Geometric diffusions as a tool for harmonic analysis and structure definition of data: diffusion maps. Proc. Natl. Acad. Sci. USA 102(21), 7426–7431 (2005)

  12. Zhang, Z., Zha, H.: Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Sci. Comput. 26(1), 313–338 (2005)

  13. Brand, M.: Charting a manifold. In: Advances in Neural Information Processing Systems, pp. 985–992 (2003)

  14. Lin, T., Zha, H.: Riemannian manifold learning. IEEE Trans. Pattern Anal. Mach. Intell. 30(5), 796–809 (2008)

  15. Dollár, P., Rabaud, V., Belongie, S.: Non-isometric manifold learning: analysis and an algorithm. In: Proceedings of the 24th International Conference on Machine Learning, pp. 241–248 (2007)

  16. Singer, A., Wu, H.: Vector diffusion maps and the connection Laplacian. Commun. Pure Appl. Math. 65(8), 1067–1144 (2012)

  17. Zelnik-Manor, L., Perona, P.: Self-tuning spectral clustering. In: Advances in Neural Information Processing Systems, pp. 1601–1608 (2004)

  18. Gionis, A., Hinneburg, A., Papadimitriou, S., Tsaparas, P.: Dimension induced clustering. In: LWA, pp. 109–110 (2005)

  19. Vidal, R., Ma, Y., Sastry, S.: Generalized principal component analysis (GPCA). In: CVPR (2003)

  20. Elhamifar, E., Vidal, R.: Sparse subspace clustering. In: CVPR, pp. 2790–2797 (2009)

  21. Chen, G., Lerman, G.: Spectral curvature clustering (SCC). Int. J. Comput. Vis. 81(3), 317–330 (2009)

  22. Ng, A., Jordan, M., Weiss, Y.: On spectral clustering: analysis and an algorithm. In: Advances in Neural Information Processing Systems, pp. 849–856 (2001)

  23. Deutsch, S., Medioni, G.G.: Unsupervised learning using the tensor voting graph. In: Proceedings of Scale Space and Variational Methods in Computer Vision, 5th International Conference (SSVM 2015), Lège-Cap Ferret, 31 May–4 June 2015, pp. 282–293

  24. Deutsch, S., Medioni, G.G.: Intersecting manifolds: detection, segmentation, and labeling. In: Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 2015), Buenos Aires, 25–31 July 2015, pp. 3445–3452

  25. Biederman, I.: Recognition-by-components: a theory of human image understanding. Psychol. Rev. 94, 115–147 (1987)

  26. Waltz, D.L.: Generating semantic descriptions from drawings of scenes with shadows. Technical Report, Cambridge, MA (1972)

  27. Mordohai, P., Medioni, G.: Dimensionality estimation, manifold learning and function approximation using tensor voting. J. Mach. Learn. Res. 11, 411–450 (2010)

  28. Niyogi, P., Smale, S., Weinberger, S.: Finding the homology of submanifolds with high confidence from random samples. Discrete Comput. Geom. 39(1), 419–441 (2008)

  29. Dijkstra, E.: Communication with an Automatic Computer. Ph.D. thesis, University of Amsterdam (1959)

  30. Genovese, C.R., Perone-Pacifico, M., Verdinelli, I., Wasserman, L.: Minimax manifold estimation. J. Mach. Learn. Res. 13, 1263–1291 (2012)

  31. Waltz, D.: Understanding line drawings of scenes with shadows. In: The Psychology of Computer Vision. McGraw-Hill (1975)

  32. Mordohai, P., Medioni, G.: Junction inference and classification for figure completion using tensor voting. In: IEEE Conference on Computer Vision and Pattern Recognition Workshops, vol. 4, p. 56 (2004)

  33. Tang, C.-K., Medioni, G.G.: Inference of integrated surface, curve, and junction descriptions from sparse 3D data. IEEE Trans. Pattern Anal. Mach. Intell. 20(11), 1206–1223 (1998)

  34. von Luxburg, U.: A tutorial on spectral clustering. Stat. Comput. 17(4), 395–416 (2007)

  35. Li, Z., Guo, J., Cheong, L.F., Zhou, S.Z.: Perspective motion segmentation via collaborative clustering. In: ICCV, pp. 1369–1376 (2013)

  36. Arias-Castro, E., Chen, G., Lerman, G.: Spectral clustering based on local linear approximations. Electron. J. Stat. 5, 1537–1587 (2011)

  37. Martin, S., Pollock, S.N., Coutsias, E.A., Watson, J.P., Brown, W.M.: Algorithmic dimensionality reduction for molecular structure analysis. J. Chem. Phys. 129(6), 064118 (2008)

  38. Gong, D., Medioni, G.: Dynamic manifold warping for view invariant action recognition. In: IEEE International Conference on Computer Vision, pp. 571–578 (2011)
Author information

Corresponding author

Correspondence to Shay Deutsch.

Appendix: Theoretical Analysis

In this section we prove that, given two intersecting sub-manifolds, under certain conditions the maximal principal angle between the local tangent spaces is smaller for points that belong to the same manifold in the local intersection area. Thus, given three points sufficiently close to the local intersection area, of which only two belong to the same manifold, the maximal principal angle is smaller between the pair that belongs to the same manifold. This result motivates our ambiguity resolution algorithm, which allows us to untangle the manifolds in the intersection area. For simplicity, the proof is given for the case of \(K=2\) intersecting manifolds; it can be extended to \(K>2\).

Preliminary Definitions Let \(\mathbb {R}^{N}\) be the ambient space. Let \(M_{d}(\varvec{k})\) denote the class of connected, \(C^{2}\), compact d-dimensional manifolds without boundary embedded in \(\mathbb {R}^{N}\) with reach at least \(\frac{1}{\varvec{k}}\); the reach is a notion used to quantify smoothness [8].

Formally, the reach of \(M\subset \mathbb {R}^{N}\) is the supremum over \(r>0\) such that, for each \(\mathbf {x}\in M\) and each \(\mathbf {y}\in B(\mathbf {x},r)\), where

$$\begin{aligned} B(\mathbf {x},r)=\left\{ \mathbf {y}\in \mathbb {R}^{N}: ||\mathbf {x}- \mathbf {y}||<r \right\} \end{aligned}$$
(11)

there is a unique point in M nearest to \(\mathbf {y}\). For sub-manifolds without boundary, the reach coincides with the condition number \(1/\tau \) [8].
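To make the definition concrete, the following NumPy sketch (ours, not from the paper) illustrates the reach of the unit circle in \(\mathbb {R}^{2}\), which equals 1: any point \(\mathbf {y}\ne \mathbf {0}\) projects uniquely onto the circle at \(\mathbf {y}/\Vert \mathbf {y}\Vert \), and we check this against a dense sample of the circle.

```python
import numpy as np

# Illustration of reach on the unit circle in R^2 (reach = 1): every point y
# within distance < 1 of the circle (in fact, any y != 0) has the unique
# nearest circle point y / ||y||. We verify this against a dense sample.
rng = np.random.default_rng(1)
ts = np.linspace(0.0, 2.0 * np.pi, 10000, endpoint=False)
circle = np.stack([np.cos(ts), np.sin(ts)], axis=1)
max_err = 0.0
for _ in range(100):
    y = rng.uniform(-0.9, 0.9, size=2)
    if np.linalg.norm(y) < 1e-3:
        continue  # the center is the one nearby point with no unique projection
    nearest = y / np.linalg.norm(y)          # closed-form nearest point
    empirical = circle[np.linalg.norm(circle - y, axis=1).argmin()]
    max_err = max(max_err, float(np.linalg.norm(empirical - nearest)))
```

The excluded center is exactly the point at distance 1 from the circle with infinitely many nearest points, which is why the reach is 1 and not larger.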

Let \(M_{1}, M_{2} \in M_{d}(\varvec{k})\) be two connected, \(C^{2}\), compact d-dimensional manifolds. Define \(X_{J}\) as the set of points where \(M_{1}\) and \(M_{2}\) intersect:

$$\begin{aligned} X_{J}= \left\{ \mathbf {x}\in M_{1}\cap M_{2} | \mathbf {x}=\mathbf {s}_{\mathbf {1}}=\mathbf {s}_{\mathbf {2}}, \mathbf {s}_{\mathbf {1}} \in M_{1}, \mathbf {s}_{\mathbf {2}} \in M_{2} \right\} \end{aligned}$$
(12)

Let \(\theta _J\) be the set of all maximal principal angles \(\theta _{\max }\) attained at the intersection points:

$$\begin{aligned} \theta _J(M_{1},M_{2})=\left\{ \theta _{\max }\left( T_{M_1}(\mathbf {s}_\mathbf{1}),T_{M_{2}}(\mathbf {s}_{\mathbf {2}}) \right) |\mathbf {s}_{\mathbf {1}},\mathbf {s}_{\mathbf {2}}\in X_{J} \right\} \end{aligned}$$
(13)

and also let

$$\begin{aligned} \theta _{J}^{\inf }= \underset{(\mathbf {s}_{\mathbf {1}},\mathbf {s}_{\mathbf {2}}) }{\inf } \left\{ \theta _{\max }(\mathbf {s}_{\mathbf {1}},\mathbf {s}_{\mathbf {2}}) \,|\, \theta _{\max } (\mathbf {s}_{\mathbf {1}}, \mathbf {s}_{\mathbf {2}}) \in \theta _J \right\} \end{aligned}$$
(14)

to be the infimum over all possible maximal principal angles at the intersection points.
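In practice, the maximal principal angle between two estimated tangent spaces can be computed from the singular values of the product of their orthonormal bases. The following NumPy sketch shows one standard way to do this; the helper name `max_principal_angle` is ours, not an API of the paper.

```python
import numpy as np

def max_principal_angle(U, V):
    """Maximal principal angle between the subspaces spanned by the columns
    of U and V. Columns are orthonormalized first; the principal angles are
    the arccosines of the singular values of Qu^T Qv, so the smallest
    singular value gives the largest angle."""
    Qu, _ = np.linalg.qr(U)
    Qv, _ = np.linalg.qr(V)
    s = np.linalg.svd(Qu.T @ Qv, compute_uv=False)
    return np.arccos(np.clip(s.min(), -1.0, 1.0))

# Two planes in R^3: the xy-plane and a plane tilted by 45 degrees about the x-axis.
U = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
V = np.array([[1.0, 0.0], [0.0, np.cos(np.pi / 4)], [0.0, np.sin(np.pi / 4)]])
theta = max_principal_angle(U, V)  # ≈ pi/4
```

The shared x-direction contributes a principal angle of 0, while the tilted second direction contributes the maximal angle \(\pi /4\).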

To prove our main claim, we need the following lemma, which was proved in [8]. This result gives a bound on the maximal principal angle between local tangent spaces on a smooth manifold that is much tighter than the one obtained in [28], and is therefore the one we use.

Lemma 1

[8]. For \(M \in M_{d}(\varvec{k})\) and any \(\mathbf {s}, \mathbf {s^{\prime }} \in M\)

$$\begin{aligned} \theta _{\max }\left( T_{M}(\mathbf {s}),T_{M}(\mathbf {s^{\prime }}) \right) < 2 \text{ asin }\left( \min \left\{ \frac{\varvec{k}}{2} \left\| \mathbf {s^{\prime }}-\mathbf {s} \right\| ,1\right\} \right) \end{aligned}$$
(15)

Lemma 2

Suppose \(M_{1}, M_{2} \in M_{d}({\varvec{k}})\). Assume that \(\theta _{J}(\mathbf {s}_{\mathbf {1}},\mathbf {s}_{\mathbf {2}})>0\) for all \(\mathbf {s}_{\mathbf {1}}, \mathbf {s}_{\mathbf {2}} \in X_{J}\), and that the intersection set has strictly positive reach. Let \(\mathbf {y}, \mathbf {t}, \mathbf {z}\) be three points and assume, without loss of generality, that \(\mathbf {y}, \mathbf {t} \in M_{1}\) and \(\mathbf {z} \in M_{2}\). Then, for \(r=\min \left\{ \text{ asin }(\theta _{J}^{\inf }), \frac{\theta _{J}^{\inf }}{4\varvec{k}},1\right\} \) and for all \(\mathbf {y}, \mathbf {t} \in M_{1}\), \(\mathbf {z} \in M_{2}\) with \(\mathbf {y}, \mathbf {t}, \mathbf {z} \in B(\mathbf {x},r)\) for some \(\mathbf {x} \in X_{J}\), the following inequality is satisfied:

$$\begin{aligned} \theta _{\max }\left( T_{M_1}(\mathbf {t}),T_{M_1}(\mathbf {y}) \right) < \theta _{\max }\left( T_{M_1}(\mathbf {y}),T_{M_2}(\mathbf {z}) \right) \end{aligned}$$
(16)

Proof

We first prove that \(\inf _{\theta \in \theta _J}\theta =C_{1}>0\). We claim that there exists \(C_{1}>0\) such that for all \(\mathbf {x} \in X_{J}\), where \(\mathbf {x}=\mathbf {s^{\prime }}=\mathbf {t^{\prime }}\), \(\mathbf {s^{\prime }} \in M_{1}, \mathbf {t^{\prime }} \in M_{2}\), and for all \(\mathbf {s}\in M_{1}, \mathbf {t}\in M_{2}\) with \(\mathbf {s}, \mathbf {t} \in B(\mathbf {x},\epsilon )\), we have \(\theta _{\max } \left( T_{M_{1}}(\mathbf {s}), T_{M_{2}} (\mathbf {t}) \right) >0\). Assume, to the contrary, that there exists \(\mathbf {x} \in X_{J}\), \(\mathbf {x}=\mathbf {s^{\prime }}=\mathbf {t^{\prime }}\), such that for all \(\epsilon >0\) there exist \(\mathbf {s}\in M_{1}, \mathbf {t}\in M_{2}\), \(\mathbf {s}, \mathbf {t} \in B(\mathbf {x},\epsilon )\), with \(\theta _{\max } \left( T_{M_{1}}(\mathbf {s}), T_{M_{2}} (\mathbf {t}) \right) =0\). By this assumption, there exist sequences \(\left\{ \mathbf {s}_{n} \right\} _{n=1}^{\infty }, \left\{ \mathbf {t}_{n} \right\} _{n=1}^{\infty }\) such that \(\theta _{\max }\left( T_{M_{1}}(\mathbf {s}_{n}),T_{M_{2}}(\mathbf {t}_{n}) \right) =0\) for all n. Since \(M_{2}\) is compact, there exists a converging sub-sequence \(\left\{ \mathbf {t}_{n_k} \right\} _{k=1}^{\infty }\subset M_{2}\) such that \(\mathbf {t}_{n_k} \rightarrow \mathbf {t^{\prime }} \in M_{2}\), and hence \(\theta _{\max }\left( T_{M_1}(\mathbf {s}_{n_k}),T_{M_2}(\mathbf {t^{\prime }}) \right) =0\) for all k. Since \(M_{1}\) is compact, \(\left\{ \mathbf {s}_{n_k} \right\} _{k=1}^{\infty }\) has a converging sub-sequence (again denoted \(\left\{ \mathbf {s}_{n_k} \right\} \)) with \(\mathbf {s}_{n_k} \rightarrow \mathbf {s^{\prime }} \in M_{1}\). Thus \(\theta _{\max }\left( T_{M_1}(\mathbf {s}_{n_k}),T_{M_2}(\mathbf {t^{\prime }}) \right) =0\), while \(\theta _{\max }\left( T_{M_1}(\mathbf {s^{\prime }}),T_{M_2}(\mathbf {t^{\prime }}) \right) >0\).
Letting \(\mathbf {v}_{\mathbf {s}_{n_k}}, \mathbf {v}_{\mathbf {t}_{n_k}}\) denote the vectors corresponding to the maximal principal angle between the local tangent spaces \(T_{M_1}(\mathbf {s}_{n_k})\) and \(T_{M_{2}}(\mathbf {t}_{n_k})\), we have

$$\begin{aligned} \left| \left\langle \mathbf {v}_{\mathbf {s}_{n_k}},\mathbf {v}_{\mathbf {t^{\prime }}} \right\rangle \right| =1, \quad \left| \left\langle \mathbf {v}_{\mathbf {s^{\prime }}}, \mathbf {v}_{\mathbf {t^{\prime }}} \right\rangle \right| \ne 1, \end{aligned}$$

while \(\mathbf {v}_{\mathbf {s}_{n_k}} \rightarrow \mathbf {v}_{\mathbf {s^{\prime }}}\) as \(k \rightarrow \infty \), which is a contradiction, since the inner product is continuous.

We use the results above to conclude the proof. Let \(r=\min \left\{ \text{ asin }(\theta _{J}^{\inf }), \frac{\theta _{J}^{\inf }}{4\varvec{k}},1\right\} \). Without loss of generality, choose \(\mathbf {s}, \mathbf {s^{\prime }} \in M_{1}\) and \(\mathbf {t} \in M_{2}\), where \(\mathbf {s}, \mathbf {s^{\prime }}, \mathbf {t} \in B(\mathbf {x},r)\) with \(\mathbf {x} \in X_{J}\). Using Lemma 1, the monotonicity of the arc-sine function, and the elementary bound \(\text{ asin }(x)\le 2x\) for \(x\in [0,1]\), we have:

$$\begin{aligned} \theta _{\max }\left( T_{M_1}(\mathbf {s}),T_{M_1}(\mathbf {s^{\prime }}) \right) \le 2 \text{ asin }\left( \min \left\{ \frac{\varvec{k}}{2} \left\| \mathbf {s^{\prime }}-\mathbf {s} \right\| ,1\right\} \right) \nonumber \\ \le 2\text{ asin }\left( \frac{\varvec{k} r}{2} \right) \le 2\text{ asin }\left( \frac{\theta _{J}^{\inf }}{8}\right) \le 2\left( \frac{\theta _{J}^{\inf }}{4}\right) =\frac{\theta _{J}^{\inf }}{2} \end{aligned}$$
(17)

On the other hand, we have that \(\theta _{\max }\left( T_{M_{1}}(\mathbf {s}),T_{M_{2}}(\mathbf {t}) \right) > \theta _{J}^{\inf }/2 \) for each \(\mathbf {s} \in M_{1}, \mathbf {t} \in M_{2}\), and thus we have that \(\theta _{\max }\left( T_{M_1}(\mathbf {s}),T_{M_1}(\mathbf {s^{\prime }}) \right) < \theta _{\max }\left( T_{M_1}(\mathbf {s}),T_{M_2}(\mathbf {t}) \right) .\) \(\square \)
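The way this lemma motivates the ambiguity resolution rule can be illustrated on a toy configuration (our sketch; a circle and a line stand in for \(M_{1}\) and \(M_{2}\), and the tangents are given analytically rather than estimated by tensor voting): among three points near the intersection, the pair with the smallest maximal principal angle between tangent spaces is declared to lie on the same manifold.

```python
import numpy as np

def line_angle(u, v):
    """Acute angle between the tangent lines spanned by unit vectors u and v."""
    return np.arccos(np.clip(abs(u @ v), 0.0, 1.0))

def circle_tangent(a):
    """Unit tangent of the unit circle at angle a."""
    return np.array([-np.sin(a), np.cos(a)])

# Toy setting: M1 is the unit circle, M2 the horizontal line y = 1/2; they
# intersect at angle a0 = pi/6. Points y, t lie on M1 near the intersection,
# z lies on M2.
a0 = np.pi / 6
t_y = circle_tangent(a0 + 0.05)  # tangent at y (on M1)
t_t = circle_tangent(a0 - 0.05)  # tangent at t (on M1)
t_z = np.array([1.0, 0.0])       # tangent at z (on M2)

pairs = {('y', 't'): line_angle(t_y, t_t),
         ('y', 'z'): line_angle(t_y, t_z),
         ('t', 'z'): line_angle(t_t, t_z)}
# The pair with the smallest maximal principal angle is declared same-manifold.
same_manifold = min(pairs, key=pairs.get)
```

Here the same-manifold pair \((\mathbf {y},\mathbf {t})\) has angle 0.1 rad, far below the roughly 1 rad separating either of them from \(\mathbf {z}\), matching the inequality of Lemma 2.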

Cite this article

Deutsch, S., Medioni, G. Learning the Geometric Structure of Manifolds with Singularities Using the Tensor Voting Graph. J Math Imaging Vis 57, 402–422 (2017). https://doi.org/10.1007/s10851-016-0684-2

Keywords

  • Tensor voting
  • Manifold learning
  • Unsupervised learning
  • Intersecting manifolds