Locally Adaptive Frames in the Roto-Translation Group and Their Applications in Medical Imaging
Abstract
Locally adaptive differential frames (gauge frames) are a well-known effective tool in image analysis, used in differential invariants and PDE-flows. However, at complex structures such as crossings or junctions, these frames are not well defined. Therefore, we generalize the notion of gauge frames on images to gauge frames on data representations \(U:\mathbb {R}^{d} \rtimes S^{d-1} \rightarrow \mathbb {R}\) defined on the extended space of positions and orientations, which we relate to data on the roto-translation group SE(d), \(d=2,3\). This allows us to define multiple frames per position, one per orientation. We compute these frames via exponential curve fits in the extended data representations in SE(d). These curve fits minimize first- or second-order variational problems which are solved by spectral decomposition of, respectively, a structure tensor or Hessian of data on SE(d). We include these gauge frames in differential invariants and crossing-preserving PDE-flows acting on the extended data representation U and we show their advantage compared to the standard left-invariant frame on SE(d). Applications include crossing-preserving filtering and improved segmentations of the vascular tree in retinal images, and new 3D extensions of coherence-enhancing diffusion via invertible orientation scores.
Keywords
Roto-translation group · Gauge frames · Exponential curves · Nonlinear diffusion · Left-invariant image processing · Orientation scores
1 Introduction
In this article, our quest is to find locally optimal differential frames in SE(d), relying on Hessian and/or structure-tensor techniques similar to those used for gauge frames on images, recall Fig. 1. The frames can then be used to construct crossing-preserving differential invariants and adaptive diffusions of data in SE(d). In order to find these optimal frames, our main tool is the theory of curve fits. Early works on curve fits [37, 54] apply the notion of curvature consistency to infer local curve orientations, based on neighborhood co-circularity continuation criteria. This approach was extended to 2D texture flow inference in [7], by lifting images into the position and orientation domain and inferring multiple Cartan frames at each point. Our work is embedded in a Lie group framework where we consider the notion of exponential curve fits via formal variational methods. Exponential curves in the curved geometry of SE(d) are the equivalents of straight lines in Euclidean geometry. If \(d=2\), the spatial projections of these exponential curves are osculating circles, which are used for constructing the curvature consistency in [54], for defining the tensor voting fields in [52], and for locally modeling association fields in [19]. If \(d=3\), the spatial projections of exponential curves are spirals with constant curvature and torsion. Based on co-helicity principles, similar spirals have been used in neuroimaging applications [61] and for modeling heart fibers [62]. In these works curve fits are obtained via efficient discrete optimization techniques, which are beyond the scope of this article.
The main contribution of this article is to provide a general theory for finding locally adaptive frames in the roto-translation group SE(d), for \(d=2,3\). Some preliminary work on exponential curve fits of the second order on SE(2) has been presented in [34, 35, 63]. In this paper we formalize these previous methods (Theorems 2 and 3) and we extend them to first-order exponential curve fits (Theorem 1). Furthermore, we generalize both approaches to the case \(d=3\) (Theorems 4, 5, 6, 7, and 8). All theorems contain new results except for Theorems 2 and 3. The key ingredient is to consider the fits as formal variational curve optimization problems with exact solutions derived by spectral decomposition of structure tensors and Hessians of the data \(\tilde{U}\) on SE(d). In the SE(3) case we show that, in order to obtain torsion-free exponential curve fits with a well-posed projection on \(\mathbb {R}^{3}\rtimes S^{2}\), one must resort to a twofold optimization algorithm. To show the potential of these locally adaptive frames, we employ them in medical image analysis applications, in improved differential invariants and improved crossing-preserving diffusions. Here, we provide for the first time coherence-enhancing diffusions via 3D invertible orientation scores [42, 43], extending previous methods [34, 35, 63] to the 3D Euclidean motion group.
1.1 Structure of the Article
We start the body of this article by reviewing preliminary differential geometry tools in Sect. 2. Then, in Sect. 3, we describe how a given exponential curve fit induces a locally adaptive frame. In Sect. 4 we provide an introduction by reformulating the standard gauge frame construction on images in a group-theoretical setting. This gives a roadmap towards the SE(2) extensions explained in Sect. 5, where we deal with exponential curve fits of the first order in Sect. 5.2, computed via a structure tensor, and exponential curve fits of the second order in Sect. 5.3, computed via the Hessian of the data \(\tilde{U}\). In the latter case we have two options for the curve optimization problem: one solved by the symmetric sum, and one by the symmetric product, of the non-symmetric Hessian. The curve fits in SE(2) in Sect. 5 are extended to curve fits in SE(3) in Sect. 6, which starts with preliminaries on the quotient (3) and then follows the same structure as the previous section. Here we present the twofold algorithm for computing torsion-free exponential curve fits.
In Sect. 7 we consider experiments regarding medical imaging applications and feasibility studies. We first recall the theory of invertible orientation scores needed for the applications. In the SE(2) case we present crossing-preserving multi-scale vessel enhancing filters in retinal imaging, and in the SE(3) case we include a proof of concept of crossing-preserving coherence-enhancing diffusion steered by gauge frames via invertible 3D orientation scores.
Finally, there are 5 appendices. Appendix 1 supplements Sect. 3 by explaining the construction of the frame for \(d=2,3\). Appendix 2 describes the geometry of neighboring exponential curves needed for formulating the variational problems. Appendix 3 complements the twofold approach in Sect. 6. Appendix 4 provides the definition of the Hessian used in the paper. Finally, Table 1 in Appendix 5 contains a list of symbols, their explanations, and references to the equations in which they are defined. We advise the reader to keep track of this table, especially in the more technical Sects. 5 and 6.
2 Differential Geometrical Tools
Relating our data to data on the Euclidean motion group, via Eq. (1), allows us to use tools from Lie group theory and differential geometry. In this section we explain these tools that are important for our notion of an exponential curve fit to smooth data \(\tilde{U}:SE(d) \rightarrow \mathbb {R}\). Often, we consider the case \(d=2\) for basic illustration. Later on, in Sect. 6, we consider the case \(d=3\) and extra technicalities on the quotient structure will enter.
2.1 The Roto-Translation Group
2.2 Left-Invariant Operators
2.3 Left-Invariant Vector Fields and Dual Frame
A special case of left-invariant operators are the left-invariant derivatives. More precisely (see Remark 1 below), we need to consider left-invariant vector fields \(g \mapsto \mathcal {A}_{g}\), as the left-invariant derivative \(\mathcal {A}_{g}\) depends on the location g where it is attached. Intuitively, the left-invariant vector fields \(\{\mathcal {A}_{i}\}_{i=1}^{n_d}\) provide a local moving frame of reference in the tangent bundle T(SE(d)), which comes in naturally when including alignment of local orientations in the image processing of \(\tilde{U}\).
Remark 1
In differential geometry, there exist two equivalent viewpoints [3, Ch. 2] on tangent vectors \(\mathcal {A}_{g} \in T_{g}(SE(d))\): either one considers them as tangents to locally defined curves; or one considers them as differential operators on locally defined functions. The connection between these viewpoints is as follows. We identify a tangent vector \(\dot{\tilde{\gamma }}(t) \in T_{\tilde{\gamma }(t)}(SE(d))\) with the differential operator \((\dot{\tilde{\gamma }}(t))(\tilde{\phi }) := \frac{d}{dt} \tilde{\phi }(\tilde{\gamma }(t))\) for all locally defined, differentiable, realvalued functions \(\tilde{\phi }\).
Example 1
2.4 Exponential Curves in SE(d)
Example 2
Example 3
2.5 Left-Invariant Metric Tensor on SE(d)
2.6 Convolution and Haar Measure on SE(d)
2.7 Gaussian Smoothing and Gradient on SE(d)
2.8 Horizontal Exponential Curves in SE(d)
Typically, in the distribution \(\tilde{U}\) (e.g., if \(\tilde{U}\) is an orientation score of a grayscale image), the mass is concentrated around so-called horizontal exponential curves in SE(d) (see Fig. 3). Next we explain this notion of horizontal exponential curves.
- For \(d=2\), we have the restriction \(\dot{\mathbf{x }}(t)=(\dot{x}(t), \dot{y}(t))=\Vert \dot{\mathbf{x }}(t)\Vert (\cos \theta (t),\sin \theta (t))\), i.e.,
$$\begin{aligned} \dot{\tilde{\gamma }} \in \left. \Delta \right| _{\tilde{\gamma }}, \quad \text {with }\Delta= & {} \text {span}\{\cos \theta \, \partial _x+\sin \theta \, \partial _y, \ \partial _{\theta }\}\nonumber \\= & {} \text {span}\{\mathcal {A}_1,\mathcal {A}_3\}, \end{aligned}$$
(32)
where \(\Delta \) denotes the so-called horizontal part of the tangent bundle T(SE(2)). See Fig. 4.
- For \(d=3\), we impose the constraint
$$\begin{aligned} \dot{\tilde{\gamma }}(t) \in \varDelta _{\tilde{\gamma }(t)}, \quad \text {with }\varDelta := \text {span}\{\mathcal {A}_{3},\mathcal {A}_{4}, \mathcal {A}_{5}\}, \end{aligned}$$
(33)
where \(\mathcal {A}_3= \mathbf{n } \cdot \nabla _{\mathbb {R}^3}\), since then spatial transport is always along \(\mathbf{n }\), which is required for (31).
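To make the notion of horizontality concrete, the following minimal numpy sketch integrates an exponential curve in SE(2) in closed form, under the standard convention \(\mathcal{A}_1=\cos\theta\,\partial_x+\sin\theta\,\partial_y\), \(\mathcal{A}_2=-\sin\theta\,\partial_x+\cos\theta\,\partial_y\), \(\mathcal{A}_3=\partial_\theta\) (the function name and parameterization are illustrative, not the paper's code), and checks that for \(c^2=0\) the spatial tangent stays aligned with the orientation, with an osculating circle as spatial projection:

```python
import numpy as np

def se2_exp_curve(c, g0=(0.0, 0.0, 0.0), t=np.linspace(0.0, 1.0, 100)):
    """Closed-form exponential curve t -> (x(t), y(t), theta(t)) in SE(2)
    for constant velocity components c = (c1, c2, c3) w.r.t. the
    left-invariant frame {A1, A2, A3}; assumes c3 != 0."""
    c1, c2, c3 = c
    x0, y0, th0 = g0
    th = th0 + c3 * t
    x = x0 + (c1 * (np.sin(th) - np.sin(th0)) + c2 * (np.cos(th) - np.cos(th0))) / c3
    y = y0 + (-c1 * (np.cos(th) - np.cos(th0)) + c2 * (np.sin(th) - np.sin(th0))) / c3
    return x, y, th

# Horizontal curve (c2 = 0): the spatial tangent stays parallel to the
# orientation (cos th, sin th), cf. (32), and the spatial projection is an
# osculating circle of radius |c1/c3| -- here the unit circle around (0, 1).
t = np.linspace(0.0, 2.0 * np.pi, 400)
x, y, th = se2_exp_curve((1.0, 0.0, 1.0), t=t)
dx, dy = np.gradient(x, t), np.gradient(y, t)
cross = dx * np.sin(th) - dy * np.cos(th)   # vanishes iff tangent || orientation
radius = np.hypot(x - 0.0, y - 1.0)
assert np.abs(cross[1:-1]).max() < 1e-3
assert np.allclose(radius, 1.0)
```

With \(c^2\ne 0\) the same formula produces a non-horizontal curve whose spatial tangent is tilted away from \((\cos\theta,\sin\theta)\) by a fixed angle.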
Example 4
Example 5
3 From Exponential Curve Fits to Gauge Frames on SE(d)
In Sects. 5 and 6 we will discuss techniques to find an exponential curve \(\tilde{\gamma }^{\mathbf{c }}_g(t)\) that fits the data \(\tilde{U}:SE(d)\rightarrow \mathbb {R}\) locally. Let \(\mathbf{c }(g)=(\tilde{\gamma }^{\mathbf{c }}_g)'(0)\) be its tangent vector at g.
 1.
the main spatial generator (\(\mathcal {A}_{1}\) for \(d=2\) and \(\mathcal {A}_{d}\) for \(d>2\)) is mapped onto \(\left. \mathcal {B}_{1}\right| _{g}= \sum \nolimits _{i=1}^{n_d}c^{i}(g) \left. \mathcal {A}_{i}\right| _{g}\),
 2.
the spatial generators \(\{\left. \mathcal {B}_{i}\right| _{g}\}^d_{i=2}\) are obtained from the other left-invariant spatial generators \(\{\left. \mathcal {A}_{i}\right| _g\}^d_{i=1}\) by a planar rotation of \(\mathbf{a }\) onto \(\frac{{\mathbf{c }}^{(1)}}{\Vert {\mathbf{c }}^{(1)}\Vert }\) by angle \(\chi \). In particular, if \(\chi =0\), the other spatial generators do not change their direction. This allows us to still distinguish spatial generators and angular generators in our adapted frame.
The construction for \(d>2\) is technical and provided in Theorem A in Appendix 1. However, the whole construction of the rotation matrix \(\mathbf{R }^{\mathbf{c }}\) via a concatenation of two subsequent rotations is similar to the case \(d=2\) that we will explain next.
Remark 2
When imposing isotropy (w.r.t. the metric tensor (23)) in the plane orthogonal to \(\mathcal {B}_{1}\), there is a unique choice of \(\mathbf{R }^{\mathbf{c }}\) mapping \((1,0,0)^T\) onto \((\mu c^{1},\mu c^{2},c^{3})^T\) such that it keeps the other spatial generator in the spatial subspace of \(T_g(SE(2))\) (and with \(\chi =0 \Leftrightarrow \mathcal {B}_{2}=\mu ^{-1}\mathcal {A}_{2}\)). This choice is given by (37).
The generalization to the ddimensional case of the construction of a locally adaptive frame \(\{\mathcal {B}_{i}\}_{i=1}^{n_d}\) from \(\{\mathcal {A}_{i}\}_{i=1}^{n_d}\) and the tangent vector \(\mathbf{c }\) of a given exponential curve fit \(\tilde{\gamma }^{\mathbf{c }}_g(\cdot )\) to data \(\tilde{U}:SE(d)\rightarrow \mathbb {R}\) is explained in Theorem 7 in Appendix 1.
4 Exponential Curve Fits in \(\mathbb {R}^{d}\)
In this section we reformulate, in a group-theoretical way, the classical construction of a locally adaptive frame to an image f at location \(\mathbf{x } \in \mathbb {R}^d\). This reformulation seems technical at first sight, but helps in understanding the formulation of projected exponential curve fits in the higher-dimensional Lie group SE(d).
4.1 Exponential Curve Fits in \(\mathbb {R}^{d}\) of the First Order
We will take the structure tensor approach [9, 48], which will be shown to yield first-order exponential curve fits.
Definition 1
Let \(\mathbf{c }^{*}(\mathbf{x }) \in T_{\mathbf{x }}(\mathbb {R}^{d})\) be the minimizer in (44). We say \(\gamma _{\mathbf{x }}(t)= \mathbf{x }+\exp _{\mathbb {R}^{d}}(t \mathbf{c }^{*}(\mathbf{x }))\) is the first-order exponential curve fit to image data \(f: \mathbb {R}^{d} \rightarrow \mathbb {R}\) at location \(\mathbf{x }\).
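In coordinates, the first-order fit amounts to a pointwise eigenvector problem for the structure tensor: average the outer products of the regularized image gradient over a neighborhood and take the eigenvector with smallest eigenvalue. A minimal self-contained 2D numpy sketch (the helper `_smooth` and the scale names `s`, `rho` are illustrative stand-ins for the Gaussian regularization in (44), not the paper's implementation):

```python
import numpy as np

def _smooth(f, sigma):
    """Separable Gaussian smoothing (truncated kernel, zero-padded borders)."""
    r = int(3 * sigma + 0.5)
    k = np.exp(-np.arange(-r, r + 1) ** 2 / (2.0 * sigma ** 2))
    k /= k.sum()
    f = np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 0, f)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='same'), 1, f)

def structure_tensor_fit(f, s=1.0, rho=3.0):
    """First-order exponential curve fit in R^2: per pixel, return the
    eigenvector of the structure tensor with smallest eigenvalue, i.e. the
    direction of least first-order variation of f."""
    fy, fx = np.gradient(_smooth(f, s))     # axis 0 = y, axis 1 = x
    J = np.empty(f.shape + (2, 2))
    J[..., 0, 0] = _smooth(fx * fx, rho)
    J[..., 0, 1] = J[..., 1, 0] = _smooth(fx * fy, rho)
    J[..., 1, 1] = _smooth(fy * fy, rho)
    w, v = np.linalg.eigh(J)                # eigenvalues in ascending order
    return v[..., :, 0]                     # (cx, cy): smallest-eigenvalue vector

# On a vertical stripe pattern f = cos(x) the direction of least variation
# is the y-axis, so the fitted tangent at the center is (0, +-1).
xx, yy = np.meshgrid(np.linspace(0, 20, 128), np.linspace(0, 20, 128))
c = structure_tensor_fit(np.cos(xx))
assert abs(c[64, 64, 1]) > 0.99 and abs(c[64, 64, 0]) < 0.1
```

The same recipe generalizes verbatim to \(d=3\) with a \(3\times 3\) tensor field.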
4.2 Exponential Curve Fits in \(\mathbb {R}^{d}\) of the Second Order
Remark 3
In general the eigenvalues of Hessian matrix \(\mathbf{H }^s\) do not have the same sign. In this case we still take \(\mathbf {c}^*(\mathbf {x})\) as the eigenvector with smallest absolute eigenvalue (representing minimal absolute principal curvature), though this no longer solves (47).
Definition 2
Let \(\mathbf{c }^{*}(\mathbf{x }) \in T_{\mathbf{x }}(\mathbb {R}^{d})\) be the minimizer in (49). We say \(\gamma _{\mathbf{x }}(t)= \mathbf{x }+\exp _{\mathbb {R}^{d}}(t \mathbf{c }^{*}(\mathbf{x }))\) is the second-order exponential curve fit to image data \(f: \mathbb {R}^{d} \rightarrow \mathbb {R}\) at location \(\mathbf{x }\).
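The second-order fit replaces the structure tensor by the (Gaussian) Hessian and, following Remark 3, picks the eigenvector with smallest absolute eigenvalue when the signs are mixed. A minimal 2D sketch on pre-smoothed data (function names and the Gaussian-ridge test case are illustrative):

```python
import numpy as np

def hessian_fit_2d(fs):
    """Second-order exponential curve fit in R^2 on (already smoothed) data fs:
    per pixel, take the eigenvector of the symmetrized Hessian with smallest
    absolute eigenvalue, i.e. the direction of minimal absolute principal
    curvature (for mixed-sign eigenvalues this follows Remark 3 and is no
    longer the exact minimizer of (49))."""
    fy, fx = np.gradient(fs)
    fyy, fyx = np.gradient(fy)
    fxy, fxx = np.gradient(fx)
    H = np.empty(fs.shape + (2, 2))
    H[..., 0, 0], H[..., 1, 1] = fxx, fyy
    H[..., 0, 1] = H[..., 1, 0] = 0.5 * (fxy + fyx)
    w, v = np.linalg.eigh(H)
    idx = np.abs(w).argmin(axis=-1)         # index of smallest |eigenvalue|
    return np.take_along_axis(v, idx[..., None, None], axis=-1)[..., 0]

# A Gaussian ridge along the y-axis: the principal curvatures are (f_xx, 0),
# so the minimal-absolute-curvature direction at the ridge is (0, +-1).
x = np.linspace(-4.0, 4.0, 129)
xx, yy = np.meshgrid(x, x)
c = hessian_fit_2d(np.exp(-xx ** 2 / 2.0))
assert abs(c[64, 64, 1]) > 0.99 and abs(c[64, 64, 0]) < 0.1
```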
Remark 4
5 Exponential Curve Fits in SE(2)
As mentioned in the introduction, we distinguish between two approaches: a first-order optimization approach based on a structure tensor on SE(2), and a second-order optimization approach based on the Hessian on SE(2). The first-order approach is new, while the second-order approach formalizes the results in [28, 34]. They also serve as an introduction to the new, more technical, SE(3) extensions in Sect. 6.
All curve optimization problems are based on the idea that a curve (or a family of curves) fits the data well if a certain quantity is preserved along the curve. This preserved quantity is the data \(\tilde{U}(\tilde{\gamma }(t))\) for the first-order optimization, and the time derivative \(\frac{d}{dt}\tilde{U}(\tilde{\gamma }(t))\) or the gradient \(\nabla \tilde{U}(\tilde{\gamma }(t))\) for the second-order optimization. After introducing a family of curves similar to the ones used in Sect. 4, we will, for all three cases, first pose an optimization problem and then give its solution in a subsequent theorem.
In this section we rely on group-theoretical tools explained in Sect. 2 (only the case \(d = 2\)), listed in panels (a) and (b) of our table of notations in Appendix 5. Furthermore, we introduce the notations listed in the first part of panel (c) of that table.
5.1 Neighboring Exponential Curves in SE(2)
Akin to (45) we fix a reference point \(g \in SE(2)\) and velocity components \(\mathbf{c }=\mathbf{c }(g) \in \mathbb {R}^3\), and we shall rely on a family \(\{\tilde{\gamma }_{h,g}^{\mathbf{c }}\}\) of neighboring exponential curves around \(\tilde{\gamma }^{\mathbf{c }}_g\). As we will show in Lemma 1 below, the neighboring curve \(\tilde{\gamma }^{\mathbf{c }}_{h,g}\) departs from h and has the same spatial and rotational velocity as the curve \(\tilde{\gamma }^{\mathbf{c }}_g\) departing from g. This geometric idea is visualized in Fig. 6, where it is intuitively explained why one needs the initial velocity vector \(\tilde{\mathbf{R }}_{h^{-1}g}\mathbf{c }\), instead of \(\mathbf{c }\), in the following definition of the exponential curve departing from a neighboring point h close to g.
Definition 3
Lemma 1
Exponential curve \(\tilde{\gamma }_{h,g}^{\mathbf{c }}\) departing from \(h\in SE(2)\) given by (51) has the same spatial and angular velocity as exponential curve \(\tilde{\gamma }^{\mathbf{c }}_g\) departing from \(g \in SE(2)\).
On the Lie algebra level, we have that the initial velocity component vectors of the curves \(\tilde{\gamma }^{\mathbf{c }}_g\) and \(\tilde{\gamma }_{h,g}^{\mathbf{c }}\) relate via \(\mathbf{c } \mapsto \tilde{\mathbf{R }}_{h^{-1}g} \mathbf{c }\).
Proof
The proof follows from the proof of a more general result for the SE(3) case, given later in Lemma 3.
\(\square \)
Additional geometric background is given in Appendix 2.
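The transport \(\mathbf{c }\mapsto \tilde{\mathbf{R }}_{h^{-1}g}\mathbf{c }\) can be illustrated numerically. In the sketch below we assume (this is our reading of the SE(2) case, not a formula quoted from the paper) that the spatial part of \(\mathbf{c }\) is rotated by the orientation difference \(\theta_g-\theta_h\) while the angular part \(c^3\) is unchanged; Lemma 1's statement that both curves carry the same spatial and angular velocity then follows directly:

```python
import numpy as np

def rot(a):
    """Planar rotation matrix by angle a."""
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

# Velocity components (c1, c2, c3) w.r.t. {A1, A2, A3} of the curve through g.
cg = np.array([2.0, 0.5, 1.5])
th_g, th_h = 0.3, 1.1          # orientations of g and of a neighboring point h

# Transported components at h: rotate the spatial part by (th_g - th_h) and
# keep the angular part -- a sketch of c -> R~_{h^{-1}g} c in the SE(2) case.
ch = np.append(rot(th_g - th_h) @ cg[:2], cg[2])

# Both curves then carry the same spatial velocity in world coordinates
# (and trivially the same angular velocity c3), as stated in Lemma 1.
v_g = rot(th_g) @ cg[:2]
v_h = rot(th_h) @ ch[:2]
assert np.allclose(v_g, v_h)
assert np.isclose(np.linalg.norm(cg[:2]), np.linalg.norm(ch[:2]))
```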
5.2 Exponential Curve Fits in SE(2) of the First Order
Theorem 1
(First-Order Fit via Structure Tensor) The normalized eigenvector \(\mathbf{M }_{\mu }\mathbf{c }^{*}(g)\) with smallest eigenvalue of the rescaled structure matrix \(\mathbf{M }_{\mu }\mathbf{S }^{\mathbf{s },\varvec{\rho }}(g)\mathbf{M }_{\mu }\) provides the solution \(\mathbf{c }^{*}(g)\) to optimization problem (54).
Proof
We will apply four steps. In the first step we write the time derivative as a directional derivative; in the second step we express the directional derivative in terms of the gradient. In the third step we put the integrand in matrix-vector form. In the final step we express our optimization functional in terms of the structure tensor and solve the Euler–Lagrange equations.
The next remark explains the frequent presence of the \(\mathbf{M }_{\mu }\) matrices in (69).
Remark 6
The diagonal \(\mathbf{M }_{\mu }\) matrices enter the functional due to the gradient definition (28), and they enter the boundary condition via \(\Vert \mathbf{c }\Vert _{\mu }^2=\mathbf{c }^T \mathbf{M }_{\mu ^2} \mathbf{c }=1\). In both cases they come from the metric tensor (23). The parameter \(\mu \), which controls the stiffness of the exponential curves, has physical dimension \([\text {Length}]^{-1}\). As a result, the normalized eigenvector \(\mathbf {M}_{\mu } \mathbf{c }^{*}(g)\) is, in contrast to \(\mathbf{c }^{*}(g)\), dimensionless.
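Pointwise, Theorem 1 is a small generalized eigenvector computation. The following sketch assumes the convention \(\mathbf{M}_\mu=\mathrm{diag}(\mu,\mu,1)\) for SE(2) (so that \(\mu c^1,\mu c^2,c^3\) are dimensionless) and uses a hand-made diagonal structure matrix purely for illustration:

```python
import numpy as np

def se2_first_order_fit(S, mu):
    """Pointwise solution of Theorem 1: given a 3x3 structure matrix S(g) on
    SE(2) (components w.r.t. {A1, A2, A3}) and stiffness parameter mu, return
    the optimal tangent c*(g), normalized so that ||c*||_mu = 1. Assumes the
    convention M_mu = diag(mu, mu, 1)."""
    M = np.diag([mu, mu, 1.0])
    w, v = np.linalg.eigh(M @ S @ M)   # eigenvalues in ascending order
    d = v[:, 0]                        # dimensionless eigenvector M_mu c*
    return np.linalg.solve(M, d)       # recover c* = M_mu^{-1} (M_mu c*)

# Toy structure matrix with weak variation along A1 and strong variation
# along A2 and A3: the fitted tangent should point essentially along A1.
S = np.diag([0.01, 1.0, 1.0])
c = se2_first_order_fit(S, mu=0.5)
assert abs(c[0]) > abs(c[1]) and abs(c[0]) > abs(c[2])
```

Note how \(\mu\) trades off spatial against angular velocity: a larger \(\mu\) penalizes spatial motion and yields stiffer (straighter) fits.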
5.3 Exponential Curve Fits in SE(2) of the Second Order
Remark 7
As the left-invariant vector fields are non-commutative, there are many ways to define the Hessian matrix on SE(2), since the ordering of the left-invariant derivatives matters. From a differential geometrical point of view our choice (62) is correct, as we motivate in Appendix 4.
Remark 8
In the next two theorems we solve these optimization problems.
Theorem 2
Proof
Theorem 3
Proof
6 Exponential Curve Fits in SE(3)
In this section we generalize the exponential curve fit theory from the preceding section on SE(2) to SE(3). Because our data on the group SE(3) was obtained from data on the quotient \(\mathbb {R}^{3}\rtimes S^{2}\), we will also discuss projections of exponential curve fits onto the quotient.
We start in Sect. 6.1 with some preliminaries on the quotient structure (3). Here we will also introduce the concept of projected exponential curve fits. Subsequently, in Sect. 6.2, we provide basic theory on how to obtain the appropriate family of neighboring exponential curves. More details can be found in Appendix 2. In Sect. 6.3 we formulate exponential curve fits of the first order as a variational problem. For that we define the structure tensor on SE(3), which we use to solve the variational problem in Theorems 4 and 5. Then we present the twofold algorithm for achieving torsion-free exponential curve fits. In Sect. 6.4 we formulate exponential curve fits of the second order as a variational problem. Then we define the Hessian tensor on SE(3), which we use to solve the variational problem in Theorem 6. Again, torsion-free exponential curve fits are accomplished via a twofold algorithm.
Throughout this section we will rely on the differential geometrical tools of Sect. 2, listed in panels (a) and (b) of Table 1 in Appendix 5. We also generalize concepts on exponential curve fits introduced in the previous section to the case \(d=3\) (requiring additional notation). They are listed in panel (c) of the table in Appendix 5.
6.1 Preliminaries on the Quotient \(\mathbb {R}^{3} \rtimes S^2\)
6.1.1 Legal Operators
6.1.2 Projected Exponential Curve Fits
Definition 4
Lemma 2
Proof
6.2 Neighboring Exponential Curves in SE(3)
Here we generalize the concept of a family of neighboring exponential curves (45) in the \(\mathbb {R}^d\) case, and Definition 3 in the SE(2) case, to the SE(3) case.
Definition 5
The next lemma motivates our specific choice of neighboring exponential curves. The geometric idea is visualized in Fig. 7 and is in accordance with Fig. 6 for the SE(2) case.
Lemma 3
Exponential curve \(\tilde{\gamma }_{h,g}^{\mathbf{c }}\) departing from \(h=(\mathbf{x }',\mathbf{R }')\in SE(3)\) given by (84) has the same spatial and rotational velocity as exponential curve \(\tilde{\gamma }^{\mathbf{c }}_g\) departing from \(g=(\mathbf{x },\mathbf{R }) \in SE(3)\).
On the Lie algebra level, we have that the initial velocity component vectors of the curves \(\tilde{\gamma }^{\mathbf{c }}_g\) and \(\tilde{\gamma }_{h,g}^{\mathbf{c }}\) relate via \(\mathbf{c } \mapsto \tilde{\mathbf{R }}_{h^{-1}g} \mathbf{c }\).
Proof
See Appendix 2. \(\square \)
Remark 9
Lemma 3 extends Lemma 1 to the SE(3) case. When projecting the curves \(\tilde{\gamma }_{g}^{\mathbf{c }}\) and \(\tilde{\gamma }_{h,g}^{\mathbf{c }}\) into the quotient, one has that curves \(\tilde{\gamma }_{g}^{\mathbf{c }} \odot (\mathbf{0 },\mathbf{a })\) and \(\tilde{\gamma }_{h,g}^{\mathbf{c }} \odot (\mathbf{0 },\mathbf{a })\) in \(\mathbb {R}^{3}\rtimes S^2\) carry the same spatial and angular velocity.
Remark 10
6.3 Exponential Curve Fits in SE(3) of the First Order
6.3.1 The Structure Tensor on SE(3)
Remark 11
Remark 12
We assume that \(\mathbf{s }=(s_{p}, s_{o})\) and function \(\tilde{U}\) are chosen in such a way that the null space of the structure matrix is precisely equal to \(\mathcal {N}\) (and not larger).
Theorem 4
(First-Order Fit via Structure Tensor) The normalized eigenvector \(\mathbf{M }_{\mu }\mathbf{c }^{*}(g)\) with smallest nonzero eigenvalue of the rescaled structure matrix \(\mathbf{M }_{\mu }\mathbf{S }^{\mathbf{s },\varvec{\rho }}(g)\mathbf{M }_{\mu }\) provides the solution \(\mathbf{c }^{*}(g)\) to optimization problem (88).
Proof
Finally, the constraint \(c^{6}=0\) is included in our optimization problem (88) to exclude the null space (90) from the optimization; therefore, we take the eigenvector with the smallest nonzero eigenvalue, providing us the final result. \(\square \)
6.3.2 Projected Exponential Curve Fits in \(\mathbb {R}^{3}\rtimes S^{2}\)
In the following theorem we summarize the wellposedness of our projected curve fits on data \(U:\mathbb {R}^{3}\rtimes S^2 \rightarrow \mathbb {R}\) and use the quotient structure to simplify the structure tensor.
Theorem 5
Proof
The proof consists of two parts. First we prove that (97) follows from the structure tensor defined in (89). Then we use Lemma 2 to prove that our projected exponential curve fit (98) is well defined. For both we use Theorem 4 as our starting point.
6.3.3 Torsion-Free Exponential Curve Fits of the First Order via a Twofold Approach
Theorem 4 provides us with exponential curve fits that possibly carry torsion. From Eq. (22) we deduce that the torsion norm of such an exponential curve fit is given by \(\tau =\frac{1}{\Vert \mathbf{c }^{(1)}\Vert ^{2}} \left( c^{1}c^{4}+c^{2}c^{5} +c^{3}c^{6}\right) \). Together with the fact that we exclude the null space \(\mathcal {N}\) from our optimization domain by including the constraint \(c^6=0\), this results in insisting on zero torsion along horizontal exponential curves, where \(c^{1}=c^{2}=0\). Along other exponential curves torsion appears if \(c^{1}c^{4}+c^{2}c^{5}\ne 0\).
Now the problem is that insisting, a priori, on zero torsion for horizontal curves while allowing nonzero torsion for other curves is undesirable. On top of this, torsion is a higher-order, less stable feature than curvature. Therefore, we would like to exclude it altogether from the exponential curve fits presented in Theorems 4 and 5, by a different theory and algorithm. The results of the algorithm show that even if structures do have torsion, the local exponential curve fits need not carry torsion in order to achieve good results in the local frame adaptation; see, e.g., Fig. 8.
 Step 1

Estimate at \(g \in SE(3)\) the spatial velocity part \(\mathbf{c }^{(1)}(g)\) from the spatial structure tensor.
 Step 2

Move to a different location \(g_\mathrm{new} \in SE(3)\) where a horizontal exponential curve fit makes sense, and then estimate the angular velocity \(\mathbf{c }^{(2)}\) from the rotation part of the structure tensor there.
Lemma 4
Consider the class of exponential curves with nonzero spatial velocity \(\mathbf{c }^{(1)}\ne \mathbf{0 }\) such that their spatial projections do not have torsion. Within this class the constraint \(c^{6}=0\) does not impose constraints on curvature if and only if the exponential curve is horizontal.
Proof
From these observations we draw the following conclusion for our exponential curve fit algorithms.
Conclusion In order to allow for all possible curvatures in our torsion-free exponential curve fits, we must relocate the exponential curve optimization at \(g \in SE(3)\) in \(\tilde{U}:SE(3) \rightarrow \mathbb {R}\) to a position \(g_\mathrm{new} \in SE(3)\) where a horizontal exponential curve can be expected. Subsequently, we can use Lemma 3 to transport the horizontal and torsion-free curve through \(g_\mathrm{new}\) back to a torsion-free exponential curve through g.
This conclusion is the central idea behind our following twofold algorithm for exponential curve fits.
6.3.4 Algorithm Twofold Approach
The algorithm follows the subsequent steps:
Step 1a Initialization. Compute structure tensor \(\mathbf {S}^{\mathbf{s },\varvec{\rho }}(g)\) from input image \(U:\mathbb {R}^{3} \times S^{2} \rightarrow \mathbb {R}^{+}\) via Eq. (97).
Remark 13
Lemma 5
The preceding algorithm is well defined on the quotient \(\mathbb {R}^{3}\rtimes S^{2}=SE(3)/(\{{\mathbf {0}}\}\times SO(2))\).
Proof
To show that the preceding algorithm is well defined on the quotient, we need to show that the final result (104) is independent of both the choice of \(\mathbf{R }_{\mathbf{n }} \in SO(3)\) s.t. \(\mathbf{R }_{\mathbf{n }}\mathbf e _{z}=\mathbf{n }\) and the choice of \(\mathbf{R }_{\mathbf{n }_\mathrm{new}} \in SO(3)\) s.t. \(\mathbf{R }_{\mathbf{n }_\mathrm{new}}\mathbf e _{z}=\mathbf{n }_\mathrm{new}\).
Finally, Eq. (104) is independent of the choice of \(\mathbf{R }_{\mathbf{n }_\mathrm{new}}\). This follows from \(\mathbf{c }_\mathrm{new}(g h_{\alpha }) = \mathbf Z _{\alpha }^T \mathbf{c }_\mathrm{new}(g)\) in Step 2a. Then \(\mathbf{c }^{*}_\mathrm{final}\) in Eq. (103) is independent of the choice of \(\mathbf{R }_{\mathbf{n }_\mathrm{new}}\) because \(\mathbf Z _{\alpha }^{T}\) in \(\mathbf{c }_\mathrm{new} \mapsto \mathbf Z _{\alpha }^{T} \mathbf{c }_\mathrm{new}\) is canceled by \(\mathbf{R }_\mathrm{new} \mapsto \mathbf{R }_\mathrm{new} \mathbf{R }_\mathbf{e _{z},\alpha }\) in Eq. (103). \(\square \)
In Fig. 8 we provide an example of spatially projected exponential curve fits in SE(3) via the twofold approach. Here we see that the resulting gauge frames follow the curvilinear structures of the data better than the standard left-invariant frame does.
6.4 Exponential Curve Fits in SE(3) of the Second Order
In this section we will generalize Theorem 2 to the case \(d=3\), where again we include the restriction to torsion-free exponential curves.
6.4.1 The Hessian on SE(3)
Theorem 6
(Second-Order Fit via Symmetric Sum Hessian) Let \(g \in SE(3)\) be such that the symmetrized Hessian matrix \(\frac{1}{2} \mathbf{M }_{\mu }^{-1}( \mathbf{H }^{\mathbf{s }}(g)+ (\mathbf{H }^{\mathbf{s }}(g))^T) \mathbf{M }_{\mu }^{-1}\) has eigenvalues with the same sign. Then the normalized eigenvector \(\mathbf{M }_{\mu }\mathbf{c }^{*}(g)\) with smallest absolute nonzero eigenvalue of the symmetrized Hessian matrix provides the solution \(\mathbf{c }^{*}(g)\) of optimization problem (106).
Proof
Similar to the proof of Theorem 2 (only now with summations from 1 to 5). Again we include our additional constraint \(c^6=0\) by taking the smallest nonzero eigenvalue. \(\square \)
Remark 14
 1.
Move towards a neighboring point where the Hessian eigenvalues have the same sign and apply transport (Lemma 3, Fig. 7) of the exponential curve fit at the neighboring point.
 2.
Take \(\mathbf{c }^{*}(g)\) still as the eigenvector with smallest absolute eigenvalue (representing minimal absolute principal curvature), though this no longer solves (106).
6.4.2 Torsion-Free Exponential Curve Fits of the Second Order via a Twofold Algorithm
In order to obtain torsion-free exponential curve fits of the second order via our twofold algorithm, we follow the same algorithm as in Sect. 6.3.4, but now with the Hessian field \(\mathbf{H }^{\mathbf{s }}\) (107) instead of the structure tensor field.
Step 1a Initialization. Compute Hessian \(\mathbf {H}^{\mathbf{s }}(g)\) from input image \(U:\mathbb {R}^{3} \times S^{2} \rightarrow \mathbb {R}^{+}\) via Eq. (107).
Step 1b Find the optimal spatial velocity by (100) where we replace \(\mathbf{M }_{\mu ^2} \mathbf {S}^{\mathbf{s },\varvec{\rho }}(g) \mathbf{M }_{\mu ^2} \) by \(\mathbf {H}^{\mathbf{s }}(g)\).
Step 2a We again fit a horizontal curve at \(g_{new}\) given by (101). The procedure is done via (102) where we again replace \(\mathbf{M }_{\mu ^2} \mathbf {S}^{\mathbf{s },\varvec{\rho }}(g) \mathbf{M }_{\mu ^2} \) by \(\mathbf {H}^{\mathbf{s }}(g)\).
Step 2b Remains unchanged. We again apply Eq. (103) and Eq. (104).
There are some serious computational technicalities in the efficient computation of the entries of the Hessian for discrete input data, but this is outside the scope of this article and will be pursued in future work.
Remark 15
In Appendix 3 we propose another twofold second-order exponential curve fit method. Here one solves a variational problem for exponential curve fits where the exponentials are factorized over, respectively, the spatial and the angular part. Empirically, this approach performs well (see, e.g., Fig. 9).
7 Image Analysis Applications
In this section we present examples of applications where gauge frames in SE(d), obtained via exponential curve fits, are used to define data-adaptive left-invariant operators. Before presenting the applications, we briefly summarize the invertible orientation score theory in Sect. 7.1.
In the case \(d=2\), the application presented is the enhancement of the vascular tree structure in 2D retinal images via differential invariants based on gauge frames. This is achieved by extending the classical Frangi vesselness filter [36] to distributions \(\tilde{U}\) on SE(2). Gauge frames in SE(2) can also be used in nonlinear multiple-scale crossing-preserving diffusions, as demonstrated in [63], but we will not discuss this application in this paper.
In the case \(d=3\), the envisioned applications include blood vessel detection in 3D MR angiography, e.g., the detection of the Adamkiewicz vessel, relevant for surgery planning. The nonlinear diffusions are also of interest in extensions towards fiber enhancement of diffusion-weighted MRI [29, 30]. Some preliminary practical experiments have been conducted on such 3D datasets [21, 23, 42], but here we shall restrict ourselves to very basic artificial 3D datasets to show a proof of concept, and we leave these three applications for future work.
7.1 Invertible Orientation Scores
 1.
for \(d=2\), differential invariants on orientation scores based on gauge frames \(\{\mathcal {B}_{1},\mathcal {B}_{2},\mathcal {B}_{3}\}\).
 2. for \(d=2,3\), nonlinear adaptive diffusions steered along the gauge frames, i.e.,
$$\begin{aligned} W(\mathbf{x },\mathbf{n },t)=\widetilde{W}(\mathbf{x },\mathbf{R }_{\mathbf{n }},t)=\varPhi _{t}(\tilde{U})(\mathbf{x },\mathbf{R }_{\mathbf{n }}), \end{aligned}$$
(111)
where \(\widetilde{W}(g,t)\), with \(t\ge 0\), is the solution of
$$\begin{aligned} \left\{ \begin{array}{ll} \frac{\partial \widetilde{W}}{\partial t}(g,t)= \sum \limits _{i=1}^{n_d} D_{ii}\left. (\mathcal {B}_{i})^2\right| _{g} \widetilde{W}(g,t), \\ \widetilde{W}(g,0)=\tilde{U}(g), \end{array} \right. \end{aligned}$$
(112)
where the gauge frame is induced by an exponential curve fit to the data \(\tilde{U}\) at location \(g \in SE(d)\).
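The numerics behind a diffusion of type (112) can be illustrated in a simplified setting. The sketch below runs an explicit Euler scheme for \(\partial W/\partial t=\sum_i D_{ii}(\mathcal{B}_i)^2 W\) on a plain 2D grid with per-pixel frame vector fields (the paper's diffusion lives on SE(d) and its discretization is more involved; this is only a structural caricature, with directional second differences realized by bilinear interpolation):

```python
import numpy as np

def frame_diffusion_2d(W, frames, D, steps=10, dt=0.2):
    """Explicit Euler scheme for dW/dt = sum_i D_ii (B_i)^2 W on a 2D grid,
    where each B_i is a per-pixel unit vector field in the image plane."""
    ny, nx = W.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)

    def sample(F, x, y):
        # bilinear interpolation with clamped coordinates
        x = np.clip(x, 0.0, nx - 1.0)
        y = np.clip(y, 0.0, ny - 1.0)
        x0 = np.clip(np.floor(x).astype(int), 0, nx - 2)
        y0 = np.clip(np.floor(y).astype(int), 0, ny - 2)
        fx, fy = x - x0, y - y0
        return ((1 - fx) * (1 - fy) * F[y0, x0] + fx * (1 - fy) * F[y0, x0 + 1]
                + (1 - fx) * fy * F[y0 + 1, x0] + fx * fy * F[y0 + 1, x0 + 1])

    for _ in range(steps):
        dW = np.zeros_like(W)
        for (ex, ey), Dii in zip(frames, D):
            # second-order directional difference along the frame vector field
            dW += Dii * (sample(W, xx + ex, yy + ey) - 2.0 * W
                         + sample(W, xx - ex, yy - ey))
        W = W + dt * dW
    return W

# Diffusing a point source along a frame aligned with the x-axis (D = (1, 0))
# spreads mass horizontally only: a caricature of line-preserving smoothing.
n = 21
W0 = np.zeros((n, n)); W0[10, 10] = 1.0
ones, zeros = np.ones((n, n)), np.zeros((n, n))
W1 = frame_diffusion_2d(W0, [(ones, zeros), (zeros, ones)], D=(1.0, 0.0))
assert W1[10, 12] > 1e-6 and abs(W1[12, 10]) < 1e-12
```

Choosing \(D_{11}\gg D_{22}\) instead of \(D_{22}=0\) gives the anisotropic, coherence-enhancing behavior: strong smoothing along the fitted curve direction and weak smoothing across it.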
7.2 Experiments in SE(2)
Note that another option for constructing a SIM(2)-vesselness is to use the non-adaptive left-invariant frame \(\{\mathcal {A}_{1},\mathcal {A}_{2},\mathcal {A}_{3}\}\) instead of the gauge frame. This non-adaptive SE(2)-vesselness operator is obtained by simply replacing the \(\mathcal {B}_{i}\) operators in Eq. (113) by the corresponding \(\mathcal {A}_{i}\) operators.
 Advantage 1
The improvement obtained by the multiple-scale vesselness filter via gauge frames in SE(2), compared to the multiple-scale vesselness [36] acting directly on images.
 Advantage 2
The further improvement obtained by using the gauge frames instead of the left-invariant vector fields in the SE(2)-vesselness (113).
In the following experiment, we test these three techniques (the Frangi vesselness [36], the SIM(2)-vesselness via the non-adaptive left-invariant frame, and the newly proposed SIM(2)-vesselness via gauge frames) on the publicly available^{3} High-Resolution Fundus (HRF) dataset [47], which contains vascular trees manually segmented by medical experts. The HRF dataset consists of wide-field fundus photographs for a healthy group, a diabetic retinopathy group, and a glaucoma group (15 images each). A comparison of the three vesselness filters on a small patch is depicted in Fig. 12. Here, we see that our method performs better both at crossing and at non-crossing structures.
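For reference, here is a minimal single-scale sketch of the classical Frangi vesselness [36] that serves as the image-domain baseline in this comparison. The parameter names `beta` and `c` follow the usual convention, but the fixed scale and the dark-vessel sign convention are our illustrative assumptions, not the exact multi-scale filter used in the experiments.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def frangi_2d(image, sigma=2.0, beta=0.5, c=15.0):
    """Single-scale 2D Frangi-type vesselness from Gaussian-Hessian
    eigenvalues, for dark vessels on a bright background."""
    s2 = sigma**2  # scale normalization of the derivatives
    Hxx = s2 * gaussian_filter(image, sigma, order=(0, 2))
    Hyy = s2 * gaussian_filter(image, sigma, order=(2, 0))
    Hxy = s2 * gaussian_filter(image, sigma, order=(1, 1))
    # eigenvalues of the 2x2 symmetric Hessian
    tmp = np.sqrt((Hxx - Hyy)**2 + 4 * Hxy**2)
    mu1 = 0.5 * (Hxx + Hyy + tmp)
    mu2 = 0.5 * (Hxx + Hyy - tmp)
    # sort so that |l1| <= |l2|
    swap = np.abs(mu1) > np.abs(mu2)
    l1 = np.where(swap, mu2, mu1)
    l2 = np.where(swap, mu1, mu2)
    Rb2 = l1**2 / (l2**2 + 1e-12)   # anisotropy (blob vs. line) measure
    S2 = l1**2 + l2**2              # second-order structureness
    V = np.exp(-Rb2 / (2 * beta**2)) * (1 - np.exp(-S2 / (2 * c**2)))
    return np.where(l2 > 0, V, 0.0)  # dark line: positive cross-curvature
```

At a crossing, the Hessian eigenvalues mix the contributions of both lines, which is exactly the failure mode that lifting to SE(2) avoids.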
7.3 Experiments in SE(3)
We now show first results of the extension of coherence-enhancing diffusion via invertible orientation scores (CED-OS [35]) from 2D images to the 3D setting. Again, the data are processed according to Fig. 3. First, we construct an orientation score according to (108), using the 3D cake-wavelets (Fig. 11). For determining the gauge frame we use the first-order structure tensor method in combination with Eq. (118) in Appendix 1. In CED-OS we have \(\varPhi =\varPhi _t\), as defined in (111) and (112), which is a diffusion along the gauge frame.
8 Conclusion
 1.
Along the first-order exponential curve fits, the first-order variation of the data (on SE(d)) along the exponential curve is locally minimal. The Euler–Lagrange equations are solved by finding the eigenvector of the structure tensor of the data with smallest eigenvalue.
 2.
Along the second-order exponential curve fits, a second-order variation of the data (on SE(d)) along the exponential curve is locally minimal. The Euler–Lagrange equations are solved by finding the eigenvector of the Hessian of the data with smallest eigenvalue.
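As a toy illustration of item 1 above: in the flat \(\mathbb{R}^2\) setting (rather than SE(d)), the first-order fit reduces to taking the structure-tensor eigenvector with smallest eigenvalue, i.e., the direction of least data variation. The function below is our own minimal sketch under that simplification, not the paper's implementation.

```python
import numpy as np

def best_fit_direction(patch_gradients):
    """First-order curve-fit direction as the structure-tensor
    eigenvector with smallest eigenvalue.

    patch_gradients : array (N, 2) of gradients sampled in a neighborhood
    """
    # structure tensor: sum of outer products of the sampled gradients
    S = patch_gradients.T @ patch_gradients
    w, V = np.linalg.eigh(S)   # eigenvalues in ascending order
    return V[:, 0]             # direction of least first-order variation
```

The second-order fit of item 2 is analogous, with the Hessian of the data replacing the structure tensor.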
Finally, we considered the application of a differential invariant for enhancing retinal images. Experiments show clear advantages over the classical vesselness filter [36]. Furthermore, we also showed clear advantages of including the gauge frame over the standard left-invariant frame in SE(2). Regarding 3D image applications, we constructed and implemented crossing-preserving coherence-enhancing diffusion via invertible orientation scores (CED-OS) for the first time. However, it has only been tested on artificial datasets. Therefore, in future work we will study the use of locally adaptive frames in real 3D medical imaging applications, e.g., in 3D MR angiography [43]. Furthermore, in future work we will apply the theory of this article with a focus on explicit algorithms, and we plan to release Mathematica implementations of locally adaptive frames in \({SE}(3)\).
Acknowledgments
The authors wish to thank J.M. Portegies for fruitful discussions on the construction of gauge frames in SE(3) and T.C.J. Dela Haije for help in optimizing code for the SE(3) case. Finally, we would like to thank the reviewers and Dr. A.J.E.M. Janssen for careful reading and valuable suggestions on the structure of the paper. The research leading to these results has received funding from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007–2013) / ERC Grant Lie Analysis, agr. nr. 335555.
References
1. Aganj, I., Lenglet, C., Sapiro, G., Yacoub, E., Ugurbil, K., Harel, N.: Reconstruction of the orientation distribution function in single and multiple shell q-ball imaging within constant solid angle. Magn. Reson. Med. 64(2), 554–566 (2010)
2. Ali, S.T., Antoine, J.P., Gazeau, J.P.: Coherent States, Wavelets and Their Generalizations. Springer, New York (2000)
3. Aubin, T.: A Course in Differential Geometry. Graduate Studies in Mathematics, vol. 27. American Mathematical Society, Providence (2001)
4. August, J., Zucker, S.W.: The curve indicator random field: curve organization and correlation. Perceptual Organization for Artificial Vision Systems, pp. 265–288. Kluwer Academic, Boston (2000)
5. Barbieri, D., Citti, G., Sanguinetti, G., Sarti, A.: An uncertainty principle underlying the functional architecture of V1. J. Physiol. Paris 106(5–6), 183–193 (2012)
6. Bekkers, E., Duits, R., Berendschot, T., ter Haar Romeny, B.M.: A multi-orientation analysis approach to retinal vessel tracking. J. Math. Imaging Vis. 49, 583–610 (2014)
7. Ben-Shahar, O., Zucker, S.W.: The perceptual organization of texture flow: a contextual inference approach. IEEE Trans. PAMI 25(4), 401–417 (2003)
8. Bergholm, F.: Edge focussing. IEEE Trans. PAMI 9(6), 726–741 (1987)
9. Bigun, J., Granlund, G.: Optimal orientation detection of linear symmetry. In: ICCV, pp. 433–438 (1987)
10. Blom, J.: Topological and geometrical aspects of image structure. PhD thesis, University of Utrecht (1992)
11. Boscain, U., Chertovskih, R.A., Gauthier, J.P., Remizov, A.O.: Hypoelliptic diffusion and human vision: a semidiscrete new twist. SIAM J. Imaging Sci. 7(2), 669–695 (2014)
12. Breuss, M., Burgeth, B., Weickert, J.: Anisotropic continuous-scale morphology. IbPRIA. LNCS, vol. 4478, pp. 515–522. Springer, Heidelberg (2007)
13. Burgeth, B., Breuss, M., Didas, S., Weickert, J.: PDE-based morphology for matrix fields: numerical solution schemes. In: Aja-Fernandez, S., de Luis-Garcia, R., Tao, D., Li, X. (eds.) Tensors in Image Processing and Computer Vision, pp. 125–150. Springer, London (2009)
14. Budai, A., Bock, R., Maier, A., Hornegger, J., Michelson, G.: Robust vessel segmentation in fundus images. Int. J. Biomed. Imaging (2013)
15. Cao, F.: Geometric Curve Evolution and Image Processing. Springer, Heidelberg (2003)
16. Caselles, V., Kimmel, R., Sapiro, G.: Geodesic active contours. Int. J. Comput. Vis. 22(1), 61–79 (1997)
17. Chirikjian, G.S.: Stochastic Models, Information Theory, and Lie Groups. Analytic Methods and Modern Applications, vol. 2. Birkhäuser, Boston (2011)
18. Chirikjian, G.S., Kyatkin, A.B.: Engineering Applications of Noncommutative Harmonic Analysis: With Emphasis on Rotation and Motion Groups. CRC, Boca Raton (2000)
19. Citti, G., Sarti, A.: A cortical based model of perceptual completion in the roto-translation space. J. Math. Imaging Vis. 24(3), 307–326 (2006)
20. Citti, G., Franceschiello, B., Sanguinetti, G., Sarti, A.: Sub-Riemannian mean curvature flow for image processing. Preprint on arXiv:1504.03710 (2015)
21. Creusen, E.J., Duits, R., Vilanova, A., Florack, L.M.J.: Numerical schemes for linear and nonlinear enhancement of DW-MRI. NMTMA 6(1), 138–168 (2013)
22. Descoteaux, M., Angelino, E., Fitzgibbons, S., Deriche, R.: Regularized, fast, and robust analytical Q-ball imaging. Magn. Reson. Med. 58(3), 497–510 (2007)
23. Duits, R., Felsberg, M., Granlund, G., ter Haar Romeny, B.M.: Image analysis and reconstruction using a wavelet transform constructed from a reducible representation of the Euclidean motion group. Int. J. Comput. Vis. 79(1), 79–102 (2007)
24. Duits, R.: Perceptual organization in image analysis, a mathematical approach based on scale, orientation and curvature. PhD thesis, TU/e, Eindhoven (2005)
25. Duits, R., Boscain, U., Rossi, F., Sachkov, Y.: Association fields via cuspless sub-Riemannian geodesics in SE(2). J. Math. Imaging Vis. 49(2), 384–417 (2014)
26. Duits, R., van Almsick, M.A.: The explicit solutions of linear left-invariant second order stochastic evolution equations on the 2D Euclidean motion group. Q. Appl. Math. AMS 66(1), 27–67 (2008)
27. Duits, R., Franken, E.M.: Left invariant parabolic evolution equations on \({SE}(2)\) and contour enhancement via invertible orientation scores, part I: linear left-invariant diffusion equations on \({SE}(2)\). Q. Appl. Math. AMS 68, 255–292 (2010)
28. Duits, R., Franken, E.M.: Left invariant parabolic evolution equations on \({SE}(2)\) and contour enhancement via invertible orientation scores, part II: nonlinear left-invariant diffusions on invertible orientation scores. Q. Appl. Math. AMS 68, 293–331 (2010)
29. Duits, R., Dela Haije, T.C.J., Creusen, E.J., Ghosh, A.: Morphological and linear scale spaces for fiber enhancement in DW-MRI. J. Math. Imaging Vis. 46(3), 326–368 (2013)
30. Duits, R., Franken, E.M.: Left-invariant diffusions on the space of positions and orientations and their application to crossing-preserving smoothing of HARDI images. Int. J. Comput. Vis. 92, 231–264 (2011)
31. Duits, R., Ghosh, A., Dela Haije, T.C.J., Sachkov, Y.L.: Cuspless sub-Riemannian geodesics within the Euclidean motion group SE(d). Neuromathematics of Vision. Springer Series Lecture Notes in Morphogenesis, pp. 173–240. Springer, Berlin (2014)
32. Felsberg, M.: Adaptive filtering using channel representations. In: Florack, L., et al. (eds.) Mathematical Methods for Signal and Image Analysis and Representation. Computational Imaging and Vision, vol. 41, pp. 35–54. Springer, London (2012)
33. Florack, L.M.J.: Image Structure. KAP, Dordrecht (1997)
34. Franken, E.M.: Enhancement of crossing elongated structures in images. PhD thesis, Department of Biomedical Engineering, Eindhoven University of Technology (2008)
35. Franken, E.M., Duits, R.: Crossing-preserving coherence-enhancing diffusion on invertible orientation scores. Int. J. Comput. Vis. 85(3), 253–278 (2009)
36. Frangi, A.F., Niessen, W.J., Vincken, K.L., Viergever, M.A.: Multiscale vessel enhancement filtering. LNCS, vol. 1496, pp. 130–137. Springer, Berlin (1998)
37. van Ginkel, M., van de Weijer, J., van Vliet, L.J., Verbeek, P.W.: Curvature estimation from orientation fields. Proc. 11th Scand. Conf. Image Anal. 2, 545–552 (1999)
38. Guichard, F., Morel, J.M.: Geometric partial differential equations and iterative filtering. In: Heymans, H.J.A.M., Roerdink, J.B.T.M. (eds.) Mathematical Morphology and Its Applications to Image and Signal Processing, pp. 127–138. KAP, Dordrecht (1998)
39. ter Haar Romeny, B.M.: Front-End Vision and Multi-Scale Image Analysis. Computational Imaging and Vision, vol. 27. Springer, Berlin (2003)
40. Hannink, J., Duits, R., Bekkers, E.J.: Multiple scale crossing preserving vesselness. In: MICCAI Proc., LNCS 8674, pp. 603–610 (2014)
41. Ikram, M.K., Ong, Y.T., Cheung, C.Y., Wong, T.Y.: Retinal vascular caliber measurements: clinical significance, current knowledge and future perspectives. Ophthalmologica 229(3), 125–136 (2013)
42. Janssen, M.H.J., Duits, R., Breeuwer, M.: Invertible orientation scores of 3D images. In: SSVM 2015. LNCS 9087, pp. 563–575 (2015)
43. Janssen, M.H.J.: 3D orientation scores applied to MRA vessel analysis. Master thesis, Department of Biomedical Image Analysis, Eindhoven University of Technology, The Netherlands (2014)
44. Jost, J.: Riemannian Geometry and Geometric Analysis, 4th edn. Springer, Berlin (2005)
45. Kindlmann, G., Ennis, D.E., Witaker, R.T., Westin, C.F.: Diffusion tensor analysis with invariant gradients and rotation tangents. IEEE Trans. Med. Imaging 23(11), 1483–1499 (2007)
46. Kindlmann, G., Estepar, R.S.J., Smith, S.M., Westin, C.F.: Sampling and visualization of creases with scale-space particles. IEEE Trans. VCG 15(6), 1415–1424 (2010)
47. Kohler, T., Budai, A., Kraus, M.F., Odstrcilik, J., Michelson, G., Hornegger, J.: Automatic no-reference quality assessment for retinal fundus images using vessel segmentation. In: IEEE 26th Symposium on CBMS, pp. 95–100 (2013)
48. Knutsson, H.: Representing local structure using tensors. In: Scandinavian Conference on Image Analysis, pp. 244–251 (1989)
49. Lawlor, M., Zucker, S.W.: Third order edge statistics: contour continuation, curvature, and cortical connections. In: NIPS, pp. 1763–1771 (2013)
50. Lindeberg, T.: Scale-Space Theory in Computer Vision. The Springer International Series in Engineering and Computer Science. Kluwer Academic Publishers, Dordrecht (1994)
51. Lupascu, C.A., Tegolo, D., Trucco, E.: FABC: retinal vessel segmentation using AdaBoost. IEEE Trans. Inf. Technol. 14(5), 1267–1274 (2010)
52. Medioni, G., Lee, M.S., Tang, C.K.: A Computational Framework for Feature Extraction and Segmentation. Elsevier, Amsterdam (2000)
53. Mumford, D.: Elastica and computer vision. In: Bajaj, C.L. (ed.) Algebraic Geometry and Its Applications. Springer, New York (1994)
54. Parent, P., Zucker, S.W.: Trace inference, curvature consistency, and curve detection. IEEE Trans. PAMI 11(8), 823–839 (1989)
55. Pennec, X., Fillard, P., Ayache, N.: A Riemannian framework for tensor computing. Int. J. Comput. Vis. 66(1), 41–66 (2006)
56. Pennec, X., Arsigny, V.: Exponential barycenters of the canonical Cartan connection and invariant means on Lie groups. Matrix Information Geometry, pp. 123–166. Springer, Heidelberg (2012)
57. Petitot, J.: The neurogeometry of pinwheels as a sub-Riemannian contact structure. J. Physiol. Paris 97(2–3), 265–309 (2003)
58. Sanguinetti, G.: Invariant models of vision between phenomenology, image statistics and neurosciences. PhD thesis, Universidad de la Republica, Uruguay (2011)
59. Sanguinetti, G., Citti, G., Sarti, A.: A model of natural image edge co-occurrence in the roto-translation group. J. Vis. 10(14), 37 (2010)
60. Sapiro, G.: Geometric Partial Differential Equations and Image Analysis. Cambridge University Press, Cambridge (2001)
61. Savadjiev, P., Campbell, J.S.W., Pike, G.B., Siddiqi, K.: 3D curve inference for diffusion MRI regularization and fibre tractography. Med. Image Anal. 10(5), 799–813 (2006)
62. Savadjiev, P., Strijkers, G.J., Bakermans, A.J., Piuze, E., Zucker, S.W., Siddiqi, K.: Heart wall myofibers are arranged in minimal surfaces to optimize organ function. PNAS 109(24), 9248–9253 (2012)
63. Sharma, U., Duits, R.: Left-invariant evolutions of wavelet transforms on the similitude group. Appl. Comput. Harmon. Anal. 39, 110–137 (2015)
64. Momayyez-Siahkal, P., Siddiqi, K.: 3D stochastic completion fields: a probabilistic view of brain connectivity. IEEE Trans. PAMI 35(4), 983–995 (2013)
65. Sinnaeve, D.: The Stejskal–Tanner equation generalized for any gradient shape—an overview of most pulse sequences measuring free diffusion. Concepts Magn. Reson. Part A 40A(2), 39–65 (2012)
66. Staal, J., Abramoff, M.D., Viergever, M.A., van Ginneken, B.: Ridge-based vessel segmentation in color images of the retina. IEEE Trans. Med. Imaging 23, 501–509 (2004)
67. Tournier, J.D., Yeh, C.H., Calamante, F., Cho, K.H., Connolly, A., Lin, C.P.: Resolving crossing fibres using constrained spherical deconvolution: validation using diffusion-weighted imaging phantom data. NeuroImage 42, 617–625 (2008)
68. Tuch, D.S., Reese, T.G., Wiegell, M.R., Makris, N., Belliveau, J.W., Wedeen, V.J.: High angular resolution diffusion imaging reveals intravoxel white matter fiber heterogeneity. Magn. Reson. Med. 48, 577–582 (2002)
69. Unser, M., Aldroubi, A., Eden, M.: B-spline signal processing: part I—theory. IEEE Trans. Signal Process. 41, 831–833 (1993)
70. van Almsick, M.: Context models of lines and contours. PhD thesis, Department of Biomedical Engineering, Eindhoven University of Technology, The Netherlands (2007)
71. Weickert, J.: Anisotropic Diffusion in Image Processing. ECMI Series, Teubner-Verlag, Stuttgart (1998)
72. Welk, M.: Families of generalised morphological scale spaces. Scale Space Methods in Computer Vision. LNCS, vol. 2695, pp. 770–784. Springer, Berlin (2003)
73. Zweck, J., Williams, L.R.: Euclidean group invariant computation of stochastic completion fields using shiftable-twistable functions. J. Math. Imaging Vis. 21(2), 135–154 (2004)
Copyright information
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.