Two Pandora multi-algorithm reconstruction paths have been created for use in the analysis of MicroBooNE data. One option, PandoraCosmic, is optimised for the reconstruction of cosmic-ray muons and their daughter delta rays. The second option, PandoraNu, is optimised for the reconstruction of neutrino interactions. Many algorithms are shared between the PandoraCosmic and PandoraNu reconstruction paths, but the overall algorithm selection results in the following key features:
- PandoraCosmic: This reconstruction is more strongly track-oriented, producing primary particles that represent cosmic-ray muons. Showers are assumed to be delta rays and are added as daughter particles of the most appropriate cosmic-ray muon. The reconstructed vertex/start-point for the cosmic-ray muon is the high-y coordinate of the muon track.
- PandoraNu: This reconstruction identifies a neutrino interaction vertex and uses it to aid the reconstruction of all particles emerging from the vertex position. There is careful treatment to reconstruct both tracks and showers. A parent neutrino particle is created and the reconstructed visible particles are added as daughters of the neutrino.
The PandoraCosmic and PandoraNu reconstructions are applied to the MicroBooNE data in two passes. The PandoraCosmic reconstruction is first used to process all hits identified during a specified readout window and to provide a list of candidate cosmic-ray particles. This list of particles is then examined by a cosmic-ray tagging module, implemented within LArSoft, which identifies unambiguous cosmic-ray muons, based on their start and end positions and associated hits. Hits associated with particles flagged as cosmic-ray muons are removed from the input hit collection and a new cosmic-removed hit collection is created. This second hit collection provides the input to the PandoraNu reconstruction, which outputs a list of candidate neutrinos. The overall chain of Pandora algorithms is illustrated in Fig. 2.
Cosmic-ray muon reconstruction
The PandoraCosmic reconstruction proceeds in four main stages, each of which uses multiple algorithms and algorithm tools, as described in this section.
Two-dimensional reconstruction
The first step is to separate the input hits into three separate lists, corresponding to the three readout planes (u, v and w). This operation is performed by the EventPreparation algorithm. For each wire plane, the TrackClusterCreation algorithm then produces a list of 2D clusters that represent continuous, unambiguous lines of hits. Separate clusters are created for each structure in the input hit image, with clusters starting/stopping at each branch feature or wherever there is a bifurcation or ambiguity. The approach is to consider each hit and identify its nearest neighbouring hits in a forward direction (hits at a larger wire position) and in a backward direction. Up to two nearby hits are considered in each direction, allowing primary and secondary associations to be formed between pairs of hits in both directions. Any collections of hits for which only primary associations are present can be safely added to a cluster. If secondary associations are present, navigation along the different possible chains of association allows identification of cases where significant ambiguities or bifurcations arise and where the clustering must stop. This initial clustering provides clusters of high purity, representing energy deposits from exactly one true particle, even if this means that the clusters are initially of low completeness, containing only a small fraction of the total hits associated with the true particle. The clusters are then examined by a series of topological algorithms.
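The association logic can be illustrated with a simplified sketch (not the Pandora implementation; the hit representation and the `max_gap` parameter are ours, and only the unambiguous primary associations are modelled):

```python
# Simplified sketch of unambiguous-association clustering (illustrative, not
# the Pandora implementation). A hit is linked forward only when it has
# exactly one reciprocal nearest neighbour, so cluster growth stops at any
# bifurcation or ambiguity.
def cluster_hits(hits, max_gap=1.5):
    """hits: list of (wire, drift) pairs. Returns a list of clusters."""
    hits = sorted(hits)

    def neighbours(i, direction):
        w, d = hits[i]
        return [j for j, (w2, d2) in enumerate(hits)
                if direction * (w2 - w) > 0
                and abs(w2 - w) <= max_gap and abs(d2 - d) <= max_gap]

    links = {}
    for i in range(len(hits)):
        fwd = neighbours(i, +1)
        # Keep only reciprocal, unambiguous forward/backward associations
        if len(fwd) == 1 and neighbours(fwd[0], -1) == [i]:
            links[i] = fwd[0]

    targets = set(links.values())
    clusters = []
    for i in range(len(hits)):
        if i in targets:
            continue                       # chain starts are never targets
        chain = [i]
        while chain[-1] in links:
            chain.append(links[chain[-1]])
        clusters.append([hits[k] for k in chain])
    return clusters
```

A clean line of hits yields a single cluster, while a fork in the hit pattern stops the chain and produces separate clusters at the bifurcation, mirroring the purity-first behaviour described above.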
Cluster-merging algorithms identify associations between multiple 2D clusters and look to grow the clusters to improve completeness, without compromising purity. Typical definitions of cluster association consider the proximity of (hits in) two clusters, or use pointing information (whether, for instance, a linear fit to one cluster points towards the second cluster). The challenge for the algorithms is to make cluster-merging decisions in the context of the entire event, rather than just by considering individual pairs of clusters in isolation. The ClusterAssociation and ClusterExtension algorithms are reusable base classes, which allow different definitions of cluster association to be provided. Algorithms inheriting from these base classes need only to provide an initial selection of potentially interesting clusters (e.g. based on length, number of hits) and a metric for determining whether a given pair of clusters should be declared associated. The common implementation then provides functionality to navigate forwards and backwards between associated clusters, identifying chains of clusters that can be safely merged together. Evaluation of all possible chains of cluster association allows merging decisions to be based upon an understanding of the overall event topology, rather than simple consideration of isolated pairs of clusters.
To improve purity, cluster-splitting algorithms refine the hit selection by breaking single clusters into two parts if topological features indicate the inclusion of hits from multiple particles. Clusters are split if there is a significant discontinuity in the cluster direction, or if multiple clusters intersect or point towards a common position along the length of an existing cluster. Figure 3a shows initial clusters formed for simulated cosmic-ray muons in MicroBooNE. These clusters form the input to the series of topological algorithms, in which multiple cluster-merging and cluster-splitting procedures are interspersed. Processing by these algorithms results in the refined clusters shown in Fig. 3b. The final 2D clusters provide the input to the process used to “match” features reconstructed in multiple readout planes, and to construct particles.
Three-dimensional track reconstruction
The aim of the 3D track reconstruction is to collect the 2D clusters from the three readout planes that represent individual, track-like particles. The clusters can be assigned as daughter objects of new Pandora particles. The challenge for the algorithms is to identify consistent groupings of clusters from the different views. The 3D track reconstruction is primarily performed by the ThreeDTransverseTracks algorithm. This algorithm considers the suitability of all combinations of clusters from the three readout planes and stores the results in a three-dimensional array, hereafter loosely referred to as a rank-three tensor. The three tensor indices are the clusters in the u, v and w views and, for each combination of clusters, a detailed TransverseOverlapResult is stored. The information in the tensor is examined in order to identify cluster-matching ambiguities. If ambiguities are discovered, the information can be used to motivate changes to the 2D reconstruction that would ensure that only unambiguous combinations of clusters emerge. This procedure is often loosely referred to as “diagonalising” the tensor.
To populate the TransverseOverlapResult for three clusters (one from each of the u, v and w views), a number of sampling points are defined in the x (drift time) region common to all three clusters. Sliding linear fits to each cluster are used to allow the cluster positions to be sampled in a simple and well-defined manner. The sliding linear fits record the results of a series of linear fits, each using only hits from a local region of the cluster. They provide a smooth representation of cluster trajectories. To construct a sliding linear fit, a linear fit to all hits is used to define a local coordinate system tailored to the cluster. Each hit is then assigned local longitudinal and transverse coordinates. The parameter space is divided into bins along the longitudinal coordinate (the bin width is equal to the wire pitch), and each hit is assigned to a bin. For every bin, a local linear fit is performed, which, crucially, considers only hits from a configurable number of adjacent bins either side of the bin under consideration. Each bin then holds a “sliding” fit position, direction and RMS, transformed back to the x-wire plane.
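As a rough illustration of the sliding-linear-fit idea (names and parameter values are ours; Pandora's implementation differs in detail), the sketch below performs a global fit to define local longitudinal/transverse axes, bins the hits longitudinally, and fits each bin using only a window of adjacent bins:

```python
import math

# Illustrative sliding linear fit: a global fit defines a local coordinate
# system, hits are binned along the longitudinal coordinate, and each bin
# gets a local fit using only hits from a window of adjacent bins.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def sliding_linear_fit(points, bin_width=0.3, window=2):
    """points: list of (x, y). Returns per-bin fit position, direction, RMS."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    slope, _ = linear_fit(xs, ys)
    norm = math.hypot(1.0, slope)
    ax_l = (1.0 / norm, slope / norm)        # longitudinal axis
    ax_t = (-slope / norm, 1.0 / norm)       # transverse axis
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    L = [(x - cx) * ax_l[0] + (y - cy) * ax_l[1] for x, y in points]
    T = [(x - cx) * ax_t[0] + (y - cy) * ax_t[1] for x, y in points]
    bins = [math.floor(l / bin_width) for l in L]
    results = {}
    for b in sorted(set(bins)):
        sel = [k for k in range(len(points)) if abs(bins[k] - b) <= window]
        if len(sel) < 2:
            continue
        s, c = linear_fit([L[k] for k in sel], [T[k] for k in sel])
        lc = (b + 0.5) * bin_width
        tc = s * lc + c
        # Transform the local fit position back to the original frame
        pos = (cx + lc * ax_l[0] + tc * ax_t[0],
               cy + lc * ax_l[1] + tc * ax_t[1])
        rms = math.sqrt(sum((T[k] - (s * L[k] + c)) ** 2 for k in sel) / len(sel))
        results[b] = {"position": pos, "direction": s, "rms": rms}
    return results
```

For a straight track the per-bin fit positions lie on the track and the per-bin RMS is negligible; for a curving track the local fits follow the trajectory smoothly, which is the point of the construction.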
For a sampling point at a given x coordinate, the sliding-fit position can quickly be extracted for a pair of clusters, e.g. in the u and v views. These positions, together with the coordinate transformation plugin, can be used to predict the position of the third cluster, e.g. in the w view, at the same x coordinate. This prediction can be compared to the sliding-fit position for the third cluster and, by considering all combinations (\(u,v\rightarrow w\); \(v,w\rightarrow u\); \(u,w\rightarrow v\)), a quantity motivated as a \(\chi ^{2}\) can be calculated (this quantity is a scaled sum of the squared residuals, used as an indicative goodness-of-fit metric, rather than a true \(\chi ^{2}\)). The \(\chi ^{2}\)-like value, together with the common x-overlap span, the number of sampling points and the number of consistent sampling points, is stored in the TransverseOverlapResult in the tensor.
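A toy version of this three-view consistency check might look as follows. The plane geometry used here (u and v wires at \(\pm 60^{\circ }\), w vertical) is an idealised assumption for illustration, not the exact MicroBooNE coordinate transformation plugin:

```python
import math

# Toy three-view consistency check under an assumed +/-60 degree geometry.
THETA = math.radians(60.0)

def yz_to_views(y, z):
    u = z * math.cos(THETA) - y * math.sin(THETA)
    v = z * math.cos(THETA) + y * math.sin(THETA)
    return u, v, z                         # w = z in this toy model

def predict_w(u, v):
    return (u + v) / (2.0 * math.cos(THETA))

def predict_u(v, w):
    return 2.0 * w * math.cos(THETA) - v   # from u + v = 2 z cos(theta), w = z

def predict_v(u, w):
    return 2.0 * w * math.cos(THETA) - u

def chi2_like(samples):
    """samples: (u, v, w) sliding-fit positions at common x coordinates.
    Returns a scaled sum of squared residuals over the cyclic predictions
    (an indicative goodness-of-fit metric, not a true chi^2)."""
    total = 0.0
    for u, v, w in samples:
        total += (predict_w(u, v) - w) ** 2
        total += (predict_u(v, w) - u) ** 2
        total += (predict_v(u, w) - v) ** 2
    return total / max(len(samples), 1)
```

Consistent triplets of sliding-fit positions give a value near zero, while a cluster displaced in any one view inflates all three cyclic residuals.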
Crucially, the results stored in the tensor do not just provide isolated information about the consistency of groups of three clusters. The results also provide detailed information about the connections between multiple clusters and their matching ambiguities. For instance, starting from a given cluster, it is possible to navigate between multiple tensor elements, each of which indicate good cluster matching but share, or “re-use”, one or two clusters. In this way, a complete set of connected clusters can be extracted. If this set contains more than one cluster from any single view, an ambiguity is identified. The exact form of the ambiguity can often indicate the mechanism by which it may be addressed and can identify the specific clusters that require modification. This detailed information about cluster connections is queried by a series of algorithm tools, which can create particles or modify the 2D pattern recognition. The algorithm tools have a specific ordering and, if any tool makes a change, the tensor is updated and the full list of tools runs again. The tensor is processed until no tool can perform any further operations.
The algorithm tools, in the order that they are run, are:
- ClearTracks tool, which looks to create particles from unambiguous groupings of three clusters. It examines the tensor to find regions where only three clusters are connected, one from each of the u, v and w views, as illustrated in Fig. 4a. Quality cuts are applied to the TransverseOverlapResult and, if passed (the common x-overlap must be \(>90\%\) of the x-extent for all clusters at this stage), a new particle is created.
- LongTracks tool, which aims to resolve obvious ambiguities. In the example in Fig. 4b, the presence of small delta-ray clusters near long muon tracks means that clusters are matched in multiple configurations and the tensor is not diagonal. One of the combinations of clusters is, however, better than the others (with larger x-overlap and a larger number of consistent sampling points) and is used to create a particle. The common x-overlap threshold remains \(>90\%\) of the x-extent for all clusters.
- OvershootTracks tool, which addresses cluster-matching ambiguities of the form 1:2:2 (one cluster in the u view, matched to two clusters in the v view and two clusters in the w view). In the example in Fig. 4c, the pairs of clusters in the v and w views connect at a common x coordinate, but there is a single, common cluster in the u view, which spans the full x-extent. The tool considers all clusters and decides whether they represent a kinked topology in 3D. If a 3D kink is identified, the u cluster can be split at the relevant position and two new u clusters fed back into the tensor. The initial ClearTracks tool will then be able to identify two unambiguous groupings of three clusters and create two particles.
- UndershootTracks tool, which examines the tensor to find cluster-matching ambiguities of the form 1:2:1. In the example in Fig. 4d, two clusters in the v view are matched to common clusters in the u and w views, leading to two conflicting TransverseOverlapResults in the tensor. The tool examines all the clusters to assess whether they represent a kinked topology in 3D. If a 3D kink is not found, the two v clusters can be merged and a single v cluster fed back into the tensor, removing the ambiguity.
- MissingTracks tool, which handles cases where particle features are obscured in one view, with a single cluster representing multiple overlapping particles. If this tool identifies appropriate cluster overlap, using the cluster-relationship information available from the tensor, it can create two-cluster particles.
- TrackSplitting tool, which looks to split clusters if the matching between views is unambiguous, but there is a significant discrepancy between the cluster x-extents and evidence of gaps in a cluster.
- MissingTrackSegment tool, which looks to add missing hits to the end of a cluster if the matching between views is unambiguous, but there is a significant discrepancy between the cluster x-extents.
- LongTracks tool, which is used again with the common x-overlap threshold reduced to \(>75\%\) of the x-extent for all clusters.
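The scheduling pattern behind this tool list, in which the ordered tools re-run from the start whenever any tool modifies the tensor, can be sketched generically (the interfaces here are illustrative, not Pandora's):

```python
# Generic sketch of the tool scheduling: ordered tools run against shared
# state, and whenever any tool reports a change the full list restarts from
# the top, until a complete pass makes no further change.
def run_tools(tensor, tools):
    changed = True
    while changed:
        changed = False
        for tool in tools:
            if tool(tensor):      # a tool returns True if it modified state
                changed = True
                break             # restart from the first tool
    return tensor
```

Restarting from the first tool after any change means the cheap, unambiguous tools (such as ClearTracks) always get first chance at the simplified tensor produced by the more aggressive tools.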
In addition to the ThreeDTransverseTracks algorithm, there are other algorithms that form a cluster-association tensor, and query it using algorithm tools. These algorithms target different topologies and store different information in the tensor. The ThreeDLongitudinalTracks algorithm examines the case where the x-extent of a cluster grouping is small. In this case, there are too many ambiguities when trying to sample the clusters at fixed x coordinates. The ThreeDTrackFragments algorithm is optimised to look for situations where there are single, clean clusters in two views, associated with multiple fragment clusters in a third view.
Delta-ray reconstruction
Following 3D track reconstruction, the PandoraCosmic reconstruction dissolves any 2D clusters that have not been included in a reconstructed particle. The assumption is that these clusters likely represent fragments of delta-ray showers. The relevant hits are reclustered using the SimpleClusterCreation algorithm, which is a proximity-based clustering algorithm. A number of topological algorithms, which re-use implementation from the earlier 2D reconstruction, refine the clusters to provide a more complete delta-ray reconstruction. The DeltaRayMatching algorithm subsequently matches the delta-ray clusters between views, creates new shower-like particles and identifies the appropriate parent cosmic-ray particles. The cluster matching is simple and assesses the x-overlap between clusters in multiple views. Parent cosmic-ray particles are identified via simple comparison of inter-cluster distances.
Three-dimensional hit reconstruction
At this point, the assignment of hits to particles is complete and the particles contain 2D clusters from one, two or usually all three readout planes. For each input (2D) hit in a particle, a new 3D hit is created. The mechanics differ depending upon the cluster topology, with separate approaches for: hits on transverse tracks (significant extent in x coordinate) with clusters in all views; hits on longitudinal tracks (small extent in x coordinate) with clusters in all views; hits on tracks that are multi-valued at specific x coordinates; hits on tracks with clusters in only two views; and hits in shower-like particles. Only two such approaches are described here:
- For transverse tracks with clusters in all three views, a 2D hit in one view, e.g. u, is considered and sliding linear fit positions are evaluated for the two other clusters, e.g. v and w, at the same x coordinate. An analytic \(\chi ^{2}\) minimisation is used to extract the optimal y and z coordinates at the given x coordinate. It is also possible to run in a mode whereby the chosen y and z coordinates ensure that the 3D hit can be projected precisely onto the specific wire associated with the input 2D hit.
- For a 2D hit in a shower-like particle (a delta ray, e.g. in the u view), all combinations of hits (e.g. in the v and w views) located in a narrow region around the hit x coordinate are considered. For a given combination of hit u, v and w values, the most appropriate y and z coordinates can be calculated. The position yielding the best \(\chi ^{2}\) value is identified and a \(\chi ^{2}\) cut is applied to help ensure that only satisfactory positions emerge.
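Because each view coordinate is a linear function of y and z, the \(\chi ^{2}\) minimum has a closed form. The sketch below solves the 2×2 normal equations under a toy plane geometry (u and v at \(\pm 60^{\circ }\), w vertical), which is an assumption for illustration, not the real MicroBooNE wire mapping:

```python
import math

# Closed-form chi^2 minimisation for (y, z) from three wire coordinates,
# under a toy +/-60 degree plane geometry (illustrative assumption).
THETA = math.radians(60.0)
VIEW_AXES = {
    "u": (-math.sin(THETA), math.cos(THETA)),   # u = -y sin(t) + z cos(t)
    "v": ( math.sin(THETA), math.cos(THETA)),   # v = +y sin(t) + z cos(t)
    "w": (0.0, 1.0),                            # w = z
}

def fit_yz(measurements):
    """measurements: dict view -> measured coordinate at a common x.
    Solves the 2x2 normal equations A^T A p = A^T b for p = (y, z)."""
    syy = syz = szz = by = bz = 0.0
    for view, m in measurements.items():
        ay, az = VIEW_AXES[view]
        syy += ay * ay
        syz += ay * az
        szz += az * az
        by += ay * m
        bz += az * m
    det = syy * szz - syz * syz
    return (szz * by - syz * bz) / det, (syy * bz - syz * by) / det
```

With consistent inputs from all three views the solution reproduces the underlying (y, z) exactly; with inconsistent inputs it returns the least-squares compromise, whose residual plays the role of the \(\chi ^{2}\) cut above.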
After 3D hit creation, the PandoraCosmic reconstruction is completed by the placement of vertices/start-positions at the high-y coordinates of the cosmic-ray muon particles. Vertices are also reconstructed for delta-ray particles and are placed at the 3D point of closest approach between the parent cosmic-ray muon and daughter delta ray.
Neutrino reconstruction
A key requirement for the PandoraNu reconstruction path is that it must be able to deal with the presence of any cosmic-ray muon remnants that remain in the input, cosmic-removed hit collection. The approach is to begin by running the 2D reconstruction, 3D track reconstruction and 3D hit reconstruction algorithms described in Sect. 4.1. The 3D hits are then divided into slices (separate lists of hits), using proximity and direction-based metrics. The intent is to isolate neutrino interactions and cosmic-ray muon remnants in individual slices.
The slicing algorithm seeds a new slice with an unassigned 3D cluster. Any 3D clusters deemed to be associated with this seed cluster are added to the slice, which then grows iteratively. If a 3D cluster is track-like, linear fits can be used to identify whether clusters point towards one another. If a 3D cluster is shower-like, clusters bounded by a cone (defined by the shower axes) can also be added to the slice. Finally, clusters can be added if they are in close proximity to existing 3D clusters in the slice. If no further additions can be made, another slice is seeded. The original 2D hits associated with each 3D slice are used as an input to the dedicated neutrino reconstruction described in this section. All 2D hits will be assigned to a slice. Each slice (including those containing cosmic-ray muon remnants) is processed in isolation and results in one candidate reconstructed neutrino.
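The seed-and-grow loop can be sketched using only the proximity criterion (the pointing and cone-based association tests are omitted; the data structures and `max_distance` value are illustrative):

```python
# Sketch of seed-and-grow slicing using only the proximity criterion.
def slice_clusters(clusters, max_distance=2.0):
    """clusters: list of 3D clusters, each a list of (x, y, z) points.
    Returns a list of slices, each a sorted list of cluster indices."""
    def close(ca, cb):
        return any(sum((a - b) ** 2 for a, b in zip(p, q)) <= max_distance ** 2
                   for p in ca for q in cb)

    unassigned = set(range(len(clusters)))
    slices = []
    while unassigned:
        current = [unassigned.pop()]       # seed a new slice
        grew = True
        while grew:                        # grow the slice iteratively
            grew = False
            for i in list(unassigned):
                if any(close(clusters[i], clusters[j]) for j in current):
                    current.append(i)
                    unassigned.discard(i)
                    grew = True
        slices.append(sorted(current))
    return slices
```

Nearby clusters end up in one slice and well-separated activity (such as a detached cosmic-ray remnant) seeds its own slice, so every cluster is assigned to exactly one slice.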
The dedicated neutrino reconstruction begins with a track-oriented clustering algorithm and series of topological algorithms, as described in Sect. 4.1.1. The lists of 2D clusters for the different readout planes are then used to identify the neutrino interaction vertex. The vertex reconstruction is a key difference between the PandoraCosmic and PandoraNu reconstruction paths, and the 3D vertex position plays an important role throughout the subsequent algorithms. Correct identification of the neutrino interaction vertex helps algorithms to identify individual primary particles and to ensure that they each result in separate reconstructed particles.
Three-dimensional vertex reconstruction
Reconstruction of the neutrino interaction vertex begins with creation of a list of possible vertex positions. The CandidateVertexCreation algorithm compares pairs of 2D clusters, ensuring that the two clusters are from different readout planes and have some overlap in the common x coordinate. The endpoints of the two clusters are then compared. For instance, the low-x endpoint of one cluster can be identified. The same x coordinate will not necessarily correspond to an endpoint of the second cluster, but the position of the second cluster at this x coordinate can be evaluated, using a sliding linear fit and allowing some extrapolation of cluster positions. The two cluster positions, from two views, are sufficient to provide a candidate 3D position. Using all of the cluster endpoints allows four candidate vertices to be created for each cluster pairing. Figure 5 shows the candidate vertex positions created for a typical simulated CC \(\nu _{\mu }\) event in MicroBooNE.
Having identified an extensive list of candidate vertex positions, it is necessary to select one as the most likely neutrino interaction vertex. There are a large number of candidates, so each is required to pass a simple quality cut before being put forward for assessment: candidates are required to sit on or near a hit, or in a registered detector gap, in all three views. The EnergyKickVertexSelection algorithm then assigns a score to each remaining candidate and the candidate with the highest score is selected.
There are three components to the score, and the key quantities used to evaluate each component are illustrated in Fig. 6:
$$S = S_{\text{energy kick}} \cdot S_{\text{asymmetry}} \cdot S_{\text{beam deweight}} \tag{1}$$
- Energy kick score: Each 3D vertex candidate is projected into the u, v and w views. A parameter, \(E^{T'}_{ij}\), is then calculated to assess whether the candidate is consistent with observed cluster j in view i. This parameter is closely related to the transverse energy, but has additional degrees of freedom that introduce a dependence on the displacement between the cluster and vertex projection. Candidates are suppressed if the sum of \(E^{T'}_{ij}\), over all clusters, is large. This reflects the fact that primary particles produced in the interaction should point back towards the true interaction vertex, whilst downstream secondary particles may not, but are expected to be less energetic:
$$S_{\text{energy kick}} = \exp\left\{ -\sum_{\text{view } i}\;\sum_{\text{cluster } j} \frac{E^{T'}_{ij}}{\epsilon} \right\} \tag{2}$$
$$E^{T'}_{ij} = \frac{E_j\,(x_{ij} + \delta_x)}{d_{ij} + \delta_d} \tag{3}$$
where \(x_{ij}\) is the transverse impact parameter between the vertex and a linear fit to cluster j in view i, \(d_{ij}\) is the closest distance between the vertex and cluster and \(E_{j}\) is the cluster energy, taken as the integral of the hit waveforms converted to a modified GeV scale. The parameters \(\epsilon \), \(\delta _d\) and \(\delta _x\) are tunable constants: \(\epsilon \) determines the relative importance of the energy kick score, \(\delta _d\) protects against cases where \(d_{ij}\) is zero and controls weighting as a function of \(d_{ij}\), and \(\delta _x\) controls weighting as a function of \(x_{ij}\).
- Asymmetry score: This suppresses candidates incorrectly placed along single, straight clusters, by counting the numbers of hits (in nearby clusters) deemed upstream and downstream of the candidate position. For the true vertex, the expectation is that there should be a large asymmetry. In each view, a 2D principal axis is determined and used to define which hits are upstream or downstream of the projected vertex candidate. The difference between the numbers of hits is used to calculate a fractional asymmetry, \(A_{i}\), for view i:
$$S_{\text{asymmetry}} = \exp\left\{ \sum_{\text{view } i} \frac{A_i}{\alpha} \right\} \tag{4}$$
$$A_{i} = \frac{|N^{\uparrow}_{i} - N^{\downarrow}_{i}|}{N^{\uparrow}_{i} + N^{\downarrow}_{i}} \tag{5}$$
where \(\alpha \) is a tunable constant that determines the relative importance of the asymmetry score and \(N^{\uparrow }_{i}\) and \(N^{\downarrow }_{i}\) are the numbers of hits deemed upstream and downstream of the projected vertex candidate in view i.
- Beam deweighting score: For the reconstruction of beam neutrinos, knowledge of the beam direction can be used to preferentially select vertex candidates with low z coordinates:
$$S_{\text{beam deweight}} = \exp\left\{ -Z/\zeta \right\} \tag{6}$$
$$Z = \frac{z - z_{\mathrm{min}}}{z_{\mathrm{max}} - z_{\mathrm{min}}} \tag{7}$$
where \(\zeta \) is a tunable constant that determines the relative importance of the beam deweighting score and \(z_{\mathrm {min}}\) and \(z_{\mathrm {max}}\) are the lowest and highest z positions from the list of candidate vertices.
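Equations (1)-(7) can be transcribed directly. Note that the constants \(\epsilon \), \(\alpha \), \(\zeta \), \(\delta _x\) and \(\delta _d\) below are placeholder values, not the tuned MicroBooNE configuration:

```python
import math

# Direct transcription of Eqs. (1)-(7); the constants are placeholders,
# not the tuned MicroBooNE values.
EPSILON, ALPHA, ZETA = 3.0, 0.14, 0.42
DELTA_X, DELTA_D = 0.8, 5.0

def energy_kick_score(clusters):
    """clusters: dicts with E (energy), x (transverse impact parameter) and
    d (closest vertex-cluster distance), over all views; Eqs. (2)-(3)."""
    s = sum(c["E"] * (c["x"] + DELTA_X) / (c["d"] + DELTA_D) for c in clusters)
    return math.exp(-s / EPSILON)

def asymmetry_score(view_counts):
    """view_counts: (n_up, n_down) hit counts per view; Eqs. (4)-(5)."""
    total = sum(abs(up - down) / (up + down) for up, down in view_counts)
    return math.exp(total / ALPHA)

def beam_deweight_score(z, z_min, z_max):
    """Eqs. (6)-(7): suppress candidates far downstream in z."""
    return math.exp(-((z - z_min) / (z_max - z_min)) / ZETA)

def vertex_score(clusters, view_counts, z, z_min, z_max):
    """Eq. (1): the product of the three component scores."""
    return (energy_kick_score(clusters)
            * asymmetry_score(view_counts)
            * beam_deweight_score(z, z_min, z_max))
```

As expected from the definitions, a candidate with small impact parameters, large hit asymmetry and low z scores much higher than one with the opposite properties.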
Figure 5 shows the scores assigned to a number of vertex candidates in a typical simulated CC \(\nu _{\mu }\) event in MicroBooNE, including a breakdown of each score into its component parts. Following selection of the neutrino interaction vertex, any 2D clusters crossing the vertex are split into two pieces, one on either side of the projected vertex position.
Track and shower reconstruction
The PandoraNu 3D track reconstruction proceeds as described in Sect. 4.1.2. Unlike the cosmic-ray muon reconstruction, PandoraNu also attempts to reconstruct primary electromagnetic showers, from electrons and photons. An example of the typical topologies under investigation is shown in Fig. 7. PandoraNu performs 2D shower reconstruction by adding branches to any long clusters that represent “shower spines”. This procedure uses the following steps:
- The 2D clusters are characterised as track-like or shower-like, based on length, variations in sliding-fit direction along the length of the cluster, an assessment of the extent of the cluster transverse to its linear-fit direction, and the closest approach to the projected neutrino interaction vertex.
- Any existing track particles that are now deemed to be shower-like are dissolved to allow assessment of the clusters as shower candidates.
- Long, shower-like 2D clusters that could represent shower spines are identified. The shower spines will typically point back towards the interaction vertex.
- Short, shower-like 2D branch clusters are added to shower spines. The ShowerGrowing algorithm operates recursively, finding branches on a candidate shower spine, then branches on branches. For every branch, a strength of association to each spine is recorded. Branch addition decisions are then made in the context of the overall event topology.
Following 2D shower reconstruction, the 2D shower-like clusters are matched between readout planes in order to form 3D shower particles. The ideas described in Sect. 4.1.2 are re-used for this process. The ThreeDShowers algorithm builds a rank-three tensor to store cluster-overlap and relationship information, then a series of algorithm tools examine the tensor. Iterative changes are made to the 2D reconstruction to diagonalise the tensor and to ensure that 3D shower particles can be formed without ambiguity. Fits to the hit positions in 2D shower-like clusters are used to characterise the spatial extent of the shower. This identification of the shower edges uses separate sliding linear fits to just the hits deemed to lie on the two extremal transverse edges of candidate shower-like clusters. In order to calculate a ShowerOverlapResult for a group of three clusters, the shower edges from two are used to predict a shower envelope for the third cluster. The fraction of hits in the third cluster contained within the envelope is then stored, alongside details of the common cluster x-overlap. This procedure is illustrated in Fig. 8.
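The envelope-containment part of the ShowerOverlapResult reduces to counting hits inside a predicted band. In this sketch the two edges are arbitrary callables supplied by the caller (Pandora derives them from sliding linear fits to the extremal transverse edges of the other two clusters):

```python
# Sketch of the envelope-containment test: the two predicted shower edges
# define a band in (x, transverse) space, and we count the fraction of the
# third cluster's hits that fall inside it.
def enclosed_fraction(hits, lower_edge, upper_edge):
    """hits: list of (x, t) positions; lower_edge/upper_edge: t(x) callables."""
    inside = sum(1 for x, t in hits if lower_edge(x) <= t <= upper_edge(x))
    return inside / len(hits)
```

A high enclosed fraction, together with good x-overlap, is what the subsequent algorithm tools use to accept a three-cluster shower grouping.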
The shower tensor is first queried by the ClearShowers tool, which looks to form shower particles from any unambiguous associations between three clusters. The association between the clusters must satisfy quality cuts on the common x-overlap and fraction of hits enclosed in the predicted shower envelopes. The SplitShowers tool then looks to resolve ambiguities associated with splitting of sparse showers into multiple 2D clusters. This tool searches for 2D clusters that can be merged in order to ensure that each electromagnetic shower is represented by a single cluster from each readout plane.
After 3D shower reconstruction, a second pass of the 3D track reconstruction is applied, to recover any inefficiencies associated with dissolving track particles to examine their potential as showers. The ParticleRecovery algorithm then examines any groups of clusters that previously failed to satisfy the quality cuts for particle creation, due to problems with the hit-finding or 2D clustering, or due to significant detector gaps. Ideas from the earlier 3D track and shower reconstruction are re-used, but the thresholds for matching clusters between views are reduced. Finally, the ParticleCharacterisation algorithm classifies each particle as being either track-like or shower-like.
Particle refinement
The list of 3D track-like and shower-like particles can be examined and refined, to provide the final assignment of hits to particles. For MicroBooNE, the primary issue to address at this stage is the completeness of sparse showers, which can frequently be represented as multiple, separate reconstructed particles. A number of distinct algorithms are used:
- The ClusterMopUp algorithms consider 2D clusters that have been assigned to shower-like particles. They use parameterisations of the 2D cluster extents, including cone fits and sliding linear fits to the edges of the showers, to pick up any remaining, unassociated 2D clusters that are either bounded by the assigned clusters, or in close proximity.
- The SlidingConeParticleMopUp algorithm uses sliding linear fits to the 3D hits for shower-like particles. Local 3D cone axes and apices are defined and cone opening angles can be specified as algorithm parameters or derived from the topology of the 3D shower hits. The 3D cones are extrapolated and downstream particles deemed fragments of the same shower are collected and merged into the parent particle.
- The SlidingConeClusterMopUp algorithm projects fitted 3D cones into each view and searches for any remaining 2D clusters (not added to any particle) that are bounded by the projections.
- The IsolatedClusterMopUp algorithm dissolves any remaining unassociated 2D clusters and looks to add their hits to the nearest shower-like particle. The maximum distance allowed between a remaining hit and shower-like particle is configurable, with a default value of 5 cm.
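A minimal sketch of this final mop-up, assuming simple point lists for hits and particles:

```python
# Minimal sketch of the isolated-hit mop-up: each remaining hit is attached
# to the nearest shower-like particle, subject to the maximum-distance cut
# (5 cm by default, per the text).
def mop_up_hits(hits, particles, max_distance=5.0):
    """hits: list of (x, y, z); particles: dict name -> list of (x, y, z).
    Returns dict name -> hits adopted by that particle."""
    adopted = {name: [] for name in particles}
    for hit in hits:
        best, best_d2 = None, max_distance ** 2
        for name, phits in particles.items():
            for p in phits:
                d2 = sum((a - b) ** 2 for a, b in zip(hit, p))
                if d2 <= best_d2:
                    best, best_d2 = name, d2
        if best is not None:          # hits beyond the cut stay unassigned
            adopted[best].append(hit)
    return adopted
```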
Upon completion of these algorithms, only an insignificant number of 2D hits should remain unassigned to a particle.
Particle hierarchy reconstruction
The final step in the PandoraNu reconstruction is to organise the reconstructed particles into a hierarchy. The procedure used is:
- A neutrino particle is created and the 3D neutrino interaction vertex is added to this particle.
- The 3D hits associated with the reconstructed particles are considered and any particles deemed to be associated to the interaction vertex are added as primary daughters of the neutrino particle.
- Algorithm tools look to add subsequent daughter particles to the existing primary daughters of the neutrino; for example, a decay electron may be added as a daughter of a primary muon particle.
- If the primary daughter particle with the largest number of hits is flagged as track-like or shower-like, the reconstructed neutrino will be labelled as a \(\nu _{\mu }\) or a \(\nu _{e}\), respectively.
- 3D vertex positions are calculated for each of the particles in the neutrino hierarchy. The vertex positions are the points of closest approach to their parent particles, or to the neutrino interaction vertex.
Each slice results in a single reconstructed neutrino particle, with a hierarchy of reconstructed daughter particles. The particles reconstructed for a typical simulated CC \(\nu _{\mu }\) event in MicroBooNE are illustrated in Fig. 9.
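The daughter-vertex placement can be approximated by taking the daughter 3D hit closest to the parent trajectory, a discrete stand-in (our simplification, not Pandora's exact method) for the true point of closest approach:

```python
# Discrete stand-in for the point-of-closest-approach vertex placement:
# the daughter vertex is the daughter 3D hit closest to any parent 3D hit.
def daughter_vertex(daughter_hits, parent_hits):
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(daughter_hits, key=lambda h: min(d2(h, p) for p in parent_hits))
```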