Introduction

Genome sequencing and comprehensive gene expression analyses have been conducted on many insect species (Ellegren 2014; Oppenheim et al. 2015). Combined with the establishment of gene manipulation methods (Mello and Conte 2004; Adli 2018), these advances have allowed the molecular and neural bases of behavior to be elucidated in various non-model insect species beyond Drosophila (Sun et al. 2017; Mansourian et al. 2019; Walton et al. 2020). The European honey bee (Apis mellifera) is a well-known social insect, and its social behavior has been studied extensively for many years (Frisch et al. 1967; Winston 1987; Seeley 1995). In addition, genome editing and transgenic technologies have been established and applied to molecular- and neuro-ethological analyses in honey bees (Schulte et al. 2014; Kohno et al. 2016; Otte et al. 2018; Kohno and Kubo 2018, 2019; Roth et al. 2019; Carcaud et al. 2023), although the molecular and neural bases underlying honey bee behaviors largely remain to be elucidated.

Most innate behaviors of honey bees, such as nursing the brood, the division of labor among workers, and the waggle dance, have been described through observations and behavioral experiments in the field (Frisch et al. 1967; Seeley 1995), but genetically modified honey bees must be confined to laboratory conditions due to legal restrictions. To date, a simple and robust behavioral paradigm, olfactory conditioning of the proboscis extension reflex (PER), has been extensively utilized to analyze the abilities and mechanisms of learning and memory of honey bees inside a laboratory (Kuwabara 1957; Giurfa and Sandoz 2012; Eisenhardt 2014). Some previous studies have developed original devices for analyzing bee behaviors and sophisticated psychological experimental paradigms (Giurfa et al. 2001; Kirkerud et al. 2013, 2017; Schultheiss et al. 2017; Howard et al. 2018, 2019; Marchal et al. 2019; Nouvian and Galizia 2019; Geng et al. 2022), but these have not been widely adopted, owing to the uniqueness of their setups or to the skill and care required in handling bees. Against this background, developing a variety of robust (highly reproducible), simple, and versatile behavioral experimental systems that can be used inside a laboratory, beyond the olfactory PER associative learning paradigm, is important for advancing honey bee behavioral genetics.

We focused on the antennal response (AR) of honey bees as one such behavioral experimental system. Insect antennae are essential for various sensory receptions such as olfaction, gustation, and mechanoreception and are used to sense the external world (Vogt and Riddiford 1981; Staudacher et al. 2005; Hallem et al. 2006). Various insect species move their antennae in response to odorants, visual stimuli, and mechanical stimuli (Honegger 1981; Staudacher et al. 2005; Mamiya et al. 2011; Natesan et al. 2019). In honey bees, ARs to motion, odor, and mechanical stimuli and learning-dependent changes in AR to odors have been reported (Suzuki 1975; Erber et al. 1993; Erber and Kloppenburg 1995; Cholé et al. 2015; Gascue et al. 2022). In addition, antennal contact is essential in maintaining society through nestmate recognition and pheromone reception in eusocial insects, including honey bees (Ozaki et al. 2005; Sharma et al. 2015; Gomez Ramirez et al. 2023). To date, multi-animal tracking studies in eusocial insects using individual identification tags have revealed developmental changes in social behaviors and in the responses of individuals to different social circumstances, inferred from individual positions inside the nest and inter-individual interactions (Mersch et al. 2013; Crall et al. 2015, 2018; Wario et al. 2015; Ai et al. 2017; Stroeymeyt et al. 2018; Liberti et al. 2022). However, colony-level observations generally make it difficult to determine which behavioral components change and affect individual responses. Therefore, measuring and analyzing antennal movements in honey bees is expected to lead to a better understanding of the behavioral components that influence not only environmental recognition but also social behaviors.

In previous studies, the movements of insect antennae were measured by manually determining their position and angle in each video frame or by using phototransistors that register the movement of the antennae (Erber and Schildberger 1980; Erber et al. 1993; Erber and Kloppenburg 1995; Okada and Toh 2004). Recently, video analysis technologies have advanced, and methods for automatically tracking the movements of honey bee body parts have been reported, such as tracking color marks applied to the antennal tips or separating the outline of the antennae from the background by image processing (Cholé et al. 2015, 2022; Khurana and Sane 2016). In addition, a markerless posture-tracking tool using deep learning, DeepLabCut (Mathis et al. 2018; Nath et al. 2019), has been widely used in various research fields, such as ecology and neuroscience (Mathis and Mathis 2020), and is becoming recognized as a powerful tool that requires no special devices or expertise, is noninvasive, and is robust when analyzing videos with complex backgrounds. Recently, DeepLabCut has also been applied to honey bees and to bumble bees, close relatives of honey bees, to analyze the AR to odors with positive or negative valence (Gascue et al. 2022) and 3D antennal movements in response to odor stimuli (Claverie et al. 2023), yielding more fine-grained behavioral descriptions than traditional PER protocols, which output only a binary response.

This study aimed to enhance the experimental system for analyzing the ARs of honey bees using DeepLabCut. We focused on the AR to motion stimuli reported in previous studies, in which bees tilted their antennae in the opposite direction to the upward and downward motion stimuli in the transverse plane (Erber et al. 1993; Erber and Kloppenburg 1995) and confirmed that this AR was successfully detected by tracking antennal movements using DeepLabCut. In addition, we revealed that ARs in the coronal plane to forward and backward motion stimuli were observed in honey bees. An investigation of the developmental maturation of honey bee ARs showed that ARs to motion stimuli were not detectable in bees immediately after emergence but became mature through post-emergence development in an experience-independent manner. Furthermore, unsupervised clustering analysis using multidimensional data created by processing tracking data using DeepLabCut classified antennal movements into different clusters, suggesting the efficacy of data-driven analysis for the behavioral classification of honey bees.

Materials and Methods

Animal

European honey bee (Apis mellifera) colonies were purchased from Kumagaya Beekeeping Company (Saitama, Japan) and Okinawa Kariyushi apiary (Okinawa, Japan) and maintained at the University of Tokyo (Tokyo, Japan). Workers flying outside the hive were collected and used for the experiments examining ARs to motion stimuli and for the ‘flying (F)’ group in the experiment examining the developmental maturation of the AR. To test whether ARs are acquired through each individual’s flight experience or through developmental maturation after emergence, we collected dozens of newly emerged workers, identified by their fuzzy appearance (Winston 1987), from the hives and divided them into four groups: the ‘newly emerged (NE)’ group, whose AR was analyzed on the day of collection; the ‘colony-reared (Cr)’ group, which was returned to the colony without any treatment; the ‘colony-reared with one wing removed (CrWR)’ group, which had one wing removed and was returned to the colony; and the ‘incubator-reared (Ir)’ group, which was maintained in an acrylic cage (95 × 55 × 110 mm) with a piece of honeycomb and approximately 30 workers captured from the hive, and kept in an incubator at 34 °C with honey and water fed ad libitum. Bees in the different groups were marked on their thoraxes with different colors using paint markers (POSCA, Japan). After rearing in the hive or incubator for ten days, bees in the Cr, CrWR, and Ir groups were collected from the hive or acrylic cage and used in the experiment.

Recording Antennal Movement During Presenting Motion Stimuli to Bees

The collected workers were anesthetized on ice and fixed in a P-1000 pipette tip with masking tape. After awakening from anesthesia, bees were fed 4 µL of 30% (w/v) sucrose solution. Black and white vertical stripes 6 mm wide were displayed on two LED monitors (GW2283, BenQ; 60 Hz refresh rate) placed facing each other at a 30° angle, and motion stimuli were presented to a bee set between the monitors by moving these stripes at 24 mm s−1 using PowerPoint animation (Fig. 1). Bees were set vertically when vertical motion stimuli were presented and horizontally when horizontal motion stimuli were presented. After familiarization for at least 3 min, the vertical or horizontal motion stimuli were each presented alternately twice with an interstimulus interval of approximately 10 s (Fig. 1a). The heads of the bees were recorded at 30 fps (frames per second) with a resolution of 1024 × 768 by a Raspberry Pi camera module (Kumantech) set above the bees during the experiment. To capture the entire antennae, the camera was tilted at 20° when the bees were set horizontally (Fig. 1d).

Fig. 1

Overview of the experimental setup for the presentation of motion stimuli to a fixed bee. a The time course for the experiments. Blue boxes indicate the periods during the presentation of upward (U1 and U2) or backward (B1 and B2) motion stimuli, and magenta boxes indicate the periods during the presentation of downward (D1 and D2) or forward (F1 and F2) motion stimuli. b A picture of the experimental setup. c Schematic diagram of the experimental setup viewed from the top. The positions of the monitors and a bee are shown. d Schematic diagrams of the side view of the experimental setup in the experiment for the presentation of vertical (left) or horizontal (right) motion

Automatic Tracking of Antennal Movement Using DeepLabCut

Antennal movements were tracked using the DeepLabCut software (DeepLabCut GUI v2.2.2) (Mathis et al. 2018; Nath et al. 2019). A total of 200 frames were extracted from videos of 10 individuals (20 frames per individual) to create a training dataset for each of the vertical and horizontal motion presentation experiments. The following eight points were manually labeled on each extracted frame: the base, middle (the joint between the scape and flagellum), and tip of each antenna, the central ocellus, and the middle of the ventral clypeus (for the vertical motion presentation experiment) or the middle of the posterior part of the head (for the horizontal motion presentation experiment). We trained the networks on these data with default parameter settings (ResNet50, 500,000 iterations) and evaluated the trained networks by checking the labeled frames. Frames with low-likelihood labeling were extracted for additional training, and the networks were retrained after manually correcting the mislabeled points with the same settings as in the first training. The resulting networks were used to track the antennal movements in each video.

Data Analysis

Data analyses were performed using R version 4.1.0. The R package ‘dlcpr’ was used to load the tracking data for each stimulus for each individual as tidy data. Tracked points with a low likelihood (< 0.90) were linearly interpolated from the points in the previous and following frames using the ‘imputeTS’ package. The angle of each antenna was calculated as the angle of the line connecting the base and tip of that antenna relative to the centerline of the head connecting the central ocellus and the ventral center (for the vertical motion presentation experiment) or the posterior center (for the horizontal motion presentation experiment). For each antenna, the density plot and the arrow of the average angle were drawn using all the frames for each stimulus with the ‘circular’ package. One-way analysis of variance (ANOVA) and post hoc Tukey’s honest significant difference (HSD) tests were performed on the average antennal angle of both antennae of each individual during the presentation of each stimulus. A Kruskal-Wallis test and post hoc Steel-Dwass test were used to compare the direction-specific antennal response (DAR). The angular range of the antennae of each individual was calculated by subtracting the minimum from the maximum antennal angle in the tracking data for the spontaneous antennal movements. The moving distance of spontaneously moving antennae was calculated from the tracking data for the tips of both antennae of each individual. One-way ANOVA and post hoc Tukey’s HSD tests were performed on the angular range and moving distance of each group. Unsupervised clustering was performed using data obtained by tracking the full-length video of each individual in the vertical motion presentation experiment.
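The antennal-angle computation described above can be sketched as follows. The original analysis was performed in R (with the ‘circular’ package); this Python fragment only illustrates the geometry, and the function name and signed-vector formulation are our own illustrative choices, not the authors' code:

```python
import math

def antennal_angle(base, tip, head_top, head_bottom):
    """Angle (in degrees) between the base->tip antennal vector and the
    head centerline (head_top->head_bottom), as in the vertical motion
    presentation experiment.  All inputs are (x, y) tuples taken from
    the tracking output.  Hypothetical helper; the study's actual
    analysis was done in R."""
    ax, ay = tip[0] - base[0], tip[1] - base[1]                          # antennal vector
    cx, cy = head_bottom[0] - head_top[0], head_bottom[1] - head_top[1]  # centerline vector
    cross = ax * cy - ay * cx
    dot = ax * cx + ay * cy
    # unsigned angle between the two vectors, in [0, 180]
    return abs(math.degrees(math.atan2(cross, dot)))
```

For instance, an antenna held perpendicular to the head centerline yields an angle of 90°, close to the approximate spontaneous average reported in the Results.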
To standardize each frame, all tracking points were translated such that the dorsal center was at the origin, rotated such that the ventral center lay on the y-axis, and scaled so that the distance from the dorsal center to the ventral center was constant. For every 10 frames (0.33 s), the maximum, minimum, mean, and standard deviation of the x- and y-coordinates were calculated for eight points: the base, middle, and tip of each antenna and the centroid of these three points for each antenna. The resulting data points, with a total of 64 dimensions, were used to determine the appropriate number of clusters by calculating the gap statistic (Tibshirani et al. 2001) and were clustered into eight clusters using the Hartigan–Wong k-means algorithm. Clustering results were visualized using uniform manifold approximation and projection (UMAP). The corresponding stimulus (spontaneous, downward, or upward) in each frame was manually annotated.
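The standardization and windowing steps above can be illustrated with a minimal Python sketch (the original pipeline was written in R; the function names, tuple layout, and use of the population standard deviation are our assumptions):

```python
import math
from statistics import mean, pstdev

def standardize(points, dorsal, ventral):
    """Translate all (x, y) points so the dorsal center sits at the
    origin, rotate so the ventral center lies on the positive y-axis,
    and scale so the dorsal-ventral distance equals 1."""
    dx, dy = ventral[0] - dorsal[0], ventral[1] - dorsal[1]
    d = math.hypot(dx, dy)
    c, s = dy / d, dx / d          # rotation mapping (dx, dy) onto (0, d)
    out = []
    for x, y in points:
        tx, ty = x - dorsal[0], y - dorsal[1]        # translate
        rx, ry = tx * c - ty * s, tx * s + ty * c    # rotate
        out.append((rx / d, ry / d))                 # scale
    return out

def window_features(frames):
    """Collapse a 10-frame window into a 64-dimensional vector: the
    maximum, minimum, mean, and SD of the x- and y-coordinates of eight
    points (base, middle, and tip of each antenna plus each antenna's
    centroid).  Each frame is a list of eight (x, y) tuples."""
    feats = []
    for i in range(len(frames[0])):
        for series in ([f[i][0] for f in frames], [f[i][1] for f in frames]):
            feats += [max(series), min(series), mean(series), pstdev(series)]
    return feats
```

Vectors built this way would then be clustered with k-means (e.g. base R's `kmeans`, whose default algorithm is Hartigan-Wong) and embedded with UMAP for visualization.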

Results

Antennal Response to Motion Stimuli

First, we examined whether the ARs to upward and downward motion stimuli in the transverse plane, reported in previous studies (Erber et al. 1993; Erber and Kloppenburg 1995), could be detected in our experimental system by automated tracking using DeepLabCut. Vertical motion stimuli were presented to a fixed worker between the monitors (Fig. 1c, d, and 2a). The movements of the left and right antennae were tracked using DeepLabCut during the presentation of each stimulus, and the angles of the antennae from the centerline of the head in the transverse plane were calculated (Fig. 2b, Movie 1). The angular distribution across all frames during the presentation of upward or downward motion stimuli showed that bees tended to move their antennae in the direction opposite to the motion stimuli (Fig. 2c). The angular distribution of the spontaneous antennal movement was rather similar to that for the downward motion stimulus; however, the average angle across all frames of spontaneous movement lay between the average angles during upward and downward motion presentation (Fig. 2c). Furthermore, we calculated the average angles of both antennae of each bee for each stimulus and found significant differences among the stimuli: approximately 90°, 60°, and 100° for spontaneous movement and for responses to upward and downward motion stimuli, respectively (Fig. 2d). These results are consistent with those of previous studies (Erber et al. 1993; Erber and Kloppenburg 1995), confirming the applicability of our automated tracking system.

Fig. 2

ARs in the transverse plane to vertical motion stimuli. a Schematic diagram of the experimental setup to present the vertical motion stimuli displayed on the monitors to a fixed honey bee. A bee was fixed in a plastic tip (light blue) with masking tape (yellow). Magenta and blue arrows indicate downward and upward motions, respectively. b One frame of the video after processing using DeepLabCut (left) and the schematic diagram of angles of the left (θL) and right (θR) antennae (right) in the transverse plane. c Density plots of θL and θR for all frames (15,000 frames per stimulus; 25 individuals, 10 s × 2 for each stimulus, recording at 30 fps) of spontaneous movements or during the presentation of the upward or downward motion stimulus. The angles of arrows in the circles indicate the average antennal angles of all frames for each stimulus, and their lengths indicate the lengths of the summed unit vectors of the antennal angles for all frames divided by the number of frames. d Comparison of ARs among stimuli. Each dot represents the average angle of both antennae for each individual while each motion stimulus was presented. n = 25 for each stimulus. *** : p < 0.001, ** : p < 0.01 by Tukey’s HSD test

The previous study also examined the ARs to horizontal motion stimuli by measuring the antennal angles in the transverse plane and reported that the antennal angles in response to horizontal motion stimuli were smaller than those during spontaneous movements (Erber et al. 1993). Although the antennal angles in the transverse plane in response to forward and backward motion stimuli did not differ in that study, we assumed that bees would exhibit direction-specific ARs even to horizontal motion stimuli if the antennal angles were measured in the coronal plane, which is parallel to the horizontal motion stimuli. To test this possibility, we next analyzed the AR to backward and forward motion stimuli by measuring the antennal angles in the coronal plane. A worker fixed in a plastic tip was placed horizontally between the monitors to present horizontal motion stimuli (Fig. 1c, d, and 3a). Antennal movements were tracked as in the experiment for the AR to vertical motion stimuli, and the angles of the antennae from the centerline in the coronal plane were calculated (Fig. 3b, Movie 2). As for the AR to vertical motion stimuli, the angular distribution across all video frames during the presentation of each stimulus showed that bees tended to move their antennae in the direction opposite to the motion stimuli, and the average angle of spontaneously moving antennae was between those during the presentation of each stimulus (Fig. 3c). The average angles of both antennae of each bee also differed significantly among the stimuli: approximately 65°, 45°, and 70° for spontaneous movement and for responses to backward and forward motion stimuli, respectively (Fig. 3d). The differences in the average angles among stimuli were smaller in the coronal plane than in the transverse plane because of the narrower range of antennal movements in the coronal plane.
These results clearly indicated that workers respond to external motion stimuli by moving their antennae in the direction opposite to the motion in at least four directions: upward, downward, backward, and forward.

Fig. 3

ARs in the coronal plane to horizontal motion stimuli. a Schematic diagram of the experimental setup to present the horizontal motion stimuli displayed on the monitors to a fixed honey bee. A bee was fixed in a plastic tip (light blue) with masking tape (yellow). Magenta and blue arrows indicate forward and backward motions, respectively. b One frame of the video after processing using DeepLabCut (left) and the schematic diagram of angles of the left (θL) and right (θR) antennae (right) in the coronal plane. c Density plots of θL and θR for all frames (15,000 frames per stimulus; 25 individuals, 10 s × 2 for each stimulus, recording at 30 fps) of spontaneous movements or during the presentation of the backward or forward motion stimulus. The angles of arrows in the circles indicate the average antennal angles of all frames for each stimulus, and their lengths indicate the lengths of the summed unit vectors of the antennal angles for all frames divided by the number of frames. d Comparison of antennal responses among stimuli. Each dot represents the average angle of both antennae for each individual while each motion stimulus was presented. n = 25 for each stimulus. *** : p < 0.001, * : p < 0.05 by Tukey’s HSD test

Developmental Maturation of AR After Emergence

A previous study reported that newly emerged workers exhibit limited ARs to some odorants and pheromones compared with older workers, suggesting that these ARs are acquired during adult life (Cholé et al. 2022). Thus, we next compared the ARs to vertical motion stimuli of newly emerged (NE) and flying (F) workers to investigate whether ARs to visual stimuli also mature after emergence. We calculated the direction-specific antennal response (DAR), the angular difference between the average antennal angles during upward and downward motion presentation for each bee (Fig. 4a) (Erber et al. 1993; Erber and Kloppenburg 1995), and compared the results between groups. The F group had a significantly larger DAR than the NE group, whose DAR was almost zero (Fig. 4b). Bees in the NE group showed similar angular distributions across all frames and similar average angles during the presentation of each stimulus (Fig. S1a, b), in contrast to the results for the F group (Fig. S1c, d).
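The DAR is a simple difference of per-stimulus mean angles; for the small angular spreads involved it can be approximated with ordinary (rather than circular) means, as in this illustrative sketch (the function name and sign convention are our assumptions, not the authors' definition):

```python
from statistics import mean

def direction_specific_response(upward_angles, downward_angles):
    """DAR for one bee: the difference between the average antennal
    angle (degrees, both antennae pooled) during downward and upward
    motion presentation.  With the angle convention of Fig. 2 (larger
    angles during downward responses), a mature response gives a
    positive DAR, and no direction specificity gives a DAR near zero."""
    return mean(downward_angles) - mean(upward_angles)
```

With the approximate averages reported for flying workers (about 100° during downward and 60° during upward motion), this gives a DAR of roughly 40°, whereas near-identical averages across stimuli, as in the NE group, give a DAR near zero.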

Fig. 4

Development of AR to vertical motion stimuli. a Schematic diagram of DAR. The difference between the mean angles of antennae during the presentation of upward and downward motions is defined as DAR. b Comparison of DARs of bees in the groups analyzed at different developmental stages (NE, F) or with different treatments (Cr, CrWR, Ir). * : p < 0.05 by Steel-Dwass test. c, d Comparison of the angular ranges (c) and the moving distances (d) of antennae of bees in different groups. ** : p < 0.01, * : p < 0.05 by Tukey’s HSD test. n = 6, 6, 6, 6, 7 for NE, F, Cr, CrWR, Ir, respectively. NE: newly emerged, F: flying, Cr: colony-reared, CrWR: colony-reared with one wing removed, Ir: incubator-reared

Because the motion presented to the compound eyes is opposite to the direction in which a bee moves under natural conditions, moving the antennae in the direction opposite to the motion stimuli displayed on the monitors implies that bees were trying to direct their antennae toward the direction in which they would be moving, for example, to recognize objects around the landing site (Evangelista et al. 2010). Therefore, the difference in DAR between the NE and F groups suggests that the AR to motion stimuli is acquired either through each individual’s flight experience or through developmental maturation after emergence. To test these possibilities, we divided newly emerged workers collected from a hive into three groups and reared them under different conditions: (1) those returned to the same hive from which they were collected with no treatment (colony-reared [Cr]), (2) those returned to the same hive with the two wings on one side removed (colony-reared with wings removed [CrWR]), and (3) those reared in a small acrylic cage containing approximately 30 adult workers in an incubator (incubator-reared [Ir]). The bees in the CrWR and Ir groups could not fly and thus could not acquire the AR through flight experience. The ARs to motion stimuli of bees in the Cr, CrWR, and Ir groups were analyzed 10 days after the treatments; because the first orienting flight of workers has been reported to occur as early as 4 days of age (Winston 1987), bees in the Cr group were considered to have flight experience. The DARs of bees in the Cr, CrWR, and Ir groups were significantly different from those in the NE group. In contrast, they were comparable to that of the F group and not significantly different from each other (Fig. 4b). In addition, the average angle of the antennae during the presentation of each stimulus was not constant across stimuli in the Cr, CrWR, and Ir groups (Fig. S1e-j).

To examine whether the smaller DAR in the NE group could simply be explained by lower antennal mobility due to, for example, underdevelopment of the antennal muscles, we compared the angular ranges and the moving distances of the antennae of bees in the different groups during spontaneous antennal movements (10 s × 2 times). Both the angular range and moving distance of antennae in the NE group tended to be smaller than those in the other groups (Fig. 4c, d). In particular, the angular ranges in the Cr and Ir groups and the moving distances in the F, Cr, and CrWR groups were significantly larger than those in the NE group. However, the angular range of antennae in the NE group was approximately 130° (Fig. 4c), which was larger than the DARs of bees in any group (all less than 80°) (Fig. 4b). Therefore, the smaller DAR in the NE group cannot be explained solely by differences in antennal mobility. These results indicate that the AR to motion develops with age in a flight experience-independent manner, suggesting that this response is an innate honey bee behavior.
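The two mobility measures compared above can be sketched as follows (hypothetical helper names; the original analysis was performed in R):

```python
import math

def angular_range(angles):
    """Angular range of one antenna: maximum minus minimum tracked
    angle (degrees) over the spontaneous-movement frames."""
    return max(angles) - min(angles)

def path_length(tip_positions):
    """Moving distance of an antennal tip: the summed frame-to-frame
    Euclidean steps along its tracked (x, y) trajectory."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(tip_positions, tip_positions[1:]))
```

Note that the angular range bounds the DAR from above for a given bee, which is why an NE angular range of about 130° rules out mobility as the sole explanation for a near-zero DAR.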

Classification of AR by Unsupervised Clustering

Animal tracking data can be used for unsupervised behavioral classification (Fujimori et al. 2020; Huang et al. 2021; Segalin et al. 2021). Therefore, we examined whether data-driven analysis could classify the tracking data of the antennal movements. Antennal movements were tracked using DeepLabCut for all video frames of the 25 individuals in the vertical motion presentation experiment. After the tracking points of each frame were standardized (see Methods for details), the maximum, minimum, mean, and standard deviation of the x- and y-coordinates were calculated every 10 frames (0.33 s) for the following eight points: the base, middle, and tip of each antenna and the centroid of these three points. The resulting 64-dimensional data points were divided into eight clusters, the appropriate number estimated by the gap statistic (Tibshirani et al. 2001), and visualized using uniform manifold approximation and projection (UMAP) (Fig. 5a). While the data points for spontaneous movements were distributed almost uniformly on the UMAP, those during the presentation of upward and downward motion stimuli were distributed nonuniformly, with gradients in opposite directions (Fig. 5b). The proportions of data points for spontaneous movements included in each cluster were almost uniform (approximately 50% in each cluster), whereas those during the presentation of the downward and upward motion stimuli varied across clusters (Fig. 5c). As each video contained frames from more than 10 s before the first stimulus presentation to a few seconds after the last stimulus, the number of data points corresponding to spontaneous movements was greater than the sum of the numbers of data points during the presentation of the upward and downward motion stimuli. In addition, we examined how the data points corresponding to spontaneous movements or to the presentation of the upward and downward motion stimuli were distributed across the eight clusters.
The data points for spontaneous movements were classified relatively uniformly into all clusters, with a slight tendency to be classified more into Cluster 3 and less into Cluster 2 (Fig. 5d). The data points during the presentation of the upward motion stimulus were predominantly classified into Clusters 2 and 4 and less frequently into Cluster 3 (Fig. 5e). The data points during the presentation of the downward motion stimulus were classified more into Cluster 3 and less into Clusters 2 and 4, the opposite tendency to that during the presentation of the upward motion stimulus (Fig. 5f). All clusters included data points from at least 23 of the 25 individuals, and no cluster contained a high percentage of data points from any particular individual, suggesting that the biases described above were not due to variation among individuals, such as variation in the AR, in the angle at which bees were fixed in the experimental setup, or in the distance from the head of the bee to the monitors (Fig. S2).
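The per-cluster stimulus proportions examined above amount to a simple cross-tabulation of cluster assignments against stimulus annotations; a minimal sketch (names are illustrative, not from the authors' scripts):

```python
from collections import Counter

def stimulus_proportions(cluster_labels, stimulus_labels):
    """For each cluster, the fraction of its data points annotated with
    each stimulus (spontaneous / upward / downward).  Both inputs are
    parallel sequences, one entry per 10-frame data point."""
    counts = {}
    for c, s in zip(cluster_labels, stimulus_labels):
        counts.setdefault(c, Counter())[s] += 1
    return {c: {s: n / sum(cnt.values()) for s, n in cnt.items()}
            for c, cnt in counts.items()}
```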

Fig. 5

Unsupervised clustering of antennal movements. a UMAP visualization of clusters of data points processed from tracking data of all individuals for every 10 frames. b Distribution of data points corresponding to spontaneous movements (gray) or during the presentation of upward motion (blue) and downward motion (magenta) on the UMAP. c The proportion of data points corresponding to each stimulus in each cluster. Gray indicates the proportion of spontaneous movements, whereas blue and magenta indicate those of the upward and downward motion stimuli, respectively. d-f Pie charts of the proportions of data points classified into each cluster among all data points corresponding to spontaneous movements (d), the upward motion stimulus (e), and the downward motion stimulus (f). g, h Boxplots of the left (g) and right (h) antennal angles for each cluster. The antennal angles of each data point were calculated from the averages of the x- and y-coordinates of the tip of each antenna

Since the distribution of data points across clusters and the average antennal angle of each individual varied depending on the presented motion stimuli (Fig. 2c, d, and 5b-f), we measured the antennal angle for each cluster to identify the properties that characterize each cluster. Because we noticed, when observing the frames corresponding to each cluster, that the bees often moved their left and right antennae independently, we calculated the angle of each antenna separately. The antennal angles varied among clusters; for example, the angles of both antennae in Cluster 3 tended to be large, whereas those in Clusters 2 and 4 were small, and the difference between the angles of the left and right antennae in Clusters 1 and 6 was large compared with the other clusters (Fig. 5g, h). That the data points in Cluster 3 had large angles for both antennae and those in Clusters 2 and 4 had small angles is reasonable, considering the differences in the distribution of data points (Fig. 5c, e, f) and in the average angles (Fig. 2d) during the presentation of upward and downward motion stimuli. These results suggest that a data-driven analysis using data points processed from tracking data can be used to classify antennal movements based on their characteristics.

Discussion

We investigated the AR to motion stimuli in honey bees using video analysis software based on deep learning, DeepLabCut, which has recently been utilized for analyzing ARs to odor stimuli in honey bees and bumble bees (Gascue et al. 2022; Claverie et al. 2023). Our video analysis successfully detected ARs in honey bees: honey bees moved their antennae in the direction opposite to vertical motion stimuli in the transverse plane, as reported in previous studies (Erber et al. 1993; Erber and Kloppenburg 1995). In addition, we found that honey bees exhibit a direction-specific AR to horizontal motion stimuli, moving their antennae in the direction opposite to the motion stimuli, when the antennal angles were measured in the coronal plane. However, compared with previous studies (Erber et al. 1993; Erber and Kloppenburg 1995), the antennal angles and DAR in response to upward and downward motion stimuli tended to be larger in the present study. This might be due to differences in the experimental setup, such as the width of the stripes used to present motion stimuli, the distance between the monitors and bees, and the speed of the motion stimuli. It could also be due to differences in the measurement devices, because previous studies used only four phototransistors to measure the angles of each antenna and could not measure angles at as fine a resolution as motion tracking by DeepLabCut in this study. Nevertheless, the tendency to move the antennae in the direction opposite to the motion stimulus is consistent between the previous and present studies, showing that the AR to motion is a robust behavioral property. Future studies using 3D tracking of antennal movements, which has already been utilized in other bee species (Claverie et al. 2023), will provide a more comprehensive understanding of ARs to motion stimuli in the honey bee.

AR to motion has been observed in flying insect species other than the honey bee, such as fruit flies and hawk moths, and is believed to play an important role in sensory perception during flight, in combination with ARs to other sensory information (Mamiya et al. 2011; Krishnan and Sane 2014). Although we examined AR only to motion at a constant speed (a visual stimulus), previous studies have examined how insects control their antennal position during exposure to airflow (a physical stimulus) or to airflow and visual stimuli together (Khurana and Sane 2016; Natesan et al. 2019). Considering that a recent study used DeepLabCut to examine AR to odors with positive or negative valence in honey bees (Gascue et al. 2022), our experimental setup coupled with automated tracking using DeepLabCut could easily be expanded to examine AR to multimodal sensory information by simultaneously presenting motion (visual), airflow (physical), and odor (olfactory) stimuli. This would deepen our understanding of how insects control their antennal movements to properly acquire sensory feedback in nature.

In honey bees, direction-specific AR to motion stimuli has been reported to be affected by the administration of serotonin and octopamine into the optic lobes, the primary visual centers of the insect brain. The effect of serotonin on AR was observed when it was injected into the lobula, medulla, or lamina, whereas that of octopamine was observed only when injected into the lobula, suggesting a particularly important role of the lobula in AR to motion stimuli (Erber and Kloppenburg 1995). In Drosophila, some neurons that project to the lobula plate respond specifically to each of four motion directions (upward, downward, forward, and backward) (Fischbach and Dittrich 1989). Notably, the motor neurons that control antennal movement in the honey bee extend dendrites into the dorsal lobe of the antennal lobe, where projections from the lobula terminate (Maronde 1991; Kloppenburg 1995). The present study revealed AR to each of the four motion directions (upward, downward, forward, and backward), suggesting that AR may arise in response to direction-specific neuronal input from the lobula to the motor neurons regulating antennal movements. This study also revealed that ARs to motion stimuli are not experience-dependent but are instead acquired during development after emergence. Because workers engage in in-hive tasks for a certain period after emergence (Winston 1987; Seeley 1995) and do not need to respond to visual stimuli during that time, the inability of newly emerged workers to respond to motion stimuli is not expected to significantly impact their survival. Since previous immunohistochemical studies have revealed the presence of serotonin, the serotonin receptor 5-HT1A, and octopamine in the lobula of the honey bee (Schürmann and Klemm 1984; Sinakevitch et al. 2005; Thamm et al. 2010), investigating changes in their expression patterns after emergence in correlation with the development of AR could help elucidate the molecular and neural mechanisms of AR to motion stimuli.

Finally, the data points processed from the DeepLabCut tracking data were classified by data-driven analysis into clusters with different antennal movement characteristics. In contrast to the analysis focusing on the antennal angle in each frame, in which the responses of the left and right antennae were averaged and differences between them could not be detected (Fig. 2), the data-driven analysis successfully assigned data points with different left and right antennal movements to different clusters, demonstrating the efficacy of hypothesis-free, data-driven analysis. Honey bees use their antennae not only to perceive sensory cues from the environment but also to interact with nestmates (Winston 1987). Therefore, tracking antennal movements during inter-individual interactions, followed by data-driven clustering, may help unveil how nestmate recognition and interactions between individuals engaged in different tasks are reflected in antennal movements. Recently, behavior classification by data-driven analysis has been used to explore behavior-related brain regions in combination with neural activity mapping, and to analyze the phenotypes of individuals whose gene functions or neuronal activities have been manipulated (Huang et al. 2021; Markowitz et al. 2023). Since genome editing and transgenesis have recently been applied to the honey bee (Kohno and Kubo 2018; Roth et al. 2019; Değirmenci et al. 2020; Chen et al. 2021; Nie et al. 2021; Wang et al. 2021; Carcaud et al. 2023; Cheng et al. 2023), data-driven behavioral analysis is expected to contribute to elucidating the molecular and neural bases of honey bee social behaviors, which remain largely unknown, in the near future.
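The clustering step can be sketched in the same spirit. This is a minimal, hypothetical example and not the exact method of this study: the per-frame features (z-scored left and right antennal angles) and the use of a plain k-means with a deterministic farthest-point initialization are illustrative assumptions, standing in for whatever dimensionality reduction and clustering the actual pipeline employs.

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Tiny k-means (Lloyd's algorithm) with a deterministic
    farthest-point initialization; for illustration only."""
    centers = [X[0]]
    for _ in range(1, k):
        # next center: the point farthest from all chosen centers
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # leave empty clusters unchanged
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_antennal_frames(left_deg, right_deg, k=6):
    """Group frames by their left/right antennal angles (degrees).

    left_deg, right_deg: (n_frames,) arrays, e.g. the output of the
    per-frame angle measurement; returns a (n_frames,) label array.
    """
    feats = np.column_stack([left_deg, right_deg]).astype(float)
    # z-score each feature so both antennae contribute equally
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)
    return kmeans(feats, k)
```

Because each frame keeps its left and right angles as separate features, frames in which the two antennae move independently fall into different clusters instead of being averaged away, which is the key advantage of this analysis over the per-frame mean-angle comparison.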