Forage area estimation in European honeybees (Apis mellifera) by automatic waggle decoding of videos using a generic camcorder in field apiaries


The waggle dances of European honeybees provide important information that can be used to estimate forage areas and identify food resource limitations. However, manually decoding these dances is labor-intensive. This study develops an automatic waggle decoding method applicable to video recordings taken in field apiaries using a generic camcorder with a normal frame rate. Particle image velocimetry was used to detect the typical characteristics of abdominal waggling in bees. We demonstrated our proposed method using video recordings taken at three hives in field apiaries. The decoded information was used to estimate forage area, which was compared against estimates obtained from manual decoding. For all three video recordings, we obtained a 78–87% overlap in the probable forage regions estimated using automatic and manual decoding. Our results suggest that our automatic decoding method is comparable to manual interpretation for the purposes of forage area estimation.


The European honeybee (Apis mellifera) plays a critical role in global honey production and crop pollination. As the demand for managed crop pollinators has risen in recent decades, a stable supply of bee colonies has become increasingly important for maintaining food production (Aizen and Harder, 2009; Aizen et al., 2008; Klein et al., 2007). Whereas commercial bumblebee production for crop pollination is now industrialized in enclosed environments (Velthuis and van Doorn, 2006), honeybee-rearing is still reliant on open environments, which are increasingly being impacted by land-use change and intensive agricultural management (IPBES, 2016). In the face of these changes, providing adequate floral resources around field apiaries is crucial for maintaining crop pollination (Requier et al., 2015). Consequently, much attention has focused on identifying the forage area of honeybees in order to understand seasonal and spatial patterns in food resource preference and limitation (Couvillon et al., 2014a, 2014b; Park and Nieh, 2017; Steffan-Dewenter and Kuhn, 2003), as well as to assess the risk of pesticide exposure (Danner et al., 2014; Garbuzov et al., 2015) and evaluate whether managed pollinators are effective for the target crop (Balfour and Ratnieks, 2017).

The honeybee waggle dance is a communication tool used by foraging bees to relay the location of attractive resources to their hive (von Frisch, 1967). The information derived from decoding this dance can be used to estimate forage area (Schürch et al., 2013). However, manual decoding protocols are labor-intensive, both in real time and with digital video recordings (Wario et al., 2017). To address this issue, recent studies have used advances in computer technology to detect waggle dances automatically from video data (Kimura et al., 2011; Wario et al., 2015, 2017). Yet most such techniques require a laboratory environment equipped with artificial lighting or a relatively high-performance camcorder recording at a high frame rate. Consequently, these techniques are difficult to apply to video recordings taken under natural light in field apiaries, which is precisely where forage area estimates are needed for efficient colony management.

The present paper develops a new method applicable to videos from field apiaries using a generic camcorder recording at a standard frame rate. Some existing methods can successfully extract dancing behavior from trajectories recorded using video tracking software equipped with image recognition technology to identify individual bees (e.g., Feldman and Balch, 2004; Kimura et al., 2011; Takahashi et al., 2017). Such methods for labeling and tracking multiple bees are necessary to identify waggle dances, which comprise a series of waggle and return runs performed by an individual bee, as well as more generally to understand behavior and communication in social animals (Kimura et al., 2011, 2014). However, these methods are not yet capable of decoding waggle dances automatically. Wario et al. (2017) proposed a method for detecting waggle runs without trajectory monitoring of labeled bees, which would be a practical and time-saving innovation for estimating forage area. In this study, we adopt the same approach (Wario et al., 2017), which focuses on detecting waggle runs through framewise pixel-based image analyses to decode run duration (corresponding to the distance from hive to forage site) and orientation relative to vertical (corresponding to the bearing to a forage site relative to the sun’s azimuth). Since the purpose of our proposed method is to analyze video recordings from field apiaries, we assumed the use of generic camcorders and developed an analytical framework to detect the typical characteristics of the abdominal waggling movement using less sophisticated equipment than in similar studies such as Wario et al. (2017).

Materials and methods

Automatic waggle detection

Particle image velocimetry (PIV) was employed to detect abdominal waggling in dancing bees. PIV is usually used to measure velocity fields in experimental fluid mechanics. In this technique, the motions of illuminated tracer particles in a fluid flow are analyzed to calculate fluid movement by measuring particle displacement in time-consecutive image frames (Adrian, 1991). To obtain velocity fields, an adequate number of lattice points are generated for each image, and an interrogation window is centered on each lattice point. Then, the displacement velocity for each interrogation window is calculated by finding the most similar image area within a search window in the next frame using a cross-correlation analysis (Keane and Adrian, 1992).
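The window-matching step can be illustrated with a minimal, pure-NumPy sketch. Real PIV software (including the Flownizer2D package used in this study) employs FFT-based correlation and sub-pixel peak fitting; the function name, window sizes, and synthetic frames below are purely illustrative:

```python
import numpy as np

def piv_displacement(frame_a, frame_b, y, x, win=8, search=4):
    """Estimate the displacement of the interrogation window centred at
    (y, x) between two frames by locating the peak of the mean-removed
    cross-correlation within +/- `search` pixels (a minimal sketch of
    the PIV principle described in the text)."""
    h = win // 2
    template = frame_a[y - h:y + h, x - h:x + h].astype(float)
    template -= template.mean()
    best, best_dyx = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = frame_b[y - h + dy:y + h + dy,
                           x - h + dx:x + h + dx].astype(float)
            score = float((template * (cand - cand.mean())).sum())
            if score > best:
                best, best_dyx = score, (dy, dx)
    return best_dyx  # (dy, dx) in pixels per frame interval

# Synthetic check: a bright blob shifted by (2, 3) pixels between frames.
a = np.zeros((32, 32)); a[14:18, 12:16] = 255.0
b = np.roll(np.roll(a, 2, axis=0), 3, axis=1)
print(piv_displacement(a, b, 16, 14))  # -> (2, 3)
```

Dividing the estimated displacement by the frame interval converts it to the velocity attached to that lattice point.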

We hypothesized that PIV techniques could be used to quantify waggle motions by treating bees (which stand out due to their bright coloration) on a honeycomb as tracer particles. In preliminary tests, we confirmed that waggle motions could be distinguished in PIV velocity fields due to the large velocities associated with rapid abdominal movement. However, large velocities also occurred during other bee behaviors such as flight. To distinguish waggles from non-dance motions, we developed the following procedure. In videos taken at a relatively slow frame rate (about 30 frames/s), the interval between frames corresponds to approximately half a cycle of the waggle oscillation (about 10 to 16 Hz; Wario et al., 2017), so a dancing bee appears to move in one direction in one pair of adjacent frames (frames t and t + 1, i.e., the velocity field at time t) and in the reverse direction in the next pair (frames t + 1 and t + 2, i.e., the velocity field at time t + 1) (Figure 1 a–c). This characteristic of honeybee waggling can be used to extract candidate runs (Figure 1d).

Figure 1.

Schematic outline of the automatic waggle run decoding method. First, a PIV analysis (a–c) is used to calculate the displacement velocity between two adjacent frames in a video recording. Second, we extract waggle candidates (d), which are defined by reversals in displacement velocity across three consecutive frames. Once a waggle candidate is detected, the program continues searching in the neighborhood of the previous candidate points over the next 14 acceleration fields to find a new waggle candidate (e, f), then assigns a unique ID to all of the candidates identified in the sequence. A sequence of waggle candidates qualifies as a waggle run if it lasts more than 15 frames, or approximately 0.5 s.

In initial tests, we successfully detected the expected velocity fluctuations in video recordings of dancing bees, but the fluctuations did not always appear throughout a waggle run: there were periods during waggling where velocity fluctuations could not be detected, or where other bees occluded the dancing bee. We therefore added a procedure that continues searching within a 5 × 5 area of lattice points in the neighborhood of the waggle candidate for up to 14 subsequent time points (Figure 1e, f). If the next candidate is found, we assign the same unique ID number to all previous and new candidate points. Finally, we identify a sequence of candidate points as a waggle run if the waggle duration is more than 15 frames, or about 0.5 s. The waggle duration is defined as the number of frames in which velocity fluctuations were detected, including any intervening periods of no detection. The waggle direction of each candidate point is calculated from the velocity field by bisecting the angle created by two sequential velocity vectors (Figure 1d), and the direction of the overall waggle run is defined as the median direction of all candidate points assigned a given ID number.
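The reversal test and the angle-bisection rule above can be sketched as follows. The speed and angle thresholds are hypothetical placeholders for illustration, not the values used in our analysis:

```python
import math

def is_reversal(v1, v2, min_speed=1.0, max_cos=-0.5):
    """A lattice point is a waggle candidate when the displacement
    velocity reverses between two consecutive velocity fields: both
    vectors are fast enough and point in roughly opposite directions
    (thresholds here are illustrative placeholders)."""
    s1, s2 = math.hypot(*v1), math.hypot(*v2)
    if s1 < min_speed or s2 < min_speed:
        return False
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / (s1 * s2)
    return cos <= max_cos

def waggle_axis(v1, v2):
    """Waggle direction of a candidate: the bisector of the angle
    formed by two sequential velocity vectors. Because the second
    vector is reversed, bisecting v1 and -v2 yields the axis along
    which the abdomen oscillates (degrees, folded to 0-180)."""
    a1 = math.atan2(v1[1], v1[0])
    a2 = math.atan2(-v2[1], -v2[0])  # flip the reversed vector
    x = math.cos(a1) + math.cos(a2)
    y = math.sin(a1) + math.sin(a2)
    return math.degrees(math.atan2(y, x)) % 180.0

# A bee waggling along the 45-degree axis: the velocity flips sign
# between consecutive velocity fields.
v_t, v_t1 = (3.0, 3.0), (-3.0, -3.0)
print(is_reversal(v_t, v_t1))   # -> True
print(waggle_axis(v_t, v_t1))   # -> 45.0
```

Summing unit vectors before taking the angle makes the bisection robust to wraparound at 0°/360°.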

To implement these procedures, we developed a method consisting of two steps. The first step, PIV analysis, was carried out using commercially available software (Flownizer2D, DITECT Co. Ltd., Tokyo, Japan). In the second step, we detected waggle runs based on the PIV velocity fields calculated in Step 1. We used a custom Python script to analyze velocity changes using text data exported from the PIV software (Online Resource 1). The waggle run detection results, which included their duration and median direction, as well as the frame number at the start of the runs, were outputted to a comma-delimited text file.
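The run-grouping logic of Step 2 (gap tolerance of 14 frames, minimum duration of 15 frames, median direction) can be sketched as below. This is a simplified stand-in for our script: the real implementation reads text exported from the PIV software and searches a 5 × 5 neighborhood of lattice points, whereas here the candidates for one neighborhood are assumed to be already collected, and all names are our own:

```python
import statistics

FPS = 29.97        # frame rate of the video recordings
MAX_GAP = 14       # frames to keep searching after a lost candidate
MIN_FRAMES = 15    # about 0.5 s at ~30 frames/s

def group_runs(candidates):
    """Group framewise waggle candidates into runs. `candidates` is a
    list of (frame, direction_deg) tuples for one lattice neighborhood,
    sorted by frame. A run ends when no candidate appears for more than
    MAX_GAP frames; runs shorter than MIN_FRAMES are discarded. The
    duration counts intervening no-detection frames, as in the text."""
    runs, current = [], []
    for frame, direction in candidates:
        if current and frame - current[-1][0] > MAX_GAP:
            runs.append(current)
            current = []
        current.append((frame, direction))
    if current:
        runs.append(current)

    out = []
    for run in runs:
        start, end = run[0][0], run[-1][0]
        if end - start + 1 >= MIN_FRAMES:
            out.append({
                "start_frame": start,
                "duration_s": round((end - start + 1) / FPS, 3),
                "direction_deg": statistics.median(d for _, d in run),
            })
    return out

cands = [(f, 45.0) for f in range(100, 120)] + [(300, 50.0), (302, 52.0)]
runs = group_runs(cands)
print(runs)  # one 20-frame run; the 2-frame burst at frame 300 is rejected
```

In the real script, each dictionary becomes one row of the comma-delimited output file.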

Hive observation and video recording in open fields

To assess the performance of our method, we prepared three video recordings taken in separate experimental field apiaries in Hokkaido, Japan. The three apiaries were more than 2 km away from each other (for details on the experimental apiaries, see Kamo et al., 2018). Six-frame hives containing hybrid European honeybees derived from Italian bees (A. mellifera ligustica) were rented from a local beekeeper and used for the experiments.

We modified a standard Japanese 10-frame Langstroth hive for dance observation. On the right side of the hive as seen from the front, the wooden side plate was replaced by a transparent acrylic plate to allow observation of most of the end frame. By restricting the hive gate, we ensured that returning worker bees would pass through this frame to enter the hive. Consequently, most waggle dances occurred on the observable side of this frame.

Black plastic panels were used to shield the observation comb from direct sunlight and suppress reflections on the transparent acrylic sheet. We placed a commercially available camcorder (Handycam® HDR-PJ675, Sony Corp., Tokyo, Japan) on a tripod about 20 cm away from the observation comb, with the lens positioned to face the comb surface as precisely as possible to minimize image distortion (Figure 2). Recordings were made at 1920 × 1080 pixel resolution, which corresponds to about 5 pixels/mm on the comb surface, and at 29.97 frames/s. All recordings were performed on the same morning (beginning around 9:00 a.m., 25 August 2016), and lasted for approximately 30 min. This observation duration was chosen to conform to previous studies, such as Steffan-Dewenter and Kuhn (2003).

Figure 2.

Photograph of the observation hive and video recording setup in an experimental field apiary in Hokkaido, Japan.

Forage area estimation and validation of automatic decoding

Automatic waggle detection procedures were implemented on three video recordings. For the PIV analysis, the interrogation window was set to 64 × 64 pixels (equivalent to the body length of worker bees), the search window to 128 × 128 pixels, and the spacing of lattice points for calculating velocity fields to 16 × 16 pixels. These parameters were based on results from preliminary trials.

Prior to implementing automatic decoding, we manually interpreted the waggle dances in the three video recordings using HBBM software (DITECT Co. Ltd., Tokyo, Japan), which we have previously used to measure waggle duration and direction by manually drawing vector lines on video frames. For each waggle dance, we measured up to five consecutive waggle runs.

We converted the waggle directions obtained from both manual and automatic decoding into forage site bearings by incorporating solar azimuth data calculated from the geographic location of each hive and the date and time of the video recording. To convert waggle duration into distance and generate a forage area estimation map, we followed the method developed by Schürch et al. (2013), with modifications to the process of calculating probability density: maps for each waggle dance were generated using a kernel density estimation of 1000 simulation runs on a 250 × 250-m grid across a 20 × 20-km area. The density maps for all waggle dances (manual decoding) or individual waggle runs (automatic decoding) were then summed to create the final map. Following the method in Couvillon et al. (2014a), we weighted the probability density by a factor equal to the squared distance between the hive and the forage site to evaluate the relative importance of each foraging habitat.
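The two conversions can be sketched as follows. The bearing rule follows the standard waggle dance convention (on a vertical comb, "up" encodes the sun's azimuth); the linear distance calibration is a placeholder with an arbitrary slope, not the calibration fitted by Schürch et al. (2013), which also propagates intra- and inter-dance variability into the probability maps:

```python
def forage_bearing(waggle_dir_deg, solar_azimuth_deg):
    """Convert the waggle angle measured clockwise from vertical on the
    comb into a compass bearing: the forage bearing is the solar
    azimuth at the time of the dance plus the waggle angle."""
    return (solar_azimuth_deg + waggle_dir_deg) % 360.0

def forage_distance_m(duration_s, slope_m_per_s=1200.0, intercept_m=0.0):
    """Map waggle duration to distance with a linear calibration. The
    slope and intercept here are illustrative placeholders; a real
    analysis should use a published or locally fitted calibration
    together with its uncertainty."""
    return intercept_m + slope_m_per_s * duration_s

# A 2-s run pointing 20 degrees right of vertical, danced while the
# solar azimuth was 135 degrees:
print(forage_bearing(20.0, 135.0))  # -> 155.0
print(forage_distance_m(2.0))       # -> 2400.0
```

In the full method, each run's bearing and distance seed a cloud of simulated forage sites rather than a single point, which is what the kernel density estimation smooths into a map.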

These probability density maps were used to evaluate the performance of our automatic decoding method. To compare the maps generated by automatic and manual waggle decoding, we defined a “probable forage region” as the area where the probability density weighted by distance squared was higher than the median value across all cells in the map. We then determined the overlap between the probable forage regions calculated using manual and automatic decoding at each hive.
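The definition of the probable forage region and the overlap comparison can be sketched with NumPy; the grid and densities below are synthetic stand-ins for the maps in Figure 4:

```python
import numpy as np

def probable_region(density, distance_m):
    """Mark cells whose probability density, weighted by the squared
    distance from the hive, exceeds the median across all cells -- the
    definition of the 'probable forage region' used in the text."""
    weighted = density * distance_m ** 2
    return weighted > np.median(weighted)

def overlap_fraction(region_manual, region_auto):
    """Fraction of the manually derived region that is also covered by
    the automatically derived region."""
    return (region_manual & region_auto).sum() / region_manual.sum()

# Synthetic example: an 80 x 80 grid of 250-m cells (20 x 20 km) with
# the hive at the centre, and a slightly perturbed 'automatic' map.
rng = np.random.default_rng(0)
density = rng.random((80, 80))
yy, xx = np.indices((80, 80))
distance = 250.0 * np.hypot(yy - 40, xx - 40)
manual = probable_region(density, distance)
auto = probable_region(density + 0.05 * rng.random((80, 80)), distance)
print(f"overlap: {overlap_fraction(manual, auto):.2f}")
```

Because the threshold is the median of the weighted density, the region covers roughly half of the map cells by construction, which keeps the overlap statistic comparable across hives.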


Results

Our automatic waggle decoding method detected 486, 158, and 2873 waggle runs, respectively, from 30-min videos recorded at each of the three hives, while the manual interpretation procedure conducted on the same videos detected 177, 86, and 371 waggle runs, respectively, belonging to 41, 38, and 115 waggle dances. Both automatic and manual decoding methods detected similar patterns in the duration and orientation of waggle runs at each hive (Figure 3). At all three hives, the manual decoding results revealed preferable forage sites in the northwestward direction (around 315°; Figure 3a, c, e), and the modal direction was well captured by the automatic decoding results in each case (Figure 3b, d, f). At hive 1, a group of waggle runs of approximately 6-s duration and indicating a northeastward direction was captured by both manual and automatic methods, but the automatic method somewhat underestimated run duration and produced increased variability in the results (Figure 3a, b). A similar underestimation of run duration by the automatic method was also seen with the northwestward mode at Hive 3 (Figure 3e, f). At Hives 2 and 3, the automatic method detected waggle runs in the southeastward direction that were not detected by manual interpretation.

Figure 3.

Waggle run duration and orientation calculated by manual (a, c, e) and automatic (b, d, f) decoding of video recordings captured at three hives located in different experimental field apiaries.

The spatial distribution of estimated forage area was similar between the automatic and manual decoding results (Figure 4). To quantify the accuracy of automatic waggle decoding when estimating forage sites, we compared the probable forage regions obtained from both methods (Table I). The forage regions estimated by automatic decoding covered 78–87% of the regions identified by manual methods, while the regions identified only by automatic decoding (i.e., false positives) corresponded to 17–21% of the area of the manually estimated forage regions.

Figure 4.

Forage area probability maps estimated from manual (a, c, e) and automatic (b, d, f) decoding of video recordings captured at three hives located in different experimental field apiaries. The map size in each panel is 20 × 20 km. Black crosses mark the location of the hive. The area bounded by the white line indicates the probable forage region, which was defined as the area in which the probability density weighted by the squared distance from the hive was higher than the median value across all cells in each map.

Table I Comparisons of forage area estimates produced by manual and automatic decoding of video recordings captured at three different hives. The probable forage region was defined here as the area in which the probability density weighted by the squared distance from the hive was higher than the median value across all cells in each map (see white outlines in Figure 4 for illustration)


Discussion

The automatic waggle decoding method we proposed here consists of only two steps: (1) velocity calculation by PIV analysis and (2) waggle detection and decoding based on the calculated velocities. We used commercially available software for the PIV analysis because it allowed us to determine the appropriate parameter settings using a visual interface. However, open-source PIV analysis software is now widely available, including the program OpenPIV, which is compatible with Matlab, Python, and C++ (Taylor et al., 2010). Thus, the methodology we describe in this study, including our custom post-processing script, can be adopted free of charge by other investigators working on any computer platform.

Our results show that our method performs to an acceptable level of accuracy when estimating the forage area of hives in field apiaries. However, our method miscalculated forage direction and underestimated the distance indicated by some waggle runs as described in the “Results” section. In addition, our method detected a much higher number of waggle runs compared to manual interpretation. This may be attributable to some waggle runs not being decoded by the manual method (only up to five runs were decoded per waggle dance), and to the presence of false positives due to signal noise.

The errors in estimating forage direction were likely due to the code misidentifying the heading of some dancing bees, causing the calculated waggle run angle to be 180° off the true direction. To increase accuracy, future procedures could consider the heading of dancing bees, or omit outliers from the distribution of waggle run directions (Wario et al., 2017). Errors in the calculation of waggle duration may have stemmed from inaccuracies in detecting the exact beginning and end points of a waggle run, since bees exhibit altered waggle frequencies or oscillation irregularities at the beginning, middle, and end of a run. This issue could be addressed with other techniques, such as detecting run duration from the trajectory pattern of a dancing bee (e.g., Landgraf et al., 2011). Identifying the trajectories of individual dancing bees may also be needed to decode waggle dances consisting of multiple waggle runs, because the directions of different waggle runs are known to vary within a single dance (Landgraf et al., 2011).

One key method of identifying and addressing the source of these calculation errors would be to verify the accuracy of our automatic decoding method on a dance-by-dance or run-by-run basis. At the moment, however, such comparisons are hampered by the difficulty of manually identifying waggle runs that correspond to those detected by our automatic method. Further studies are thus required to enable dance-by-dance or run-by-run comparison.

Although there is room for further elaboration of our method, it is also worthwhile to consider the trade-offs between decoding accuracy and computational complexity. Previous studies have revealed high variability in the information encoded in the waggle runs of a single dance (Couvillon et al., 2012), as well as in the information encoded in the dances of different individual bees even in cases where they were trained at an artificial feeder (Schürch and Couvillon, 2013). Waggle dances are not the sole communication tool used by bees but are complemented by odor and wing oscillation patterns to inform nestmates of preferable forage sites (Grüter and Farina, 2009). Therefore, there are likely to be inherent errors in encoding and decoding dance information. To cope with these errors, Schürch et al. (2013) developed a method that incorporates intra- and inter-dance variability when estimating forage sites. By incorporating this method into our automatic decoding procedure, we were able to conduct automatic forage area estimation with a reduced error rate.

The velocity fields obtained through our PIV analysis successfully detected waggle motions under natural light conditions. However, our experiments showed that the analysis was vulnerable to direct sunlight penetration and reflection on the transparent acrylic sheet, and that it did not perform well when the observation surface was a new honeycomb base. Good contrast is therefore essential for the analysis to discern bee abdomens. Although our system does not require artificial lighting, care should be taken to use a well-developed comb for observation and to darken the observation side of the hive.


Conclusion

Our results demonstrate that our automatic decoding method, which is not reliant on artificial lighting or high-performance camcorders, is comparable with manual interpretation for estimating forage area. Although further improvements to accuracy are needed to refine the waggle decoding process, our system has the potential to enhance our understanding of diurnal, seasonal, and spatial variations in forage site usage in field apiaries, thereby helping to streamline honeybee management.


References

  1. Adrian, R.J. (1991) Particle-imaging techniques for experimental fluid mechanics. Annu. Rev. Fluid Mech. 23, 261–304
  2. Aizen, M.A., Harder, L.D. (2009) The global stock of domesticated honey bees is growing slower than agricultural demand for pollination. Curr. Biol. 19, 915–918
  3. Aizen, M.A., Garibaldi, L.A., Cunningham, S.A., Klein, A.M. (2008) Long-term global trends in crop yield and production reveal no current pollination shortage but increasing pollinator dependency. Curr. Biol. 18, 1572–1575
  4. Balfour, N.J., Ratnieks, F.L.W. (2017) Using the waggle dance to determine the spatial ecology of honey bees during commercial crop pollination. Agric. For. Entomol. 19, 210–216
  5. Couvillon, M.J., Riddell Pearce, F.C., Harris-Jones, E.L., Kuepfer, A.M., Mackenzie-Smith, S.J., Rozario, L.A., Schürch, R., Ratnieks, F.L.W. (2012) Intra-dance variation among waggle runs and the design of efficient protocols for honey bee dance decoding. Biol. Open 1, 467–472
  6. Couvillon, M.J., Schürch, R., Ratnieks, F.L.W. (2014a) Dancing bees communicate a foraging preference for rural lands in high-level agri-environment schemes. Curr. Biol. 24, 1212–1215
  7. Couvillon, M.J., Schürch, R., Ratnieks, F.L.W. (2014b) Waggle dance distances as integrative indicators of seasonal foraging challenges. PLoS One 9, e93495
  8. Danner, N., Härtel, S., Steffan-Dewenter, I. (2014) Maize pollen foraging by honey bees in relation to crop area and landscape context. Basic Appl. Ecol. 15, 677–684
  9. Feldman, A., Balch, T. (2004) Representing honey bee behavior for recognition using human trainable models. Adapt. Behav. 12, 241–250
  10. Garbuzov, M., Couvillon, M.J., Schürch, R., Ratnieks, F.L.W. (2015) Honey bee dance decoding and pollen-load analysis show limited foraging on spring-flowering oilseed rape, a potential source of neonicotinoid contamination. Agric. Ecosyst. Environ. 203, 62–68
  11. Grüter, C., Farina, W.M. (2009) The honeybee waggle dance: can we follow the steps? Trends Ecol. Evol. 24, 242–247
  12. IPBES (2016) Summary for policymakers of the assessment report of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services on pollinators, pollination and food production. Secretariat of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, Bonn
  13. Kamo, T., Kusumoto, Y., Tokuoka, Y., Okubo, S., Hayakawa, H., Yoshiyama, M., Kimura, K., Konuma, A. (2018) A DNA barcoding method for identifying and quantifying the composition of pollen species collected by European honeybees, Apis mellifera (Hymenoptera: Apidae). Appl. Entomol. Zool. 53, 353–361
  14. Keane, R.D., Adrian, R.J. (1992) Theory of cross-correlation analysis of PIV images. Appl. Sci. Res. 49, 191–215
  15. Kimura, T., Ohashi, M., Okada, R., Ikeno, H. (2011) A new approach for the simultaneous tracking of multiple honeybees for analysis of hive behavior. Apidologie 42, 607–617
  16. Kimura, T., Ohashi, M., Crailsheim, K., Schmickl, T., Okada, R., Radspieler, G., Ikeno, H. (2014) Development of a new method to track multiple honey bees with complex behaviors on a flat laboratory arena. PLoS One 9, e84656
  17. Klein, A.-M., Vaissiere, B.E., Cane, J.H., Steffan-Dewenter, I., Cunningham, S.A., Kremen, C., Tscharntke, T. (2007) Importance of pollinators in changing landscapes for world crops. Proc. R. Soc. B Biol. Sci. 274, 303–313
  18. Landgraf, T., Rojas, R., Nguyen, H., Kriegel, F., Stettin, K. (2011) Analysis of the waggle dance motion of honeybees for the design of a biomimetic honeybee robot. PLoS One 6, e21354
  19. Park, B., Nieh, J.C. (2017) Seasonal trends in honey bee pollen foraging revealed through DNA barcoding of bee-collected pollen. Insectes Soc. 64, 425–437
  20. Requier, F., Odoux, J.-F., Tamic, T., Moreau, N., Henry, M., Decourtye, A., Bretagnolle, V. (2015) Honey bee diet in intensive farmland habitats reveals an unexpectedly high flower richness and a major role of weeds. Ecol. Appl. 25, 881–890
  21. Schürch, R., Couvillon, M.J. (2013) Too much noise on the dance floor. Commun. Integr. Biol. 6, e22298
  22. Schürch, R., Couvillon, M.J., Burns, D.D.R., Tasman, K., Waxman, D., Ratnieks, F.L.W. (2013) Incorporating variability in honey bee waggle dance decoding improves the mapping of communicated resource locations. J. Comp. Physiol. A 199, 1143–1152
  23. Steffan-Dewenter, I., Kuhn, A. (2003) Honeybee foraging in differentially structured landscapes. Proc. R. Soc. B Biol. Sci. 270, 569–575
  24. Takahashi, S., Hashimoto, K., Maeda, S., Tsuruta, N., Ai, H. (2017) Development of behavior monitoring system for honeybees in hive. Trans. Japanese Soc. Artif. Intell. 32, B-GC2_1-11. (In Japanese with English abstract)
  25. Taylor, Z.J., Gurka, R., Kopp, G.A., Liberzon, A. (2010) Long-duration time-resolved PIV to study unsteady aerodynamics. IEEE Trans. Instrum. Meas. 59, 3262–3269
  26. Velthuis, H.H.W., van Doorn, A. (2006) A century of advances in bumblebee domestication and the economic and environmental aspects of its commercialization for pollination. Apidologie 37, 421–451
  27. von Frisch, K. (1967) The dance language and orientation of bees, translated ed. Belknap Press of Harvard University Press, Cambridge
  28. Wario, F., Wild, B., Couvillon, M.J., Rojas, R., Landgraf, T. (2015) Automatic methods for long-term tracking and the detection and decoding of communication dances in honeybees. Front. Ecol. Evol. 3
  29. Wario, F., Wild, B., Rojas, R., Landgraf, T. (2017) Automatic detection and decoding of honey bee waggle dances. PLoS One 12, e0188626

Acknowledgements


This research was partially supported by grants from the NARO Bio-oriented Technology Research Advancement Institution (Special Scheme Project on Vitalizing Management Entities of Agriculture, Forestry and Fisheries). We thank Yasuyuki Hasada, Yosuke Hasada, and Nobuyuki Murakami for their generous cooperation in establishing and managing experimental apiaries in Hokkaido. We also thank Yasuhiro Ihara for allowing us to apply a PIV analysis to dancing bees and all of our project members for assisting with data collection.

Author information




SO, MY, and KK wrote the paper; SO, AN, and CT prepared and analyzed the data; MY, KK, and NM designed field experiments; SO and KK conceived the research and analytical concept. All the authors read and approved the final manuscript.

Corresponding author

Correspondence to Satoru Okubo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information



Manuscript editor: Peter Rosenkranz

Electronic supplementary material

ESM 1.

(TXT 4 kb)


About this article


Cite this article

Okubo, S., Nikkeshi, A., Tanaka, C.S. et al. Forage area estimation in European honeybees (Apis mellifera) by automatic waggle decoding of videos using a generic camcorder in field apiaries. Apidologie 50, 243–252 (2019).



Keywords

  • direct cross-correlation
  • forage area probability map
  • image processing
  • PIV
  • Python