Bionic Tracking: Using Eye Tracking to Track Biological Cells in Virtual Reality

Part of the Lecture Notes in Computer Science book series (LNIP, volume 12535)

Abstract

We present Bionic Tracking, a novel method for solving biological cell tracking problems with eye tracking in virtual reality using commodity hardware. Using gaze data, and especially smooth pursuit eye movements, we are able to track cells in time series of 3D volumetric datasets. The problem of tracking cells is ubiquitous in developmental biology, where large volumetric microscopy datasets are acquired on a daily basis, often comprising hundreds or thousands of time points that span hours or days. The image data, however, is only a means to an end, and scientists are often interested in the reconstruction of cell trajectories and cell lineage trees. Reliably tracking cells in crowded three-dimensional space over many time points remains an open problem, and many current approaches rely on tedious manual annotation or curation. In the Bionic Tracking approach, we replace the usual 2D point-and-click interface for annotation or curation with eye tracking in a virtual reality headset, where users follow cells with their eyes in 3D space in order to track them. We detail the interaction design of our approach and explain the graph-based algorithm used to connect different time points, which also takes occlusion and user distraction into account. We demonstrate Bionic Tracking using examples from two different biological datasets. Finally, we report on a user study with seven cell tracking experts, highlighting the benefits and limitations of Bionic Tracking compared to point-and-click interfaces.
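To make the graph-based linking step concrete, the sketch below shows one plausible, deliberately simplified realization in Kotlin. It assumes candidate cell positions near the gaze ray have already been extracted for each time point; these candidates form a layered graph whose edges span one or two time points, and the cheapest path through that graph is taken as the track. The two-step edges carry an extra penalty and are one simple way to bridge a time point where the cell was occluded or the user was briefly distracted, as the abstract describes. All names here (Vec3, Node, linkCandidates, skipPenalty) are hypothetical illustrations, not the authors' implementation.

```kotlin
import java.util.PriorityQueue
import kotlin.math.sqrt

// Minimal 3D point type; positions are assumed to be in world coordinates.
data class Vec3(val x: Float, val y: Float, val z: Float) {
    fun distanceTo(o: Vec3): Float {
        val dx = x - o.x; val dy = y - o.y; val dz = z - o.z
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}

// One candidate cell position at a given time point.
data class Node(val timepoint: Int, val position: Vec3)

/**
 * Links per-timepoint candidate positions (hypothetical input: positions
 * already detected near the gaze ray) into a single track by finding the
 * cheapest path through a layered graph. Edges run from timepoint t to t+1
 * and, with an extra penalty, to t+2, so the track can skip one time point
 * where the cell was occluded or the user glanced away.
 */
fun linkCandidates(candidates: List<List<Vec3>>, skipPenalty: Float = 10f): List<Node> {
    val layers = candidates.mapIndexed { t, ps -> ps.map { Node(t, it) } }
    val nodes = layers.flatten()
    if (nodes.isEmpty()) return emptyList()

    val index = nodes.withIndex().associate { (i, n) -> n to i }
    val dist = FloatArray(nodes.size) { Float.MAX_VALUE }
    val prev = IntArray(nodes.size) { -1 }

    // All candidates in the first non-empty layer are possible track starts.
    val first = layers.indexOfFirst { it.isNotEmpty() }
    val queue = PriorityQueue<Pair<Float, Int>>(compareBy { it.first })
    for (n in layers[first]) {
        val i = index[n]!!
        dist[i] = 0f
        queue.add(0f to i)
    }

    // Dijkstra over the layered DAG; stale queue entries are skipped.
    while (queue.isNotEmpty()) {
        val (d, u) = queue.poll()
        if (d > dist[u]) continue
        val n = nodes[u]
        for (step in 1..2) {                 // 1 = normal edge, 2 = skip one timepoint
            val t = n.timepoint + step
            if (t >= layers.size) continue
            val penalty = if (step == 2) skipPenalty else 0f
            for (m in layers[t]) {
                val v = index[m]!!
                val w = d + n.position.distanceTo(m.position) + penalty
                if (w < dist[v]) {
                    dist[v] = w
                    prev[v] = u
                    queue.add(w to v)
                }
            }
        }
    }

    // Backtrack from the cheapest node in the latest reachable layer.
    val reachable = nodes.indices.filter { dist[it] < Float.MAX_VALUE }
    val lastT = reachable.maxOf { nodes[it].timepoint }
    val end = reachable.filter { nodes[it].timepoint == lastT }.minByOrNull { dist[it] }!!
    val track = mutableListOf<Node>()
    var cur = end
    while (cur != -1) { track.add(nodes[cur]); cur = prev[cur] }
    return track.asReversed()
}
```

Because the graph is layered by time, a simple dynamic program over the layers would work equally well; Dijkstra is used here only because it reads naturally as a shortest path through the linking graph.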




Acknowledgements

The authors thank all participants of the user study. Thanks to Mette Handberg-Thorsager for providing the Platynereis dataset and for feedback on the manuscript. Thanks to Vladimir Ulman and Jean-Yves Tinevez for helpful discussions regarding track comparison. Thanks to Bevan Cheeseman, Aryaman Gupta, and Stefanie Schmidt for helpful discussions. Thanks to Pupil Labs for help with the eye tracking calibration.

This work was partially funded by the Center for Advanced Systems Understanding (CASUS), financed by Germany’s Federal Ministry of Education and Research (BMBF) and by the Saxon Ministry for Science, Culture and Tourism (SMWK) with tax funds on the basis of the budget approved by the Saxon State Parliament.

Author information

Corresponding author

Correspondence to Ulrik Günther.


Electronic supplementary material

Below are the links to the electronic supplementary material.

Supplementary material 1 (pdf 70 KB)

Supplementary material 2 (pdf 3204 KB)


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Günther, U., Harrington, K.I.S., Dachselt, R., Sbalzarini, I.F. (2020). Bionic Tracking: Using Eye Tracking to Track Biological Cells in Virtual Reality. In: Bartoli, A., Fusiello, A. (eds.) Computer Vision – ECCV 2020 Workshops. ECCV 2020. Lecture Notes in Computer Science, vol. 12535. Springer, Cham. https://doi.org/10.1007/978-3-030-66415-2_18


  • DOI: https://doi.org/10.1007/978-3-030-66415-2_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-66414-5

  • Online ISBN: 978-3-030-66415-2

  • eBook Packages: Computer Science, Computer Science (R0)