Amat, F., Höckendorf, B., Wan, Y., Lemon, W.C., McDole, K., Keller, P.J.: Efficient processing and analysis of large-scale light-sheet microscopy data. Nat. Protoc. 10(11) (2015). https://doi.org/10.1038/nprot.2015.111
Amat, F., et al.: Fast, accurate reconstruction of cell lineages from large-scale fluorescence microscopy data. Nat. Methods 11(9) (2014). https://doi.org/10.1038/nmeth.3036
Brooke, J.: SUS - a quick and dirty usability scale. In: Usability Evaluation in Industry, p. 7. CRC Press, June 1996
Bruder, V., Schulz, C., Bauer, R., Frey, S., Weiskopf, D., Ertl, T.: Voronoi-based foveated volume rendering. In: EUROVIS 2019, Porto, Portugal (2019)
Chenouard, N., et al.: Objective comparison of particle tracking methods. Nat. Methods 11(3), 281–289 (2014). https://doi.org/10.1038/nmeth.2808
Duchowski, A.T.: Eye Tracking Methodology: Theory and Practice, 3rd edn. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-57883-5
Günther, U., Harrington, K.I.S.: Tales from the trenches: developing sciview, a new 3D viewer for the ImageJ community. In: VisGap - The Gap between Visualization Research and Visualization Software at EuroGraphics/EuroVis 2020, p. 7 (2020). https://doi.org/10.2312/VISGAP.20201112
Günther, U., et al.: Scenery: flexible virtual reality visualization on the Java VM. In: 2019 IEEE Visualization Conference (VIS), Vancouver, BC, Canada, pp. 1–5. IEEE, October 2019. https://doi.org/10.1109/VISUAL.2019.8933605
Hart, S.G., Staveland, L.E.: Development of NASA-TLX (task load index): results of empirical and theoretical research. Adv. Psychol. 52 (1988). https://doi.org/10.1016/s0166-4115(08)62386-9
Huisken, J., et al.: Optical sectioning deep inside live embryos by selective plane illumination microscopy. Science 305(5686) (2004). https://doi.org/10.1126/science.1100035
Jacob, R.J.K.: Eye tracking in advanced interface design. In: Virtual Environments and Advanced Interface Design, pp. 258–290 (1995)
Kassner, M., Patera, W., Bulling, A.: Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. In: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, Washington, pp. 1151–1160. ACM Press (2014). https://doi.org/10.1145/2638728.2641695
Kennedy, R.S., Lane, N.E., Berbaum, K.S., Lilienthal, M.G.: Simulator sickness questionnaire: an enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 3(3) (1993). https://doi.org/10.1207/s15327108ijap0303_3
Khamis, M., Oechsner, C., Alt, F., Bulling, A.: VRpursuits: interaction in virtual reality using smooth pursuit eye movements. In: Proceedings of the 2018 International Conference on Advanced Visual Interfaces - AVI 2018, Castiglione della Pescaia, Grosseto, Italy, pp. 1–8. ACM Press (2018). https://doi.org/10.1145/3206505.3206522
Klamka, K., Siegel, A., Vogt, S., Göbel, F., Stellmach, S., Dachselt, R.: Look & pedal: hands-free navigation in zoomable information spaces through gaze-supported foot input. In: Proceedings of the 2015 ACM on International Conference on Multimodal Interaction - ICMI 2015, Seattle, Washington, USA, pp. 123–130. ACM Press (2015). https://doi.org/10.1145/2818346.2820751
Kosch, T., Hassib, M., Woźniak, P.W., Buschek, D., Alt, F.: Your eyes tell: leveraging smooth pursuit for assessing cognitive workload. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI 2018, Montreal QC, Canada, pp. 1–13. ACM Press (2018). https://doi.org/10.1145/3173574.3174010
Kroes, T., Post, F.H., Botha, C.P.: Exposure render: an interactive photo-realistic volume rendering framework. PLoS ONE 7(7) (2012). https://doi.org/10.1371/journal.pone.0038586
Laugwitz, B., Held, T., Schrepp, M.: Construction and evaluation of a user experience questionnaire. In: Holzinger, A. (ed.) USAB 2008. LNCS, vol. 5298, pp. 63–76. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-89350-9_6
Levoy, M., Whitaker, R.: Gaze-directed volume rendering. ACM SIGGRAPH Comput. Graph. 24(2) (1990). https://doi.org/10.1145/91385.91449
Lutz, O.H.-M., Venjakob, A.C., Ruff, S.: SMOOVS: towards calibration-free text entry by gaze using smooth pursuit movements. J. Eye Mov. Res. 8(1) (2015). https://doi.org/10.16910/jemr.8.1.2
Meena, Y.K., Cecotti, H., Wong-Lin, K., Prasad, G.: A multimodal interface to resolve the Midas-Touch problem in gaze controlled wheelchair. In: 2017 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (2017). https://doi.org/10.1109/embc.2017.8036971
Mirhosseini, S., Gutenko, I., Ojal, S., Marino, J., Kaufman, A.: Immersive virtual colonoscopy. IEEE Trans. Visual. Comput. Graph. 25(5) (2019). https://doi.org/10.1109/tvcg.2019.2898763
Moen, E., Bannon, D., Kudo, T., Graf, W., Covert, M., Van Valen, D.: Deep learning for cellular image analysis. Nat. Methods 16(12), 1233–1246 (2019). https://doi.org/10.1038/s41592-019-0403-1
Pietzsch, T., Saalfeld, S., Preibisch, S., Tomancak, P.: BigDataViewer: visualization and processing for large image data sets. Nat. Methods 12(6) (2015). https://doi.org/10.1038/nmeth.3392
Pitrone, P.G., et al.: OpenSPIM: an open-access light-sheet microscopy platform. Nat. Methods 10(7) (2013). https://doi.org/10.1038/nmeth.2507
Piumsomboon, T., Lee, G., Lindeman, R.W., Billinghurst, M.: Exploring natural eye-gaze-based interaction for immersive virtual reality. In: 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, pp. 36–39. IEEE (2017). https://doi.org/10.1109/3DUI.2017.7893315
Reynaud, E.G., Peychl, J., Huisken, J., Tomancak, P.: Guide to light-sheet microscopy for adventurous biologists. Nat. Methods 12(1) (2014). https://doi.org/10.1038/nmeth.3222
Schindelin, J., et al.: Fiji: an open-source platform for biological-image analysis. Nat. Methods 9(7) (2012). https://doi.org/10.1038/nmeth.2019
Singla, A., Fremerey, S., Robitza, W., Raake, A.: Measuring and comparing QoE and simulator sickness of omnidirectional videos in different head mounted displays. In: 2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX), pp. 1–6, May 2017. https://doi.org/10.1109/QoMEX.2017.7965658
Slater, M., Sanchez-Vives, M.V.: Enhancing our lives with immersive virtual reality. Front. Robot. AI 3 (2016). https://doi.org/10.3389/frobt.2016.00074
Stellmach, S., Dachselt, R.: Look & touch: gaze-supported target acquisition. In: Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems - CHI 2012, Austin, Texas, USA, p. 2981. ACM Press (2012). https://doi.org/10.1145/2207676.2208709
Sun, X., Yeoh, W., Koenig, S.: Dynamic fringe-saving A*. In: Proceedings of the 8th International Conference on Autonomous Agents and Multiagent Systems, vol. 2, pp. 891–898. International Foundation for Autonomous Agents and Multiagent Systems, Richland, SC (2009)
Tinevez, J.-Y., et al.: TrackMate: an open and extensible platform for single-particle tracking. Methods 115 (2017). https://doi.org/10.1016/j.ymeth.2016.09.016
Ulman, V., et al.: An objective comparison of cell-tracking algorithms. Nat. Methods 14(12), 1141–1152 (2017). https://doi.org/10.1038/nmeth.4473
Usher, W., et al.: A virtual reality visualization tool for neuron tracing. IEEE Trans. Visual. Comput. Graph. 24(1) (2017). https://doi.org/10.1109/tvcg.2017.2744079
Vidal, M., Bulling, A., Gellersen, H.: Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In: Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing - UbiComp 2013, Zurich, Switzerland, p. 439. ACM Press (2013). https://doi.org/10.1145/2493432.2493477
Winnubst, J., et al.: Reconstruction of 1,000 projection neurons reveals new cell types and organization of long-range connectivity in the mouse brain. Cell 179(1), 268–281.e13 (2019). https://doi.org/10.1016/j.cell.2019.07.042
Wolff, C., et al.: Multi-view light-sheet imaging and tracking with the MaMuT software reveals the cell lineage of a direct developing arthropod limb. eLife 7 (2018). https://doi.org/10.7554/elife.34410