Abstract
Animal survival requires adaptive behavior in volatile environments. Virtual reality (VR) technology is instrumental for studying the neural mechanisms underlying context-modulated behaviors, as it simulates the real world while maximizing control over contextual elements. Yet current VR tools for rodents offer limited flexibility and performance (e.g., frame rate) for context-dependent cognitive research. Here, we describe a high-performance VR platform for studying contextual behaviors in editable virtual contexts. The platform was assembled from modular hardware and custom-written software, affording flexibility and upgradability. Using this platform, we trained mice to perform context-dependent cognitive tasks with rules ranging from discrimination to delayed-match-to-sample while recording from thousands of hippocampal place cells. By precisely manipulating context elements, we found that context recognition remained intact with partial context elements but was impaired by exchanges of context elements. Collectively, our work establishes a configurable VR platform for investigating context-dependent cognition with large-scale neural recording.
Acknowledgements
We thank all members of the Xu lab for helpful discussions and comments. We thank Drs. Chengyu Li and Haohong Li for technical support and sharing resources. This work was supported by the National Science and Technology Innovation 2030 Major Program (2022ZD0205000), the National Key R&D Program of China, the Strategic Priority Research Program of the Chinese Academy of Sciences (XDB32010105, XDBS01010100), Shanghai Municipal Science and Technology Major Project (2018SHZDZX05), Lingang Lab (LG202104-01-08), the National Natural Science Foundation of China (31771180 and 91732106), and an International Collaborative Project of the Shanghai Science and Technology Committee (201978677).
Ethics declarations
Conflict of interest
The authors declare that there are no conflicts of interest.
Supplementary Information
Below is the link to the electronic supplementary material.
Supplementary file2 (MP4 15481 kb)
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Qu, XT., Wu, JN., Wen, Y. et al. A Virtual Reality Platform for Context-Dependent Cognitive Research in Rodents. Neurosci. Bull. 39, 717–730 (2023). https://doi.org/10.1007/s12264-022-00964-0