An automated behavior analysis system for freely moving rodents using depth image
A rodent behavior analysis system is presented that provides automated tracking, pose estimation, and recognition of nine behaviors in freely moving animals. The system tracks three key points on the rodent body (nose, center of body, and base of tail) to estimate its pose and head rotation angle in real time. A support vector machine (SVM)-based model, including label optimization steps, is trained to classify each frame as one of nine behaviors: resting, walking, bending, grooming, sniffing, supported rearing, unsupported rearing, micro-movements, or “other.” Unlike conventional red-green-blue (RGB) camera-based methods, the proposed system operates on 3D depth images provided by the Kinect infrared (IR) camera, giving stable performance regardless of lighting conditions and of the animal’s color contrast with the background; this is particularly beneficial for monitoring nocturnal animals. 3D features extracted directly from the depth stream are combined with contour-based 2D features to further improve recognition accuracy. The system is validated on three freely behaving rats for a total of 168 min. The behavior recognition model achieved a cross-validation accuracy of 86.8% on the rat used for training and accuracies of 82.1% and 83% on the other two “testing” rats. Automated head angle estimation, aided by behavior recognition, achieved a correlation of 0.76 with human expert annotation.
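The frame-by-frame pipeline described above (three tracked key points → pose/head-angle features → SVM behavior label) can be sketched as follows. This is a minimal illustration, not the authors’ implementation: the feature choices, the synthetic two-class data, and the helper names (`head_angle`, `frame_features`) are assumptions for demonstration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def head_angle(nose, body):
    """Head rotation angle (degrees) of the nose relative to the body
    center, measured in the image plane of the depth camera."""
    dx, dy = nose[0] - body[0], nose[1] - body[1]
    return float(np.degrees(np.arctan2(dy, dx)))

def frame_features(nose, body, tail):
    """Toy per-frame feature vector: body elongation (nose-to-tail
    distance), head angle, and nose height taken from the depth (z)
    coordinate. The real system uses richer 2D contour + 3D features."""
    elongation = float(np.linalg.norm(np.asarray(nose) - np.asarray(tail)))
    return [elongation, head_angle(nose, body), nose[2]]

# Synthetic (x, y, z) key points for 200 frames with two fake labels,
# standing in for annotated behavior classes.
rng = np.random.default_rng(0)
X = [frame_features(rng.uniform(0, 1, 3), rng.uniform(0, 1, 3),
                    rng.uniform(0, 1, 3)) for _ in range(200)]
y = rng.integers(0, 2, 200)

# Frame-wise SVM classifier; features are standardized first, which is
# common practice (the paper does not specify this preprocessing step).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:5]))
```

In the full system the predicted labels would additionally pass through the label optimization steps mentioned in the abstract (e.g., temporal smoothing of per-frame decisions) before being reported.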
Keywords: Kinect sensor · Depth image · Animal tracking · Pose estimation · Feature extraction · Behavior recognition · Support vector machine
This work was supported in part by the National Institutes of Health award 1R21EB018561 and the National Science Foundation under awards ECCS-1407880 and ECCS-1408318. The authors would like to thank Prof. Rainnie and members of his lab at the Emory University Department of Psychiatry for their assistance in conducting the in vivo experiments.
Compliance with ethical standards
The experiment, which was conducted as part of the evaluation of the EnerCage-HC2 system, was approved by the Institutional Animal Care and Use Committees (IACUC) at Emory University and Georgia Tech.
All procedures performed in studies involving animals were in accordance with the ethical standards of the institution or practice at which the studies were conducted.