Abstract
A recovery room is an essential hospital nursing unit that continues the care begun in the operating room. Its goal is to provide high-quality care for patients recovering from surgery and to counter the effects of anesthetic drugs, which can cause sudden, instantaneous movements of the hands and legs, standing up, or falling from the bed. Because recovery rooms face a shortage of nurses, there is a growing need for remote patient-monitoring systems that compensate for the lack of service personnel and help staff monitor patients more effectively. In this study, a patient's actions are recognized using a combination of geometric features and depth data. Actions that put patients in the recovery room at risk are then identified and reported to the nursing unit before an incident occurs, so that the necessary measures can be taken. For this purpose, RGB-D data is collected and analyzed. The proposed methodology comprises recording video with Kinect sensors (457 videos at 640 × 480 resolution), extracting features from video frames (a color-separation-based approach), training a Hidden Markov Model to classify the indicator vectors, and finally evaluating and validating the model. Experimental results indicate that the proposed method can accurately detect moments when a patient's movements in the hospital bed expose them to danger, achieving a recognition rate of 91.36%.
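The classification step described above scores each observed sequence of indicator vectors under per-action Hidden Markov Models. As a minimal sketch of that idea (with entirely hypothetical parameters and a toy binary "movement indicator" alphabet, not the paper's actual features or models), the forward algorithm can be used to pick the action model that best explains a sequence:

```python
# Minimal sketch: classify an observation sequence by comparing its
# forward-algorithm likelihood under two hypothetical discrete HMMs
# ("safe" vs. "risky" patient activity). All numbers are illustrative.

def forward_likelihood(obs, start, trans, emit):
    """P(obs | model) for a discrete HMM via the forward algorithm."""
    n = len(start)
    # Initialization: alpha_1(s) = pi(s) * b_s(o_1)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    # Induction: alpha_t(s) = [sum_sp alpha_{t-1}(sp) * a(sp, s)] * b_s(o_t)
    for o in obs[1:]:
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in range(n)) * emit[s][o]
                 for s in range(n)]
    # Termination: P(obs) = sum_s alpha_T(s)
    return sum(alpha)

# Two toy 2-state models over observations {0: still, 1: moving}.
MODELS = {
    "safe":  dict(start=[0.8, 0.2],
                  trans=[[0.9, 0.1], [0.2, 0.8]],
                  emit=[[0.9, 0.1], [0.7, 0.3]]),
    "risky": dict(start=[0.2, 0.8],
                  trans=[[0.5, 0.5], [0.1, 0.9]],
                  emit=[[0.3, 0.7], [0.1, 0.9]]),
}

def classify(obs):
    """Return the model name with the highest sequence likelihood."""
    scores = {name: forward_likelihood(obs, **m) for name, m in MODELS.items()}
    return max(scores, key=scores.get)

print(classify([0, 0, 1, 0, 0]))  # mostly-still sequence
print(classify([1, 1, 1, 1, 1]))  # sustained-movement sequence
```

In practice the paper trains one HMM per action class on the extracted indicator vectors and flags the sequence as dangerous when a risk-associated model wins; the sketch above only illustrates the likelihood-comparison mechanics.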
Notes
Post Anesthesia Care Unit (PACU)
Time of Flight (ToF)
Histogram of Oriented Gradients (HOG)
Edge Orientation Histograms (EOH)
Hidden Markov Model (HMM)
Cite this article
Mollaei, H., Sepehri, M.M. & Khatibi, T. Patient’s actions recognition in hospital’s recovery department based on RGB-D dataset. Multimed Tools Appl 82, 24127–24154 (2023). https://doi.org/10.1007/s11042-022-14200-4