Video and accelerometer-based motion analysis for automated surgical skills assessment
Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data).
We conduct a large study of basic surgical skill assessment on a dataset containing video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features: approximate entropy and cross-approximate entropy, which quantify the predictability and regularity of fluctuations in time-series data. We compare the proposed features to existing Sequential Motion Texture, Discrete Cosine Transform, and Discrete Fourier Transform methods for surgical skills assessment.
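The approximate entropy statistic underlying the proposed features can be sketched as follows. This is a minimal NumPy implementation of the standard ApEn(m, r) definition (Pincus); the window length `m`, tolerance factor `r_factor`, and any preprocessing used in the paper's actual pipeline are assumptions here, not the authors' reported settings:

```python
import numpy as np

def approx_entropy(series, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D time series.

    Lower values indicate more regular, predictable fluctuations;
    higher values indicate greater irregularity.
    """
    x = np.asarray(series, dtype=float)
    r = r_factor * x.std()  # tolerance; 0.1-0.25 of the std is customary

    def phi(m):
        n = len(x) - m + 1
        # All length-m template vectors of the series.
        templates = np.array([x[i:i + m] for i in range(n)])
        # For each template, the fraction of templates within
        # Chebyshev distance r (self-matches included).
        counts = np.array([
            np.mean(np.max(np.abs(templates - t), axis=1) <= r)
            for t in templates
        ])
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

# A regular (sinusoidal) signal scores lower than white noise.
rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))
noisy = rng.standard_normal(300)
print(approx_entropy(regular) < approx_entropy(noisy))  # True
```

Cross-approximate entropy follows the same template-matching idea but compares templates drawn from two different series (e.g., two accelerometer channels), measuring their mutual (a)synchrony rather than the self-regularity of one signal.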
We report the average performance of the different features across all applicable OSATS-like criteria for the suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods on video data, achieving average classification accuracies of 95.1% and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better on suturing, achieving 86.8% average accuracy. We also show that fusing video and acceleration features can further improve overall skill-assessment performance.
Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
Keywords: Surgical skills assessment · Computer vision · Machine learning · Multi-modal data
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed consent
Informed consent was obtained from all individual participants included in the study.