An Accuracy Certified Augmented Reality System for Therapy Guidance
Our purpose is to provide an augmented reality system for radio-frequency ablation guidance that superimposes a 3D model of the liver, its vessels, and tumors (reconstructed from CT images) onto external video images of the patient. In this paper, we point out that clinical usability requires not only the best achievable registration accuracy, but also a certification that the required accuracy is met, since clinical conditions change from one intervention to the next. Addressing accuracy first, we show that a 3D/2D registration based on radio-opaque fiducials is better suited to our application constraints than other methods. We then identify a gap in their statistical assumptions, which leads us to derive a new extended 3D/2D criterion. Careful validation experiments on real data show that an accuracy of 2 mm can be achieved in clinically relevant conditions, and that our new criterion is up to 9% more accurate, while keeping a computation time compatible with real-time use at 20 to 40 Hz.
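To make the registration step concrete, the following is a minimal sketch of fiducial-based 3D/2D pose estimation: the rigid pose is found by minimizing the 2D reprojection error of the radio-opaque fiducials under a pinhole camera model. This illustrates the generic least-squares criterion only, not the authors' extended criterion; the focal length `f` and principal point `c` are placeholder values, and the function names are ours.

```python
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, rvec, t, f=1000.0, c=(320.0, 240.0)):
    """Project 3D points through a pinhole camera; rotation given as a
    Rodrigues vector (axis * angle)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        R = np.eye(3)
    else:
        k = rvec / theta
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    cam = points_3d @ R.T + t          # transform into the camera frame
    return f * cam[:, :2] / cam[:, 2:3] + np.array(c)

def register_3d2d(fiducials_3d, fiducials_2d):
    """Estimate the 6-DOF pose that minimizes the 2D reprojection
    residuals of the fiducials (classical least-squares criterion)."""
    def residuals(p):
        return (project(fiducials_3d, p[:3], p[3:]) - fiducials_2d).ravel()
    x0 = np.zeros(6)
    x0[5] = 500.0                      # start the model in front of the camera
    return least_squares(residuals, x0).x
```

With noiseless synthetic fiducials the recovered pose matches the ground truth closely; in practice the 2D fiducial localizations are noisy, which is exactly what motivates the statistical analysis in the paper.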
Having satisfied our statistical hypotheses, we turn to safety issues. By propagating the data noise through both our criterion and the classical one, we obtain an explicit formulation of the registration error. Since real conditions do not always match the theory, it is critical to validate our prediction on real data. We therefore perform a rigorous incremental validation of each assumption, using successively: synthetic data, real video images of a precisely known object, and finally real CT and video images of a soft phantom. The results show that our error prediction is fully valid within our application range. Ultimately, we provide an accurate augmented reality guidance system that automatically detects potentially inaccurate guidance.
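The noise-propagation idea can be sketched with standard first-order covariance propagation, under the simplifying assumption of isotropic Gaussian noise of variance sigma^2 on the 2D fiducial measurements. Here `J` denotes the Jacobian of the reprojection residuals with respect to the six pose parameters at the optimum, and `J_target` the Jacobian of a 3D target point (e.g. a tumor) with respect to the pose; this is a generic sketch, not the paper's exact derivation.

```python
import numpy as np

def pose_covariance(J, sigma2):
    """First-order error propagation: Cov(pose) ~= sigma^2 * (J^T J)^{-1},
    where J is the Jacobian of the 2D residuals w.r.t. the pose parameters."""
    return sigma2 * np.linalg.inv(J.T @ J)

def predicted_target_error(J_target, cov_pose):
    """Propagate the pose covariance to a 3D target point; the square root
    of the trace gives the expected target registration error (TRE)."""
    cov_target = J_target @ cov_pose @ J_target.T
    return np.sqrt(np.trace(cov_target))
```

A guidance system can compare this predicted TRE against the clinical accuracy requirement and flag the registration as potentially inaccurate when the prediction exceeds the threshold, which is the "certification" role described above.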
Keywords: Augmented Reality · Video Image · Computed Tomography · Validation Index · Registration Error