Marker-Less AR in the Hybrid Room Using Equipment Detection for Camera Relocalization

  • Nicolas Loy Rodas
  • Fernando Barrera
  • Nicolas Padoy
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9349)

Abstract

Augmented reality (AR) allows clinicians to visualize, directly in their field of view, key information related to the performance of a surgery. To track the user's viewpoint, current systems often use markers or register a reconstructed mesh with an a priori model of the scene. This only allows for a limited set of viewpoints and positions near the patient. Indeed, markers can be intrusive and interfere with the procedure. Furthermore, changes in the positions of equipment or clinicians can invalidate a priori models. Instead, we propose a marker-free mobile AR system based on a KinectFusion-like approach for camera tracking and on equipment detection for camera relocalization. Our approach relies on the use of multiple RGBD cameras: one camera is rigidly attached to a hand-held screen where the AR visualization is displayed, while two others are rigidly fixed to the ceiling. The inclusion of two static cameras enables us to dynamically recompute the 3D model of the room, as required for relocalization when changes occur in the scene. Fast relocalization can be performed by looking at a piece of equipment that is not required to remain static. This is particularly advantageous during hybrid surgeries, where an obvious choice for such equipment is the intraoperative imaging device, which is large and visible in all views, but can also move. We propose to detect the equipment using a template-based approach and further make use of the static cameras to speed up the detection in the moving view by dynamically adapting the subset of tested templates according to the actual room layout. The approach is illustrated in a hybrid room through a radiation monitoring application where a virtual representation of the radiation cone beam, the main X-ray scattering direction, and the dose distribution deposited on the surface of the patient are displayed on the hand-held screen.
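The template-subset pruning mentioned in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the helper name, the angular threshold, and the representation of templates by their training viewpoints are all assumptions. The idea is that once the static ceiling cameras provide the equipment pose, only templates rendered from viewpoints close to the hand-held camera's current viewing direction need to be tested.

```python
import numpy as np

def select_templates(template_viewdirs, cam_dir, max_angle_deg=45.0):
    """Hypothetical helper: prune a template bank by viewpoint.

    template_viewdirs: (N, 3) unit vectors, the viewing directions each
        template was generated from, expressed in the room frame after
        applying the equipment pose estimated by the static cameras.
    cam_dir: (3,) unit vector, approximate direction from the hand-held
        camera toward the equipment.
    Returns the indices of templates worth testing in the moving view.
    """
    cos_thresh = np.cos(np.radians(max_angle_deg))
    # Cosine similarity between each template viewpoint and the camera ray.
    sims = template_viewdirs @ cam_dir
    return np.nonzero(sims >= cos_thresh)[0]

# Toy example: three templates (front, back, side); the camera faces the
# equipment head-on, so only the front-view template survives pruning.
bank = np.array([[0.0, 0.0, 1.0],
                 [0.0, 0.0, -1.0],
                 [1.0, 0.0, 0.0]])
kept = select_templates(bank, np.array([0.0, 0.0, 1.0]))
```

Testing a fixed angular neighborhood rather than the full bank is what makes the detection in the moving view cheaper when the room layout, and hence the equipment pose, is known from the static cameras.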

Keywords

Augmented reality · RGBD cameras · Equipment detection · Camera relocalization · Radiation safety monitoring



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Nicolas Loy Rodas¹
  • Fernando Barrera¹
  • Nicolas Padoy¹
  1. ICube, University of Strasbourg, CNRS, IHU, Strasbourg, France
