Multimodal Interface Towards Smartphones: The Use of Pico Projector, Passive RGB Imaging and Active Infrared Imaging

  • Thitirat Siriborvornratanakul
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8563)

Abstract

This paper proposes a study of smartphone-oriented mobile devices capable of simultaneous passive RGB imaging and active infrared imaging for both projection and image sensing. Using RGB and infrared wavelengths together enables foreground interactive projection and background vision-based analysis to be performed without unwanted crosstalk between the two spectra or visible interruption to audiences. Our proposal includes detachable and rotatable mobile configuration designs, a general computing paradigm, and a multimodal interface strategy, all presented in a smartphone-oriented manner. Experiments are conducted with a proof-of-concept setup to clarify the efficiency and limitations of our proposal. Although the internal optics and mechanisms require cooperation from the technology's owners to be fully realized, we believe that our proposal is useful and sustainable, enabling easy compatibility and maintenance with future mobile devices.
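To make the wavelength separation concrete, the following is a minimal sketch, not the paper's implementation: it assumes the infrared camera enumerates as an ordinary OpenCV video device and the pico projector is driven as a normal display window; the device index, frame size, and intensity threshold are illustrative assumptions. Background analysis runs only on the infrared image, so the visible projected content never interferes with detection.

```python
import cv2
import numpy as np

# Assumptions (not from the paper): the infrared camera appears as video
# device 1; the pico projector is an ordinary secondary display driven by
# an OpenCV window. Index, resolution, and threshold are illustrative.
IR_CAMERA_INDEX = 1
IR_THRESHOLD = 200  # intensity above which a pixel counts as active IR light

def main():
    ir_cam = cv2.VideoCapture(IR_CAMERA_INDEX)
    if not ir_cam.isOpened():
        raise RuntimeError("Cannot open infrared camera")

    # Visible-light content sent to the projector; here just a plain canvas.
    projected = np.full((480, 640, 3), 255, dtype=np.uint8)
    cv2.namedWindow("projection", cv2.WINDOW_NORMAL)

    while True:
        ok, ir_frame = ir_cam.read()
        if not ok:
            break
        # Some IR cameras deliver single-channel frames, others BGR-like ones.
        gray = ir_frame if ir_frame.ndim == 2 else cv2.cvtColor(
            ir_frame, cv2.COLOR_BGR2GRAY)

        # Vision-based analysis uses only the infrared image, so the projected
        # RGB content cannot feed back into the detection step (no crosstalk).
        _, mask = cv2.threshold(gray, IR_THRESHOLD, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        frame = projected.copy()
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            # Overlay visible feedback for each detected IR blob.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)

        cv2.imshow("projection", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break

    ir_cam.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```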

Keywords

Projector-camera · Infrared · Multimodal interface · Smartphone



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Thitirat Siriborvornratanakul
  1. Graduate School of Applied Statistics, National Institute of Development Administration (NIDA), Bangkok, Thailand
