
Robo-Stim: Modes of Human Robot Collaboration for Design Exploration

  • Conference paper
  • Published in: Impact: Design With All Senses (DMSB 2019)

Abstract

Augmented and virtual reality, in combination with robotics, offer unique design opportunities centered on novel human-machine collaboration. The coordination of these tools allows for distinct modes of collaborative design, sensory stimulation, and strategic deception. In this paper, we outline the components and calibration of a collaborative environment that coordinates industrial robotics, mobile augmented reality, and virtual reality. This technical setup allows for multiple modes of experimentation, encouraging reflection on the larger domain of human-robot collaboration. Building on this setup, we define four discrete modes of robotic collaboration: Stop-Gap Collaboration, Manual-Assist Collaboration, Creative Collaboration, and Environmental Collaboration. Stop-Gap Collaboration uses humans to bridge technical gaps in automated systems. Manual-Assist Collaboration uses digital tools to augment the human execution of technical tasks. Creative Collaboration prioritizes creative expression, while Environmental Collaboration considers humans as agents in occupied and continuously evolving robotic environments. As robotics gains prominence in design and manufacturing, it becomes increasingly important to examine the role of human beings in partially automated workflows, prioritizing creativity and environmental adaptivity in design applications.
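To make the kind of coordination described above concrete, the sketch below is a minimal, illustrative example (not drawn from the paper) of one piece of such an environment: a relay that polls a robot's tool-center-point pose and rebroadcasts it to AR and VR clients as JSON over UDP, so that every device can render the robot in a shared, calibrated world frame. The client addresses, the message format, and the read_tcp_pose stub are assumptions made for illustration only.

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumes a hypothetical setup in which the robot controller's
# tool-center-point (TCP) pose is polled and rebroadcast to AR/VR
# clients as JSON over UDP so that all devices share one world frame.
import json
import socket
import time

CLIENTS = [("192.168.1.20", 9000), ("192.168.1.21", 9000)]  # hypothetical AR/VR hosts
POLL_HZ = 30  # broadcast rate in messages per second


def read_tcp_pose():
    """Placeholder for a call into the robot controller's API.

    Returns position in metres and orientation as a quaternion,
    expressed in the shared (calibrated) world frame.
    """
    return {"pos": [0.40, 0.10, 0.55], "quat": [0.0, 0.0, 0.0, 1.0]}


def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        while True:
            # Timestamp each pose so clients can discard stale messages.
            msg = json.dumps({"t": time.time(), "tcp": read_tcp_pose()}).encode()
            for addr in CLIENTS:
                sock.sendto(msg, addr)  # each client applies its own calibration offset
            time.sleep(1.0 / POLL_HZ)
    finally:
        sock.close()


if __name__ == "__main__":
    main()
```

A datagram transport is used here only because a dropped or stale pose is simply superseded by the next one; any comparable real-time networking layer could serve the same role.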



Acknowledgements

The collaborative multi-robot environment was developed as part of a workshop for the Rob|Arch 2018 conference, hosted by NCCR Digital Fabrication and ETH Zurich. We would like to thank those organizations for providing the necessary infrastructure used in the workshop, including the three UR robots. Thanks also to the workshop participants for engaging with and building upon the experimental setup: Miguel Aflalo, Susana Alarcon Licona, Albert Maksoudian, Max Maxwell, Curime Batliner, Rushi Dai, Dana Luo, Loren Adams, Santiago Perez, Mitchell Page, Neda Rafizade Askari, and Margot Warre.

Author information

Corresponding author

Correspondence to Ryan Luke Johns.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Johns, R.L., Anderson, J., Kilian, A. (2020). Robo-Stim: Modes of Human Robot Collaboration for Design Exploration. In: Gengnagel, C., Baverel, O., Burry, J., Ramsgaard Thomsen, M., Weinzierl, S. (eds) Impact: Design With All Senses. DMSB 2019. Springer, Cham. https://doi.org/10.1007/978-3-030-29829-6_52


  • DOI: https://doi.org/10.1007/978-3-030-29829-6_52

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29828-9

  • Online ISBN: 978-3-030-29829-6

  • eBook Packages: Engineering (R0)
