RealityBrush: an AR authoring system that captures and utilizes kinetic properties of everyday objects


This study introduces RealityBrush, a novel augmented reality (AR) authoring system that allows designers to quickly and easily create realistic virtual objects by capturing and utilizing the kinetic properties of everyday physical objects in the early stages of design. The RealityBrush system consists of a handheld device, a data analysis module, and an AR feedback module. The rod-shaped handheld device is equipped with a depth camera and a force sensor at its tip. When a user holds the device and pokes a physical object, the local force applied to the object and the resulting deformations of the object are measured simultaneously. By analyzing the relationship between the measured force and deformations, the RealityBrush system can identify two kinetic properties of the poked object: stiffness and motion resistance. The user can then use the handheld device as a 3D brush to create a virtual object in the air and assign the measured kinetic properties to it. Finally, the system’s physics engine allows the user to interact with the created object by poking or pushing it with the device. A technical evaluation showed that the system can successfully extract the stiffness and motion resistance of everyday objects. We also report initial user feedback on AR authoring with the RealityBrush system.
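As a rough illustration of the kind of analysis the abstract describes, stiffness can be modeled as the slope of a linear force-deformation relationship fitted to the paired sensor readings. The sketch below is a minimal, hypothetical version of that idea; the function name, the through-origin linear model, and the synthetic numbers are assumptions for illustration, not the paper's actual estimation pipeline.

```python
# Hypothetical sketch: estimating stiffness k from paired force/deformation
# samples, assuming the simple elastic model F = k * x. This is an
# illustration of the force-deformation analysis, not the authors' method.

def least_squares_slope(xs, ys):
    """Slope of the best-fit line through the origin: sum(x*y) / sum(x*x)."""
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    return sxy / sxx

# Synthetic poke of a compliant object with true stiffness 250 N/m.
deformations = [0.002, 0.004, 0.006, 0.008]   # metres (e.g. from a depth camera)
forces = [0.5, 1.0, 1.5, 2.0]                 # newtons (e.g. from a force sensor)

stiffness = least_squares_slope(deformations, forces)
print(f"estimated stiffness: {stiffness:.1f} N/m")  # → estimated stiffness: 250.0 N/m
```

Once estimated, such a coefficient could be handed to a physics engine as a spring constant for the authored virtual object, which is consistent with the poke-and-respond interaction the abstract describes.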






This research was funded by the National Research Foundation of Korea (2020R1A2C4002146) and the Korea Creative Content Agency (R2019020010), and partly by the Ministry of Trade, Industry and Energy (10077849).

Author information



Corresponding author

Correspondence to Byungjoo Lee.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Kim, H., Hong, S., Kim, J. et al. RealityBrush: an AR authoring system that captures and utilizes kinetic properties of everyday objects. Multimed Tools Appl (2020).


Keywords

  • Augmented reality
  • Kinetic property
  • AR authoring