Software Design for Virtual Reality Applications

Engineering Haptic Devices

Part of the book series: Springer Series on Touch and Haptic Systems (SSTHS)

Abstract

This chapter addresses the main steps in the development of software for virtual reality applications. After a definition of virtual reality, the general design and architecture of VR systems are presented. Several algorithms widely used in haptic applications, such as the definition of virtual walls, penalty- and constraint-based methods, collision detection, and the Voxmap-PointShell algorithm for 6 DoF interaction, are described. The chapter further includes a brief overview of existing software packages that can be used to develop virtual reality applications with haptic interaction. The concepts of event-based haptics and pseudo-haptics are introduced as perception-based approaches that have to be considered in the software design.

Notes

  1.

    There are also the concepts of “augmented reality” (AR), “mixed reality” (MR), and even “augmented virtuality”, which denote different ways of mixing and embedding real and virtual objects in a real or virtual environment. This differentiation is not needed in our context, though.

  2.

    The term rendering generally denotes the presentation process performed by a computer system. Without further qualification, it usually refers to the image generation process for the graphical representation; analogously, the production of structured, information-carrying stimuli for other modalities is called acoustic, olfactory, or haptic rendering.

  3.

    The term protocol refers to a scheme which describes the semantics of an otherwise abstract data stream and determines the order of its elements.

  4.

    The SpaceMouse consists of a knob roughly the size of the palm of a hand. It contains sensors that measure the forces and torques the user exerts on it, and the internal software converts these into relative movements (see the first sketch following these notes).

  5.

    Very often, one distinguishes between the system developer, who creates the software infrastructure (in our case, the VR system libraries and executables), and the application developer, who uses this infrastructure in order to model a virtual environment with object data and behavioral descriptions.

  6.

    On UNIX-like systems with a graphical user interface based on the X Window System, this concept is well established: any computer running an X server can display graphical output for all computers in the network that use the X protocol.

  7.

    There are some efforts to develop software libraries that define a standard server interface for various input devices, e.g., the free open-source VRPN (Virtual Reality Peripheral Network) system (see the client sketch following these notes).

  8.

    Apple’s FireWire, Sony’s i.Link.

  9.

    Basically, this process is the same for every interactive software system including all widely used programs that have graphical user interfaces—from word processors up to computer games.

  10.

    Cluster in this context denotes a network of computers which collectively perform a task.

  11.

    This basic graphics library is usually DirectX on Windows systems and the platform-independent OpenGL (Open Graphics Library) on other platforms.

  12.

    The term “polygon pump” is also used in this context.

  13.

    For realistic surround sound simulation, geometry information has to be included additionally.

  14.

    For example, the platform-independent OpenAL (Open Audio Library) or the Microsoft-specific DirectSound3D.

  15.

    Strictly speaking, even in this case there is a feedback loop, which is closed via the interaction of the user with the virtual world. The instabilities that occur in practice, however, usually do not result from specific properties of the graphical or acoustic rendering but from unstable physics simulations. Resonances thus appear in the presentation, but they are not caused by the representation itself: an object may jitter on the monitor, but the luminous flux of the monitor neither becomes chaotic nor overdrives and endangers the user.

  16.

    Although the limit frequencies for acoustic rendering are above those of haptic rendering, it can remain in the main application loop when the sample-based approach is used. Only a physically based sound simulation would have to be decoupled, but this is not discussed further here.

  17.

    The velocity of the haptic probe can of course be calculated in the main thread from the position and the duration of the last cycle, but it is more accurate to compute it in the haptic thread as well (see the velocity-estimation sketch following these notes).

  18.

    E.g., nonuniform rational B-spline (NURBS) tensor surfaces.

  19.

    In the implementation of time-critical systems such as haptic control loops, care should be taken not to repeatedly calculate identical or unnecessary values. Modern compilers are able to optimize such calculations out of the machine code; some manual “optimizations” may even prevent the compiler from generating optimized code. It is therefore necessary to optimize the algorithm first by means of a complexity analysis. Only after the implementation, and only after careful consideration, should detail optimizations be applied to the code. In any case, tests should prove that the optimized code actually performs better than the nonoptimized version. Only very experienced programmers are able to compete with good compilers in this discipline.

  20.

    The exceptional case of the HIP lying exactly on the virtual surface at the moment of measurement is very unlikely due to the time discretization. Nevertheless, it formally belongs to the contact situation (see the virtual-wall sketch following these notes).

  21.

    The haptic probe is called “god-object” by the authors; another common term is virtual proxy.

  22.

    The supporting plane is the infinitely extended plane in which the points of a flat polygon lie. If the vertices of a polygon do not lie in one plane, it has to be subdivided into flat polygons beforehand, usually into triangles; this step belongs in a preprocessing stage and is not discussed in further detail in this book (see the supporting-plane sketch following these notes).

  23.

    For larger systems of equations, however, this approach leads to inefficient code.

  24.

    The calculation of this distance field can be done via a Euclidean distance transformation (EDT); a deliberately naive brute-force sketch follows these notes.

  25.

    A quite common correlation: the more memory that can be used, the better the runtime performance.

  26.

    In these cases, hierarchical bounding-volume methods (Sect. 12.3.5) provide some room for optimization; they are, however, not discussed further in this book.

  27.

    For a very helpful introduction to the implementation of rigid-body dynamics simulation based on Newton-Euler equations, [1] is recommended.

  28.

    A tree is a cycle-free connected graph consisting of nodes and connecting edges. Directed trees have one root; leaves are nodes that have no “child nodes” and no outgoing branches. Usually, hierarchic collision detection algorithms use binary trees, i.e., trees in which all nodes except the leaves have two child nodes. Octrees, trees whose nodes have eight child nodes, are another frequent variation (see the bounding-volume tree sketch following these notes).

  29.

    For the choice of the next feature pair, the Voronoi regions of the objects are analyzed.

  30.

    An object modeled from triangles that represents a solid body is such a closed polyhedron. For nonclosed polyhedra with a “hole” in their surface, an inner region cannot be defined so easily.
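
The following sketches illustrate some of the notes above; they are minimal sketches under stated assumptions, not implementations from the chapter.

Regarding Note 4: a minimal rate-control sketch showing how measured forces and torques can be converted into relative movements, as a SpaceMouse-like device does. The Wrench and PoseDelta types and the gain values are illustrative assumptions, not part of any vendor API.

```cpp
#include <array>

// Hypothetical raw sensor reading: forces [N] and torques [Nm] on the cap.
struct Wrench {
    std::array<double, 3> force;
    std::array<double, 3> torque;
};

// Hypothetical pose increment: translation [m] and rotation [rad] per cycle.
struct PoseDelta {
    std::array<double, 3> translation;
    std::array<double, 3> rotation;
};

// Rate control: the displacement per cycle is proportional to the applied
// force/torque and the cycle duration dt. The gains are assumed tuning values.
PoseDelta rateControl(const Wrench& w, double dt,
                      double transGain = 0.01, double rotGain = 0.05) {
    PoseDelta d{};
    for (int i = 0; i < 3; ++i) {
        d.translation[i] = transGain * w.force[i] * dt;
        d.rotation[i]    = rotGain   * w.torque[i] * dt;
    }
    return d;
}
```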
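
Regarding Note 7: a minimal VRPN client sketch that receives tracker reports. The class and callback names follow the VRPN tracker interface; "Tracker0@localhost" is a placeholder device@server address, and details may differ between VRPN versions.

```cpp
#include <cstdio>

#include <vrpn_Tracker.h>

// Callback invoked by VRPN whenever a new pose report arrives.
void VRPN_CALLBACK handleTracker(void* /*userData*/, const vrpn_TRACKERCB t) {
    std::printf("sensor %ld: pos = (%f, %f, %f)\n",
                static_cast<long>(t.sensor), t.pos[0], t.pos[1], t.pos[2]);
}

int main() {
    // Placeholder device@server address.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(nullptr, handleTracker);

    // In a real VR system this polling would be driven by the input loop,
    // typically with a short sleep between iterations.
    while (true) {
        tracker.mainloop();
    }
    return 0;
}
```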
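
Regarding Note 17: a sketch of probe-velocity estimation by finite differences inside the haptic thread, smoothed with a simple exponential low-pass filter. The class name and the smoothing constant are assumptions for illustration.

```cpp
#include <array>
#include <chrono>

using Vec3 = std::array<double, 3>;

// Estimates the probe velocity from consecutive position samples of the
// haptic thread (running at, e.g., 1 kHz) and smooths it with a first-order
// low-pass filter to suppress quantization noise.
class VelocityEstimator {
public:
    explicit VelocityEstimator(double smoothing = 0.9)
        : smoothing_(smoothing) {}

    Vec3 update(const Vec3& position,
                std::chrono::steady_clock::time_point now) {
        if (hasLast_) {
            const double dt =
                std::chrono::duration<double>(now - lastTime_).count();
            if (dt > 0.0) {
                for (int i = 0; i < 3; ++i) {
                    const double raw = (position[i] - lastPos_[i]) / dt;
                    // Exponential smoothing: keep most of the old estimate.
                    velocity_[i] =
                        smoothing_ * velocity_[i] + (1.0 - smoothing_) * raw;
                }
            }
        }
        lastPos_ = position;
        lastTime_ = now;
        hasLast_ = true;
        return velocity_;
    }

private:
    double smoothing_;
    Vec3 lastPos_{};
    Vec3 velocity_{};
    std::chrono::steady_clock::time_point lastTime_{};
    bool hasLast_ = false;
};
```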
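
Regarding Note 20 and the virtual walls mentioned in the abstract: a minimal penalty-based virtual-wall sketch. If the HIP lies on or below the wall plane, the force is proportional to the penetration depth (plus optional damping), otherwise it is zero. The stiffness and damping values are assumptions and must respect the stability limits of the haptic loop (cf. [2]).

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

static double dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Penalty-based virtual wall: the wall is the half-space below a plane given
// by a point on the plane and its outward unit normal. The force pushes the
// haptic interaction point (HIP) back along the normal, proportionally to the
// penetration depth, with optional damping against the penetration velocity.
Vec3 virtualWallForce(const Vec3& hip, const Vec3& hipVelocity,
                      const Vec3& pointOnWall, const Vec3& unitNormal,
                      double stiffness /* N/m */, double damping /* Ns/m */) {
    Vec3 force{0.0, 0.0, 0.0};

    // Signed distance of the HIP to the wall plane (negative = penetrating).
    const Vec3 rel{hip[0] - pointOnWall[0], hip[1] - pointOnWall[1],
                   hip[2] - pointOnWall[2]};
    const double distance = dot(rel, unitNormal);

    if (distance <= 0.0) {  // contact situation, including distance == 0
        const double penetration = -distance;
        const double normalVelocity = dot(hipVelocity, unitNormal);
        double magnitude = stiffness * penetration - damping * normalVelocity;
        if (magnitude < 0.0) magnitude = 0.0;  // the wall can only push
        for (int i = 0; i < 3; ++i) force[i] = magnitude * unitNormal[i];
    }
    return force;
}
```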
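
Regarding Note 22: a sketch that computes the supporting plane of a triangle (unit normal and offset) and the signed distance of a point to that plane. Counterclockwise vertex order seen from outside is assumed; degenerate triangles are not handled.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

struct Plane {
    Vec3 normal;    // unit normal n
    double offset;  // d in n . x = d
};

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

static double dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Supporting plane of the triangle (a, b, c): the infinitely extended plane
// all three vertices lie on.
Plane supportingPlane(const Vec3& a, const Vec3& b, const Vec3& c) {
    const Vec3 ab{b[0] - a[0], b[1] - a[1], b[2] - a[2]};
    const Vec3 ac{c[0] - a[0], c[1] - a[1], c[2] - a[2]};
    Vec3 n = cross(ab, ac);
    const double len = std::sqrt(dot(n, n));
    for (auto& v : n) v /= len;
    return Plane{n, dot(n, a)};
}

// Signed distance of point p to the plane: positive on the normal side.
double signedDistance(const Plane& plane, const Vec3& p) {
    return dot(plane.normal, p) - plane.offset;
}
```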
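
Regarding Note 24: a deliberately naive sketch that fills a voxel grid with the distance to the nearest sampled surface point by brute force. A real implementation would use a Euclidean distance transformation (EDT), which computes the same field far more efficiently; the grid layout and parameters are assumptions.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <limits>
#include <vector>

using Vec3 = std::array<double, 3>;

// Naive O(voxels * surface points) distance field: for every voxel center,
// store the distance to the closest sampled surface point. This illustrates
// only the data structure; an EDT produces the same field much faster.
std::vector<double> bruteForceDistanceField(
        const std::vector<Vec3>& surfacePoints,
        const Vec3& gridOrigin, double voxelSize,
        int nx, int ny, int nz) {
    std::vector<double> field(static_cast<size_t>(nx) * ny * nz,
                              std::numeric_limits<double>::infinity());
    for (int k = 0; k < nz; ++k) {
        for (int j = 0; j < ny; ++j) {
            for (int i = 0; i < nx; ++i) {
                const Vec3 center{gridOrigin[0] + (i + 0.5) * voxelSize,
                                  gridOrigin[1] + (j + 0.5) * voxelSize,
                                  gridOrigin[2] + (k + 0.5) * voxelSize};
                double best = std::numeric_limits<double>::infinity();
                for (const Vec3& p : surfacePoints) {
                    const double dx = center[0] - p[0];
                    const double dy = center[1] - p[1];
                    const double dz = center[2] - p[2];
                    best = std::min(best, dx * dx + dy * dy + dz * dz);
                }
                field[(static_cast<size_t>(k) * ny + j) * nx + i] =
                    std::sqrt(best);
            }
        }
    }
    return field;
}
```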
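
Regarding Notes 26 and 28: a sketch of a binary bounding-volume tree over axis-aligned bounding boxes (AABBs) with a recursive overlap test, as used in hierarchical collision detection. The node layout and the leaf payload are illustrative assumptions.

```cpp
#include <array>
#include <memory>
#include <utility>
#include <vector>

// Axis-aligned bounding box.
struct AABB {
    std::array<double, 3> min;
    std::array<double, 3> max;

    bool overlaps(const AABB& other) const {
        for (int i = 0; i < 3; ++i) {
            if (max[i] < other.min[i] || other.max[i] < min[i]) return false;
        }
        return true;
    }
};

// Node of a binary bounding-volume tree: inner nodes have exactly two
// children, leaves carry the indices of the enclosed triangles.
struct BVHNode {
    AABB box;
    std::unique_ptr<BVHNode> left;
    std::unique_ptr<BVHNode> right;
    std::vector<int> triangleIndices;  // filled for leaves only

    bool isLeaf() const { return !left && !right; }
};

// Recursively collects pairs of leaves whose boxes overlap; narrow-phase
// triangle-triangle tests would then run only on these candidate pairs.
void collectOverlaps(const BVHNode& a, const BVHNode& b,
                     std::vector<std::pair<const BVHNode*, const BVHNode*>>& out) {
    if (!a.box.overlaps(b.box)) return;
    if (a.isLeaf() && b.isLeaf()) {
        out.emplace_back(&a, &b);
    } else if (a.isLeaf()) {
        collectOverlaps(a, *b.left, out);
        collectOverlaps(a, *b.right, out);
    } else if (b.isLeaf()) {
        collectOverlaps(*a.left, b, out);
        collectOverlaps(*a.right, b, out);
    } else {
        collectOverlaps(*a.left, *b.left, out);
        collectOverlaps(*a.left, *b.right, out);
        collectOverlaps(*a.right, *b.left, out);
        collectOverlaps(*a.right, *b.right, out);
    }
}
```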

References

  1. Baraff D, Witkin A, Kass M (1999) Physically based modelling, course notes 36. In: ACM SIGGRAPH ’99, Aug 1999. https://graphics.stanford.edu/courses/cs448b-00-winter/papers/phys_model.pdf

  2. Colgate JE et al (1993) Implementation of stiff virtual walls in force-reflecting interfaces. In: Proceedings of the IEEE virtual reality annual international symposium (VRAIS), Seattle, Sept 1993, pp 202–208. doi:10.1109/VRAIS.1993.380777

  3. Web3D Consortium (2014) X3D, H-Anim, and VRML97 specifications. Website. http://www.web3d.org/x3d/specifications/

  4. Ericson C (2004) Real-time collision detection. Morgan Kaufmann, Burlington. ISBN: 1558607323

  5. Goldstein EB (2006) Sensation and perception, 7th edn. Wadsworth Publishing Co., Belmont

  6. Hopp TH, Reeve CP (1996) An algorithm for computing the minimum covering sphere in any dimension. National Institute of Standards and Technology, Jan 1996. http://www.mel.nist.gov/msidlibrary/doc/hopp95.pdf

  7. Kadlecek P (2010) A practical survey of haptic APIs. Bachelor thesis, Charles University, Prague. http://www.ms.mff.cuni.cz/~kadlp7am/kadlecek_petr_bachelor_thesis.pdf

  8. Langetepe E, Zachmann G (2005) Geometric data structures for computer graphics. AK Peters, Natick. ISBN: 9781568812359. http://akpeters.com/product.asp?ProdCode=2353

  9. Lécuyer A (2009) Simulating haptic feedback using vision: a survey of research and applications of pseudo-haptic feedback. Presence: Teleoperators Virtual Environ 18(1):39–53. doi:10.1162/pres.18.1.39

  10. Lécuyer A et al (2000) Pseudo-haptic feedback: can isometric input devices simulate force feedback? In: Proceedings of virtual reality, 2000. IEEE, pp 83–90. doi:10.1109/VR.2000.840369

  11. Ledo D et al (2012) The HapticTouch toolkit: enabling exploration of haptic interactions. In: Proceedings of the 6th international conference on tangible, embedded and embodied interaction, Kingston, Ontario, Canada, pp 115–122. doi:10.1145/2148131.2148157

  12. Lin MC, Canny JF (1991) Efficient algorithms for incremental distance computation. In: IEEE conference on robotics and automation, pp 1008–1014. doi:10.1109/ROBOT.1991.131723

  13. McNeely WA, Puterbaugh KD, Troy JJ (1999) Six degree-of-freedom haptic rendering using voxel sampling. In: SIGGRAPH, pp 401–408. doi:10.1145/1198555.1198605

  14. Pusch A, Lécuyer A (2011) Pseudo-haptics: from the theoretical foundations to practical system design guidelines. In: Proceedings of the 13th international conference on multimodal interfaces. ACM, pp 57–64. doi:10.1145/2070481.2070494

  15. Reissell L-M, Pai DK (2007) High resolution analysis of impact sounds and forces. In: Second joint EuroHaptics conference and symposium on haptic interfaces for virtual environment and teleoperator systems (World Haptics 2007), Tsukuba, pp 255–260. doi:10.1109/WHC.2007.70

  16. Ruspini DC, Kolarov K, Khatib O (1997) The haptic display of complex graphical environments. In: Proceedings of the 24th annual conference on computer graphics and interactive techniques, SIGGRAPH ’97. ACM Press/Addison-Wesley Publishing Co., New York, NY, USA, pp 345–352. ISBN: 0-89791-896-7. http://doi.acm.org/10.1145/258734.258878

  17. Sutherland IE (1965) The ultimate display. In: Proceedings of IFIP congress, pp 506–508. http://www.cs.utah.edu/classes/cs6360/Readings/UltimateDisplay.pdf

  18. Zachmann G (2000) Virtual reality in assembly simulation—collision detection. Dissertation, Department of Computer Science, Darmstadt University of Technology, Germany. http://citeseer.ist.psu.edu/zachmann00virtual.html

  19. Zilles CB, Salisbury JK (1995) A constraint-based god-object method for haptic display. In: Proceedings of the international conference on intelligent robots and systems, IROS ’95, vol 3. IEEE Computer Society, Washington, DC, USA, pp 146–151. ISBN: 0-8186-7108-4. doi:10.1109/IROS.1995.525876

Corresponding author

Correspondence to Alexander Rettig.

Copyright information

© 2014 Springer-Verlag London

About this chapter

Cite this chapter

Rettig, A. (2014). Software Design for Virtual Reality Applications. In: Hatzfeld, C., Kern, T. (eds) Engineering Haptic Devices. Springer Series on Touch and Haptic Systems. Springer, London. https://doi.org/10.1007/978-1-4471-6518-7_12

  • DOI: https://doi.org/10.1007/978-1-4471-6518-7_12

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-6517-0

  • Online ISBN: 978-1-4471-6518-7

  • eBook Packages: Computer Science (R0)
