
Driver in the Loop: Best Practices in Automotive Sensing and Feedback Mechanisms

Chapter
Part of the Human–Computer Interaction Series (HCIS)

Abstract

Given the rapid advancement of technologies in the automotive domain, driver–vehicle interaction has become increasingly complex. With the advent of (highly) automated driving, research on the vehicle cockpit is growing as the range of possible interactions in a moving vehicle expands. As opportunities increase, however, so do the challenges that automotive user experience designers and researchers face. This chapter focuses on instrumenting sensing and display techniques and technologies to improve the user experience while driving. In the driver–vehicle interaction loop, the vehicle can sense driver states; analyze, estimate, and model the resulting data; and then convey the results through appropriate channels for intervention purposes. To improve this interaction, a wide range of new, affordable sensing techniques (EEG, fNIRS, IR imaging) and feedback techniques (head-up displays, auditory feedback, tactile arrays, etc.) have been introduced, yet little research has investigated this area systematically. This chapter provides an overview of recent advances in input and output modalities for timely, appropriate driver–vehicle interaction. After outlining the relevant background, we present best-known practices for input and output modalities, based on the results exchanged at the workshop on practical experiences in measuring and modeling drivers and driver–vehicle interactions at AutomotiveUI 2015. This chapter can help answer research questions on how to instrument a driving simulator or real-world study to gather data, and where to place interaction outputs to enable appropriate driver interactions.
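
To make the driver–vehicle interaction loop concrete, the following minimal Python sketch illustrates the sense–estimate–display pipeline described above. All names, signals, and thresholds (DriverState, sense, estimate, display, the HRV and eye-closure heuristics) are hypothetical illustrations, not an API or method from the chapter.

```python
# Minimal sketch of the sense -> analyze/estimate -> display loop.
# Signal names and thresholds are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class DriverState:
    workload: float    # 0.0 (relaxed) .. 1.0 (overloaded)
    drowsiness: float  # 0.0 (alert)   .. 1.0 (asleep)


def sense() -> dict:
    """Sample raw driver signals (e.g., HRV, eye closure); stubbed here."""
    return {"hrv_ms": 42.0, "eyes_closed_ratio": 0.1}


def estimate(raw: dict) -> DriverState:
    """Map raw sensor data to an estimated driver state (toy heuristic)."""
    workload = max(0.0, min(1.0, 1.0 - raw["hrv_ms"] / 100.0))
    return DriverState(workload=workload, drowsiness=raw["eyes_closed_ratio"])


def display(state: DriverState) -> None:
    """Pick an output channel appropriate to the estimated state."""
    if state.drowsiness > 0.5:
        print("tactile: seat vibration alert")            # hard-to-miss channel
    elif state.workload > 0.7:
        print("visual: defer non-critical notifications")  # reduce demand
    else:
        print("auditory: routine navigation cue")


if __name__ == "__main__":
    display(estimate(sense()))
```

The point of the sketch is only the loop structure: each cycle senses driver state, estimates a model of it, and routes feedback to a channel matched to that state.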

Keywords

Secondary task · Haptic feedback · Tactile feedback · Driving simulator · Mental demand


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Faculty of Electrical Engineering and Computer Science & CARISSMA (Center of Automotive Research on Integrated Safety Systems and Measurement Area), University of Applied Sciences Ingolstadt, Ingolstadt, Germany
  2. Department of Cognitive and Learning Sciences & Department of Computer Science, Michigan Technological University, Houghton, USA
  3. Intel Labs, Intel Corporation, Hillsboro, USA
  4. Faculty of Electrical Engineering and Computer Science, University of Applied Sciences Ingolstadt, Ingolstadt, Germany
