Cybernics, pp 111–131

Augmented Human Technology

Chapter

Abstract

To create a future society in which assisted lifestyles are widely available, we need technology that supports, strengthens, and enhances limited human capabilities. Support for both physical and cognitive functions will be essential for future rehabilitation and physical exercise. This chapter describes a cognitive neuroscience approach to realizing augmented human technology that enhances, strengthens, and supports human cognitive capabilities. Wearable devices give the subject high mobility and broaden the range of environments in which bodily motion and physiological signals can be measured and recognized. In this context, augmented human technology is regarded as a wearable device technology that enhances human capabilities, particularly cognitively assisted action and perception.
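
To make the physiological signal processing mentioned above concrete, the following minimal sketch shows one common way to turn a raw surface EMG recording into a smoothed activity envelope that could drive a wearable biofeedback display. It is an illustration under stated assumptions, not the method used in this chapter: the 1 kHz sampling rate, the filter settings, and the function name emg_envelope are all hypothetical.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_envelope(raw, fs=1000.0):
        """Estimate a muscle-activity envelope from raw surface EMG samples."""
        # Band-pass 20-450 Hz: keep the EMG band, reject motion artefacts and drift.
        b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, raw)
        # Full-wave rectification.
        rectified = np.abs(filtered)
        # Low-pass at ~6 Hz to obtain a smooth envelope suitable for feedback.
        b, a = butter(2, 6 / (fs / 2), btype="low")
        return filtfilt(b, a, rectified)

    # Example: a synthetic 2-second recording with a simulated contraction burst.
    t = np.arange(0, 2.0, 1 / 1000.0)
    raw = 0.05 * np.random.randn(t.size)
    raw[500:1500] += 0.5 * np.random.randn(1000)
    print("peak envelope:", float(emg_envelope(raw).max()))

The resulting envelope value could then be mapped to, for example, sound or light intensity, which is the basic idea behind auditory or visual EMG biofeedback.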

Keywords

Wearable device · Bioelectrical signal processing · Biofeedback · Kinematic and physiological cues · Biomechanical analysis

Copyright information

© Springer Japan 2014

Authors and Affiliations

  1. Center for Cybernics Research / Faculty of Engineering, Information and Systems, University of Tsukuba, Tsukuba, Japan