Abstract
To realize a future society in which assisted lifestyles are widely available, we need technology that supports, strengthens, and enhances limited human capabilities. Support for both physical and cognitive functions is essential for future rehabilitation and physical exercise. This chapter describes a cognitive neuroscience approach to augmented human technology aimed at enhancing, strengthening, and supporting human cognitive capabilities. Wearable devices allow the subject high mobility and broaden the range of environments in which bodily motion and physiological signals can be measured and recognized. In this context, augmented human technology is regarded as a wearable device technology that enhances human capabilities, particularly cognitively assisted action and perception.
© 2014 Springer Japan
Suzuki, K. (2014). Augmented Human Technology. In: Sankai, Y., Suzuki, K., Hasegawa, Y. (eds) Cybernics. Springer, Tokyo. https://doi.org/10.1007/978-4-431-54159-2_7
Print ISBN: 978-4-431-54158-5
Online ISBN: 978-4-431-54159-2