Control of word processing environment using myoelectric signals

Abstract

This paper shows how myoelectric (EMG) signals can be used to generate control signals for human–machine interfaces. Our custom-built portable USB device captures multi-channel, high-speed surface EMG signals from muscles, and its software counterpart can control common PC interfaces, including ordinary text editors such as MS Word. At the time of the study the system used three parallel EMG channels to control user interfaces. The interaction was based on a series of 1-of-N selections which specify rows and columns in on-screen keyboards. A selection was performed by quantifying the activity of a selected muscle of the user. The system was further tested by a disabled person who provided input during a participatory design session. Our study has demonstrated that the system and the user interface can be used for effective text input and editing by people with disabilities as well.
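The selection mechanism described above can be illustrated with a minimal sketch: the interface cycles through the rows of an on-screen keyboard, and a burst of muscle activity (a quantified EMG level exceeding a threshold) commits the currently highlighted row; the same 1-of-N mechanism then picks a column. All names, the keyboard layout, and the threshold value below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of row-column scanning text entry driven by a
# quantified EMG activity level. KEYBOARD and THRESHOLD are invented
# for illustration only.

KEYBOARD = [
    list("abcdef"),
    list("ghijkl"),
    list("mnopqr"),
    list("stuvwx"),
    list("yz ,.?"),
]

THRESHOLD = 0.5  # illustrative activation level


def select_index(n_items, activations):
    """Cycle over n_items, one item per scanning step; return the
    index highlighted when the EMG activation first exceeds
    THRESHOLD (a 1-of-N selection), or None if it never does."""
    for step, level in enumerate(activations):
        if level > THRESHOLD:
            return step % n_items
    return None


def type_character(row_activations, col_activations):
    """Two consecutive 1-of-N selections: first a row, then a column."""
    row = select_index(len(KEYBOARD), row_activations)
    if row is None:
        return None
    col = select_index(len(KEYBOARD[row]), col_activations)
    if col is None:
        return None
    return KEYBOARD[row][col]
```

For example, an activation burst at the third scanning step selects row 2 ("mnopqr"), and a burst at the second step of the column scan then yields the character 'n'.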

Figs. 1–27

Notes

  1. Thalmic Labs’ Myo is an example of a commercially developed human interface device (HID) based on EMG detection, targeted at users without disabilities, allowing them to control a computer via gestures.

  2. For instance, at the time of the study the main purpose of the Thalmic Labs’ Myo alpha version was to detect a set of discrete gestures rather than to provide a continuous stream of measurements. The current version offers data output at 200 Hz.

  3. The system allows sampling rates of up to 20 kHz.
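As an illustration of how a quantified activity level might be derived from a raw surface EMG channel sampled at a given rate, the sketch below computes a moving-window RMS envelope, a common way to quantify muscle activity. The window length and default sampling rate are assumptions for illustration; only the upper bound of 20 kHz comes from the note above.

```python
import math


def rms_envelope(samples, fs=2000, window_ms=100):
    """Moving-window RMS of a raw EMG signal.

    fs (sampling rate in Hz, up to 20 kHz on the described hardware)
    and window_ms are illustrative values, not parameters reported
    in the paper.
    """
    win = max(1, int(fs * window_ms / 1000))  # window length in samples
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - win + 1):i + 1]
        out.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return out
```

A constant-amplitude signal yields an envelope equal to its absolute value, which makes the function easy to sanity-check.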


Author information

Corresponding author

Correspondence to Adam J. Sporka.

Additional information

This work has been supported by (1) grant LH12070 (TextAble) awarded by the Ministry of Education, Youth and Sports of the Czech Republic under the KONTAKT II (LH) funding programme, (2) grant SGS10/290/OHK3/3T/13 awarded by the CTU in Prague, (3) grant P304/12/G069 awarded by the Czech Science Foundation, and (4) project AV0Z50110509 of the Academy of Sciences of the Czech Republic.

Cite this article

Pošusta, A., Sporka, A.J., Poláček, O. et al. Control of word processing environment using myoelectric signals. J Multimodal User Interfaces 9, 299–311 (2015). https://doi.org/10.1007/s12193-015-0200-9


Keywords

  • Assistive technology
  • Text input
  • Myoelectric signals
  • User study