Software switches: novel hands-free interaction techniques for quadriplegics based on respiration–machine interaction
Most interaction techniques that allow individuals with disabilities to control a computer require additional dedicated hardware, such as switches, beyond standard computer peripherals. Furthermore, each traditional hardware switch emulates different keyboard characters or mouse clicks depending on its manufacturer. There is no commonly agreed standard: while some switch-accessible programs expect to receive the Enter character, others expect a left mouse click. To overcome this standardization problem, switch manufacturers distribute software drivers with their switches that let users assign the expected characters, making the switches adaptable. However, because each software driver is developed for a specific switch, it is not compatible with other manufacturers' switches. For users with very limited motor activity but full respiratory activity, sip-and-puff switches are efficient solutions; they are, however, expensive and invasive systems whose in-mouth tubes must be replaced regularly for hygiene reasons. Although invasive respiration-based systems (sip-and-puff devices) have received considerable attention from researchers over the years, non-invasive respiration-based interaction techniques remain an underexplored approach. In this study, we propose two novel non-invasive interaction techniques as software switches (PuffCam and PuffMic) that are compatible with any switch-accessible software. Both techniques are respiration operated: a hard puff, detected by a microphone or an adapted camera, is interpreted as 'switch-on', like a puff switch. We also collected objective data with an evaluation software called TestBed. A user study conducted with 46 participants, with and without disabilities, revealed that the accuracy, precision, recall, and false positive rate of our interaction techniques were highly satisfactory, and that PuffCam outperformed PuffMic on all metrics.
According to the questionnaire findings, participants rated the comfort of both interaction techniques as quite satisfactory. All participants agreed that the idea of controlling a computer via breathing, without purchasing any dedicated hardware, sounded very promising. Because most interaction techniques for computer control require extra dedicated devices and, to the best of our knowledge, no adaptable software switch (i.e., one compatible with most switch-accessible software) exists, the proposed interaction techniques can help the community in an open-access manner without the purchase of any device.
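As a rough illustration of the microphone-based approach, a PuffMic-style detector can be sketched as follows. This is a minimal sketch under assumed thresholds: it processes pre-recorded amplitude frames rather than live audio, emits events instead of synthetic keypresses, and none of the names or parameter values come from the paper itself.

```python
# Illustrative PuffMic-style "software switch" detector (assumptions only,
# not the authors' implementation): a sustained loud burst of audio energy
# is treated as a hard puff and reported as a 'switch-on' event.
import math

PUFF_RMS_THRESHOLD = 0.5   # assumed loudness threshold for a "hard puff"
MIN_PUFF_FRAMES = 3        # assumed minimum duration, in consecutive frames

def rms(frame):
    """Root-mean-square energy of one audio frame (a list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def detect_puffs(frames):
    """Return the frame indices where a hard puff starts (switch-on events)."""
    events, run = [], 0
    for i, frame in enumerate(frames):
        if rms(frame) >= PUFF_RMS_THRESHOLD:
            run += 1
            if run == MIN_PUFF_FRAMES:           # sustained, not a transient spike
                events.append(i - MIN_PUFF_FRAMES + 1)
        else:
            run = 0
    return events

# Simulated signal: silence, a sustained hard puff, silence.
quiet, loud = [0.01] * 256, [0.8] * 256
frames = [quiet] * 4 + [loud] * 5 + [quiet] * 4
print(detect_puffs(frames))  # → [4]: one switch-on event at frame 4
```

In a real software switch, each detected event would be translated into whatever input the target switch-accessible program expects (e.g., an Enter keypress or a left mouse click), which is exactly the configurability problem the abstract describes.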
Keywords: Interaction techniques · Switch access · Computer access · Sip-and-puff device · Software switch · Switch accessible software · Respiration–machine interface
We would like to thank Dr. Brijnesh-Johannes Jain and Dr. Fikret Sivrikaya for their valuable suggestions. The first author holds the Ministry of National Education Scholarship of the Republic of Turkey.