This study demonstrates the capabilities of bio-potentials, especially Electrooculography (EOG), for smart environmental control applications. We propose a novel, user-friendly, low-cost, and wearable Man-Machine Interface (MMI) for practical use outside a controlled laboratory environment. The EOG-based MMI framework is built around a consumer-grade, single-channel Brain-Computer Interface (BCI) device. Acquired signals are filtered and processed for feature extraction. Eye-blink signals are detected and decoded using an Application Program Interface (API) developed in MATLAB. Four real-time control setups spanning over 300 trials were conducted to test the efficiency of EOG signals for smart environmental control applications. Additionally, real-time wheelchair control experiments were performed by five volunteers to test and cross-validate the quantitative and qualitative factors of the MMI design. Finally, the results of the wheelchair control experiments are compared and contrasted with those of similar established MMI frameworks. Overall, the four smart control setups achieved average precisions of 96.44%, 99.30%, 97.11%, and 95.78%, respectively, with response times ranging between 1.65 s and 2.23 s for two participants over 300 trials. In the wheelchair control experiment, in which five subjects volunteered, an average accuracy of 93.89% and an Information Transfer Rate (ITR) of 62.29 bits/min were obtained with zero collisions and a zero False Positive Rate (FPR). From these results it can be inferred that EOG signals can be applied to different smart environmental control applications with great potential, and that the proposed single-channel EOG-based MMI framework can provide a robust, practical, user-friendly, yet precise and responsive interface for control applications at a moderate cost.
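The blink-detection step summarized above (band-pass filtering of the single-channel signal followed by feature extraction) can be sketched as follows. The paper's own pipeline is implemented as a MATLAB API; this Python sketch is only illustrative, and the filter band, amplitude threshold, and refractory period are assumed values, not the authors' parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_blinks(eog, fs, threshold=150.0, refractory=0.3):
    """Detect eye-blink events in a single-channel EOG trace.

    Band-pass filter the raw signal to a typical EOG band
    (~0.5-10 Hz, an assumed range), then flag samples whose
    amplitude exceeds a fixed threshold, enforcing a refractory
    period so one blink is not counted twice.
    """
    # 4th-order Butterworth band-pass, applied zero-phase
    b, a = butter(4, [0.5, 10.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, eog)

    blinks = []
    last = -np.inf
    for i, v in enumerate(filtered):
        # first threshold crossing outside the refractory window
        if v > threshold and (i - last) / fs > refractory:
            blinks.append(i)
            last = i
    return blinks
```

In a control loop, each detected blink index would be mapped to a command (e.g., toggling an appliance or steering the wheelchair); the threshold would in practice be calibrated per user.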
In this investigation, an effective, low-cost, wearable interface has been presented for a variety of users and applications. When contrasted with recently established comparable MMIs, the presented framework provides a novel and proficient interface, without a Graphical User Interface (GUI), with greater accuracy and a better ITR.
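The ITR figure reported above (62.29 bits/min) is the standard throughput metric for BCI/MMI evaluation, commonly computed with Wolpaw's formula. The sketch below shows that formula; the class count and trial duration in the usage example are hypothetical, chosen only to illustrate the calculation, and do not reproduce the paper's experimental settings.

```python
import math

def wolpaw_itr(n_classes, accuracy, trial_time_s):
    """Information Transfer Rate in bits/min (Wolpaw's formula).

    bits/trial = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    scaled by 60 / trial_time_s to give bits per minute.
    """
    p = accuracy
    bits = math.log2(n_classes)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) \
            + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * 60.0 / trial_time_s

# Hypothetical usage: 4 commands, 93.89% accuracy, 2 s per trial
itr = wolpaw_itr(4, 0.9389, 2.0)
```

Note that ITR depends strongly on the assumed number of selectable commands and the trial duration, which is why reported values are only comparable across systems when those parameters are stated.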
Electrooculography · Biopotentials · Wearable HMI · Assistive technology · Smart control · Single channel MMI · Capability testing
This study did not receive any funding or research grant from any institution, organization, or individual.
Compliance with Ethical Standards
Conflict of interest
Authors Ajit Madhukerrao Choudhari and Venkatesh Jonnalagedda declare that they have no conflict of interest.
This article does not contain any studies with animals performed by any of the authors. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent was obtained from all individual participants included in the study.