
Non-verbal Communication and Touchless Activation of a Radio-Controlled Car via Facial Activity Recognition

  • Regular Paper
  • Published in: International Journal of Precision Engineering and Manufacturing

Abstract

Smart glasses technologies are being developed to improve working efficiency and quality of life in various fields. In some enterprises, these technologies help improve work quality and productivity and minimize data loss. In daily life, smart glasses serve as entertainment devices for augmented/virtual reality or as assistive manipulators for the physically challenged. Accordingly, these technologies have adopted various operating methods depending on usage, such as a touchpad, a remote control, and voice recognition. However, conventional operating methods have limitations in non-verbal and noisy situations and when users cannot use both hands. In this study, we present a method of detecting facial signals for touchless activation using a transducer. We acquired facial signals, amplified by a lever mechanism, with a load cell mounted on the hinge of an eyewear frame. We then classified the signals using a machine learning technique, the support vector machine, and evaluated classification accuracy by computing the confusion matrix over the classified categories. Through the eyewear-type signal transducer, a classified facial signal can activate an actuator such as a radio-controlled car. Overall, the proposed operating system can be useful for activating actuators or transmitting messages through classified facial activities in non-verbal situations and in situations where both hands cannot be used.
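As a rough illustration of the classification and evaluation steps summarized above, the sketch below trains a support vector machine on windowed signal features and reports accuracy and a confusion matrix under tenfold cross-validation with scikit-learn. The synthetic feature matrix, the facial-activity labels, and the SVM hyperparameters are all placeholder assumptions for illustration, not the authors' actual data or settings.

```python
# Minimal sketch of the SVM classification and confusion-matrix evaluation
# described in the abstract. X (n_samples x n_features) and y (hypothetical
# facial-activity labels) are synthetic placeholders, not the authors' data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict, StratifiedKFold
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 14))                          # e.g., 14 time-series features per window
y = rng.choice(["neutral", "wink", "chew"], size=120)   # hypothetical activity labels

clf = SVC(kernel="rbf", C=1.0, gamma="scale")           # illustrative hyperparameter choices

# Tenfold cross-validation (the "10-fold CV" of the abbreviations list)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=cv)

print("accuracy:", accuracy_score(y, y_pred))
print(confusion_matrix(y, y_pred, labels=["neutral", "wink", "chew"]))
```

On real data, per-class accuracies read directly off the diagonal of the confusion matrix, which is how a classifier like this is typically reported.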


Abbreviations

RF: Random forest
SVM: Support vector machine
LVQ: Learning vector quantization
10-fold CV: Tenfold cross-validation
BFS: Backward feature selection
linearity: Linearity
curvature: Curvature
trend: Trend
spike: Spike
entropy: Entropy
e_acf: Extended ACF
v_acf: ACF of variable
diff_acf: First-difference ACF
diff2_acf: Second-difference ACF
e_Ac: Autocorrelation
v_Ac: Autocorrelation of variable
diff_Ac: First-difference autocorrelation
diff2_Ac: Second-difference autocorrelation
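Several of the abbreviations above name time-series features extracted from the load-cell signal. As a minimal sketch, assuming common textbook definitions rather than the paper's exact formulas, the following Python snippet computes a few of them (the lag-1 ACF of the raw, differenced, and twice-differenced window, and a simple trend measure) for one synthetic signal window:

```python
# Hedged sketch: computing a few of the time-series features named above for
# one window of the load-cell signal. The definitions are common conventions,
# not necessarily the exact ones used in the paper.
import numpy as np

def acf(x: np.ndarray, lag: int = 1) -> float:
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return float(np.dot(x[:-lag], x[lag:]) / denom) if denom else 0.0

def trend_strength(x: np.ndarray) -> float:
    """R^2 of a straight-line fit; a crude stand-in for the 'trend' feature."""
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    resid = x - (slope * t + intercept)
    return float(1.0 - resid.var() / x.var()) if x.var() else 0.0

# Synthetic stand-in for one windowed facial signal
signal = np.sin(np.linspace(0, 6, 200)) + 0.1 * np.random.default_rng(1).normal(size=200)

features = {
    "e_acf":     acf(signal),                # ACF of the raw window
    "diff_acf":  acf(np.diff(signal)),       # ACF of the first difference
    "diff2_acf": acf(np.diff(signal, n=2)),  # ACF of the second difference
    "trend":     trend_strength(signal),
}
print(features)
```

A feature vector like this, computed per window, is the kind of input the SVM sketch above would consume; backward feature selection (BFS) would then prune the least informative entries.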


Acknowledgements

The authors have no conflicts of interest to declare. This research was supported by the National Research Foundation of Korea (NRF-2019R1F1A1062434).

Author information


Contributions

DYH and WGL conceived the idea for this study; WGL designed and initiated the project; DYH fabricated the experimental device; BOP performed the data analysis via machine learning; JWK conducted the experiments; and JHL performed the image analysis. All authors wrote the final version of the manuscript.

Corresponding author

Correspondence to Won Gu Lee.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 509 kb)


About this article


Cite this article

Han, D.Y., Park, B.O., Kim, J.W. et al. Non-verbal Communication and Touchless Activation of a Radio-Controlled Car via Facial Activity Recognition. Int. J. Precis. Eng. Manuf. 21, 1035–1046 (2020). https://doi.org/10.1007/s12541-019-00291-x
