
Auditory Evoked Potentials (AEPs) Response Classification: A Fast Fourier Transform (FFT) and Support Vector Machine (SVM) Approach

  • Conference paper
Proceedings of the 12th National Technical Seminar on Unmanned System Technology 2020

Part of the book series: Lecture Notes in Electrical Engineering ((LNEE,volume 770))


Abstract

Hearing loss has become the world's most widespread sensory impairment. The applicability of a traditional hearing test is limited because it requires the subject to provide a direct response. The main aim of this study is to build an intelligent hearing level evaluation method using auditory evoked potentials (AEPs). AEP responses to acoustic stimulation of fixed intensity are recorded from subjects with normal and abnormal hearing in order to detect hearing disorders. In this paper, AEP responses were captured from sixteen subjects while each subject heard an auditory stimulus in the left or right ear. Features were then extracted using the Fast Fourier Transform (FFT), Power Spectral Density (PSD), spectral centroid, and standard deviation algorithms. To classify the extracted features, a Support Vector Machine (SVM) with a Radial Basis Function (RBF) kernel was used. Finally, the performance of the classifier was evaluated in terms of accuracy, confusion matrix, true positive and false negative rates, precision, recall, and Cohen's kappa score. The maximum classification accuracy of the developed SVM model with the FFT feature was 95.29% (10 s time windows), which indicates that the method provides very encouraging performance for classifying AEP responses.
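The pipeline the abstract describes (windowed signals, FFT-based feature extraction, and an RBF-kernel SVM evaluated with accuracy and Cohen's kappa) can be sketched as below. This is a minimal illustration, not the authors' implementation: the sampling rate, window length, synthetic stand-in signals, and the particular summary statistics used as features are all assumptions.

```python
import numpy as np
from scipy.fft import rfft, rfftfreq
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)
fs = 256          # assumed sampling rate in Hz (not stated in the abstract)
win = 10 * fs     # 10 s window, matching the paper's best-performing setting

def extract_features(x, fs):
    """Summary features: mean FFT magnitude, mean PSD, spectral centroid, std."""
    mag = np.abs(rfft(x))
    freqs = rfftfreq(len(x), d=1.0 / fs)
    psd = mag ** 2 / len(x)
    centroid = np.sum(freqs * mag) / np.sum(mag)
    return np.array([mag.mean(), psd.mean(), centroid, x.std()])

def make_trial(label):
    """Synthetic stand-in for an AEP recording; real data would go here."""
    t = np.arange(win) / fs
    f, amp = (10.0, 1.0) if label == 0 else (20.0, 2.0)
    return amp * np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(win)

labels = np.array([i % 2 for i in range(80)])       # 0 = normal, 1 = abnormal
X = np.array([extract_features(make_trial(y), fs) for y in labels])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels)

# RBF-kernel SVM; features are standardized first, since RBF distances
# are sensitive to feature scale.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
acc = accuracy_score(y_te, pred)
kappa = cohen_kappa_score(y_te, pred)
print(f"accuracy={acc:.2f}  kappa={kappa:.2f}")
```

With real AEP recordings, `make_trial` would be replaced by epoching of the measured signals; the rest of the pipeline (feature extraction, scaling, RBF-SVM, metric computation) carries over unchanged.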



Acknowledgements

The authors would like to acknowledge the generous support of the Faculty of Electrical and Electronics Engineering Technology and Universiti Malaysia Pahang in providing the Fundamental Research Grant Scheme RDU190109 to support this research.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Islam, M.N., Sulaiman, N., Rashid, M., Mustafa, M., Hasan, M.J. (2022). Auditory Evoked Potentials (AEPs) Response Classification: A Fast Fourier Transform (FFT) and Support Vector Machine (SVM) Approach. In: Isa, K., et al. Proceedings of the 12th National Technical Seminar on Unmanned System Technology 2020. Lecture Notes in Electrical Engineering, vol 770. Springer, Singapore. https://doi.org/10.1007/978-981-16-2406-3_41
