
Tsallis entropy as a new single feature with the least computation time for classification of epileptic seizures

  • M. Thilagaraj
  • M. Pallikonda Rajasekaran
  • N. Arun Kumar

Abstract

Epilepsy is a neurological condition in which one experiences seizures. These can be recorded using the electroencephalogram (EEG), which is by nature a long-term recording. Much work has been reported in the literature towards automatic detection of these seizures, with both short-term and long-term recordings. In this work, we use the standard University of Bonn database for four different two-class classification problems, as addressed by various others in the literature. We use a novel single feature, the Tsallis entropy, along with five different classifiers. Compared with other reports in the literature, our method has the least computation time, as low as 0.9 ms. We achieved the highest accuracies of 92.67–100% with the decision tree classifier for the four types of two-class classification problem considered. Our method is very simple, has the fastest computation time in comparison with other features in the literature, and can therefore form a software tool that is easy to install; it also opens future opportunities towards real-time detection and prediction of epileptic seizures.

Keywords

Epilepsy · Tsallis entropy · Ictal · Preictal · EEG

1 Introduction

Epilepsy is a common disease experienced by at least 50 million people in the world, which amounts to about 1% of the world's population [1, 2]. A person with epilepsy is often not well received by society. People from south India hide this disease in matters of marriage for fear of being rejected or prohibited from marrying. Factually, the disease itself does not pose a threat, but it can become hazardous if the individual is swimming or driving, as one loses control and consciousness during certain types of epilepsy.

Seizures are of different types. They are classified primarily based on the source of the seizure within the brain, namely localized or distributed. Localized seizures are called partial or focal seizures. Partial seizures are further classified into simple partial and complex partial seizures: if one's awareness is unaffected, it is called a simple partial seizure; if one's awareness is affected, it is called a complex partial seizure. Generalized seizures are classified according to their effect on the body, and most involve loss of consciousness. These include absence (petit mal), myoclonic, clonic, tonic, tonic–clonic (grand mal), and atonic seizures. In general, there is an ictal period, during which the patient experiences the seizure, and a preceding period called the preictal period. Epileptic seizures can be recorded along with the electroencephalogram (EEG).

A large volume of work has been done towards automatic detection of epileptic seizures. Broadly, researchers have used either long-term EEG data or EEG data segments for detecting epileptic seizures. Here we consider EEG data segments for the classification of epileptic seizures. The Bonn University database [3] has EEG segments of a fixed duration in three different categories, namely normal, ictal and preictal. A lot of work has been done using this database for classification of epileptic seizures.

In terms of techniques, different methods have been employed since the 1970s. Initially, heuristic and descriptive methods were used for the detection of epileptic seizures [4]. Later, time domain, frequency domain, time–frequency domain and other nonlinear methods were all attempted for seizure detection [5]. Linear discriminant analysis (LDA) and histogram methods were also used for automatically detecting epileptic seizures [6, 7]. Seizure termination was identified using sample entropy [8]. A differential operator was also used to identify seizures [9]. Recurrence quantification analysis (RQA) [10], higher order spectra (HOS) [11], the Hurst exponent (H) [12] and different entropies [1] were all employed for the identification of epileptic seizures. There are also works that employed wavelet transforms with single-level analysis for detecting epileptic seizures [13, 14]. A multilevel wavelet approach was employed by Indiradevi et al. [15] to automatically detect epileptic spikes. Sharma et al. [16] used the fast Fourier transform (FFT) to extract features from EEG as a first step and fed these features to a neural network for identifying seizures.

Different machine learning and pattern recognition algorithms have been implemented, such as neural networks [17, 18, 19, 20, 21, 22, 23], support vector machines [24, 25], the KNN classifier [26] and the Bayesian classifier [27]. Features such as the correlation dimension [28], correlation density [29], Lyapunov exponent [30] and Kolmogorov entropy [31] have been used for onset detection. The Fourier transform was also implemented for onset detection of epileptic seizures [32, 33, 34]. Some researchers have also used wavelet transforms [35, 36, 37] to detect seizure onsets [61].

Fundamentally, researchers have used this database as a two-class or three-class problem [38, 39, 40]. We use this database for two-class classification problems. Various methods that use the same database for two-class classification, namely normal versus abnormal, are shown in Table 8. It can be noted that a few works have achieved 100% classification accuracy [41, 42, 43, 44, 45]. It can also be seen that, except for a few [46, 47, 48, 49], none of the works report the computation time involved. In this work, we highlight the importance of both computation time and accuracy towards real-time detection or classification of epileptic seizures [50, 51, 52, 53, 54, 55].

The paper is organized as follows. Section 2 presents the details of the database, brief information about the feature that we have used, and the classifiers employed in this work. Section 3 presents the results obtained. Section 4 presents a discussion of related studies on this database and compares our results with other techniques and results in the literature. The conclusion is given in Section 5.

2 Methods and materials

In this section, we describe the methods that we have employed for automatic classification of the epileptic seizures.

2.1 Data

We have used the artefact-free EEG data available from Bonn University [see Ref. 3 for more details]. The dataset contains three classes, namely normal (Set A (eyes open) and Set B (eyes closed)), preictal (Set C and Set D) and seizure (Set E), with 200, 200 and 100 recordings respectively. Each recording is a single-channel EEG signal of 23.6 s duration. In this work, we take the first 6 s of data from all the sets. Thus the normal class (Set A and Set B) contains 200 recordings, and similarly the preictal class (Set C and Set D) contains 200 recordings. The seizure class (Set E) consists of 100 recordings from the same subjects during epileptic seizures. This forms a database of 500 segments of 1024 samples (approximately 6 s) each. All these EEG segments were recorded using a 128-channel system at a sampling rate of 173.61 Hz and digitized with 12-bit A/D resolution. The signals were further band-pass filtered at 0.53–40 Hz at 12 dB/octave [3]. A sample of the data in the three classes, namely normal, preictal and ictal, is shown in Fig. 1. As in the literature, we construct four different two-class problems from the data considered: Type I: Set A versus Set E; Type II: Set A, B versus Set E; Type III: Set C, D versus Set E; and Type IV: Set A, B, C, D versus Set E.
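The windowing step described above (taking the first 1024 samples, roughly 6 s at 173.61 Hz, of each 23.6 s recording) can be sketched as follows. The synthetic recording and the helper name `first_segment` are illustrative assumptions, not part of the original work, which does not describe its loading code.

```python
import numpy as np

FS = 173.61          # sampling rate of the Bonn recordings (Hz)
SEG_SAMPLES = 1024   # first ~6 s of each 23.6 s recording

def first_segment(recording):
    """Return the first 1024-sample (~6 s) window of a recording."""
    recording = np.asarray(recording, dtype=float)
    if recording.size < SEG_SAMPLES:
        raise ValueError("recording shorter than one segment")
    return recording[:SEG_SAMPLES]

# Synthetic stand-in for one single-channel 23.6 s recording
rng = np.random.default_rng(0)
recording = rng.standard_normal(int(23.6 * FS))
print(first_segment(recording).shape)  # (1024,)
```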
Fig. 1

EEG data—normal, preictal and ictal

2.2 Tsallis entropy as the feature

The concept of entropy dates back to 1803, when the mathematician Lazare Carnot observed that energy is lost through dissipation and friction [56, 57]. This thermodynamic entropy was later brought into the field of information theory, under the name information entropy, by Shannon in 1948 [58]. Since then, many varieties of entropy have appeared in the literature, such as approximate entropy, sample entropy and permutation entropy. In this work, we use only one entropy, the Tsallis entropy (TsE), as a single feature. A brief outline of this entropy is given below.

2.2.1 Tsallis entropy (TsE)

Constantino Tsallis introduced this entropy in 1988 as a basis for generalizing standard statistical mechanics [59]. For a given discrete set of probabilities {p_i} summing to one, and any real number q, the Tsallis entropy (TsE) is given by
$$S_{q}(\{p_{i}\}) = \frac{1}{q - 1}\left( 1 - \sum\limits_{i} p_{i}^{q} \right)$$
(1)

From this equation, we can see that the computation involves only a summation, and thereby the computation time must be significantly lower than that of other methods.
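A minimal sketch of Eq. (1) in Python follows. The paper does not specify how the probabilities {p_i} are estimated from a signal, nor the value of q; the histogram-based estimator and q = 2 below are illustrative assumptions.

```python
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=100):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a signal.

    The probabilities p_i are estimated from a normalized amplitude
    histogram; both the estimator and q = 2 are illustrative choices,
    since the paper does not state them.
    """
    counts, _ = np.histogram(np.asarray(signal, dtype=float), bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # empty bins contribute nothing
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Four equally likely amplitude levels: p_i = 1/4, so S_2 = 1 - 4*(1/4)^2 = 0.75
x = np.repeat(np.arange(4), 10)
print(tsallis_entropy(x, q=2.0, bins=4))  # 0.75
```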

Table 1 shows the computation time of TsE for a single recording. It can be noticed that the calculation time for TsE is very low, which opens the gate towards real-time processing. TsE has a computation time of 0.0009 s on a 1.6 GHz computer with 4 GB RAM running MATLAB 2008. Table 2 shows the computation times of various other features used in the literature for classification of epileptic seizures. The computation times shown in Tables 1 and 2 are averages over the various single EEG recordings considered.
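The timing protocol above (averaging repeated single-segment computations) can be reproduced in outline as follows. Absolute numbers will of course depend on hardware and language; the `tsallis_entropy` helper here is the illustrative histogram-based version sketched in Sect. 2.2.1, not the authors' MATLAB code.

```python
import time
import numpy as np

def tsallis_entropy(signal, q=2.0, bins=100):
    # illustrative histogram-based estimator (an assumption, see Sect. 2.2.1)
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

segment = np.random.default_rng(0).standard_normal(1024)
runs = 200
start = time.perf_counter()
for _ in range(runs):
    tsallis_entropy(segment)
avg_ms = (time.perf_counter() - start) / runs * 1e3
print(f"average TsE computation time: {avg_ms:.4f} ms per segment")
```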
Table 1

Computation time for TsE considered

Feature    Computation time (s)
TsE        0.0009

Table 2

Computation time for various features considered in other studies

Feature                             Computation time (s)
Lyapunov exponent                   11.07
Higuchi FD                          2.5
Hurst exponent                      0.75
Approximate entropy (ApEn)          0.65
Sample entropy (SampEn)             0.64
Permutation entropy (PE) (m = 3)    0.07
PE (m = 4)                          0.153
Kolmogorov measure                  0.450

Table 3

Range (mean ± variance) of TsE for the four 2 class problems considered

Type       Problem                                          Normal                  Abnormal                p value
Type I     Set A (normal) versus Set E (abnormal)           −2329102 ± 1.67E+12     −1.2E+08 ± 1.25E+16     1.4E−18
Type II    Set A, B (normal) versus Set E (abnormal)        −3788185 ± 7.33E+12     −1.2E+08 ± 1.25E+16     2.59E−38
Type III   Set C, D (normal) versus Set E (abnormal)        −4.5E+07 ± 7.33E+15     −1.2E+08 ± 1.25E+16     2.56E−12
Type IV    Set A, B, C, D (normal) versus Set E (abnormal)  −2.8E+07 ± 4.8E+15      −1.2E+08 ± 1.25E+16     9.67E−26

In this work, we have used five classifiers, namely the Naïve Bayes classifier (NBC), the radial basis function neural network classifier (RBF), the decision tree classifier (DT), the K-nearest neighbour classifier (KNN) and Adaboost, which are briefly explained as follows.

2.2.2 Naive Bayes Classifier (NBC)

This classifier is based on Bayes' theorem. It assumes that the variables are independent random variables. With this assumption, it computes the probabilities required by Bayes' theorem for the given data.

2.2.3 Radial basis function (RBF)

These are probabilistic neural networks with radial basis activation functions in the first layer, followed by a competitive layer for classification.

2.2.4 Decision trees: functional tree (DTFT)

Functional trees [60] are one type of multivariate trees, which are classification trees that have logistic regression functions at the inner nodes and/or leaves. They can deal with multiclass target variables, nominal and numeric attributes and missing values.

2.2.5 K-nearest neighbour classifier (KNN)

This is a supervised learning technique in which a new instance is classified based on the closest training samples in feature space. It does not fit a model to the training data. When a test instance is given, it is assigned to the class most common among its K nearest neighbours. In this work, values of k ranging from 2 to 6 were considered, and distances were computed using the Euclidean distance. We obtained the highest accuracy when k was 5.
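The k sweep described above can be sketched as follows, assuming scikit-learn is available. The synthetic one-dimensional feature stands in for the TsE values; the class separation, sample counts and function names are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for a 1-D TsE feature: 200 "normal" vs 100 "ictal" values
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, 200),
                    rng.normal(4.0, 1.0, 100)]).reshape(-1, 1)
y = np.array([0] * 200 + [1] * 100)

# Sweep k = 2..6 with Euclidean distance, keeping the mean 10-fold CV accuracy
scores = {}
for k in range(2, 7):
    clf = KNeighborsClassifier(n_neighbors=k, metric="euclidean")
    scores[k] = cross_val_score(clf, X, y, cv=10).mean()

best_k = max(scores, key=scores.get)
print(best_k, round(scores[best_k], 3))
```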

2.2.6 Adaboost

Adaboost is short for adaptive boosting. This classification algorithm was formulated by Yoav Freund and Robert Schapire. It is a machine learning meta-algorithm that can be used in conjunction with weak learners to enhance their performance. Here, Adaboost is used as a standalone classifier.

2.3 Performance measures

The performance of these classifiers is assessed using accuracy, defined as the ratio of the number of correctly classified segments to the total number of segments [61].
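As a minimal sketch of this measure (the helper name is ours, not the paper's):

```python
def accuracy(predicted, actual):
    """Ratio of correctly classified segments to the total number of segments."""
    correct = sum(int(p == a) for p, a in zip(predicted, actual))
    return correct / len(actual)

print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```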

3 Results

The TsE values are calculated for the EEG data. The mean and variance of these entropy values for the four different two-class problems considered are shown in Table 3. A Student's t test is performed for each problem with the TsE feature. The resulting p values, also shown in Table 3, are far below 0.00001, suggesting that TsE has high discriminating capacity for each of the problems considered.
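The t test above can be sketched with SciPy as follows. The per-segment TsE values are synthetic stand-ins with roughly the magnitudes of Table 3, and the Welch (unequal-variance) variant is an assumption, since the paper does not state which variant was used.

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for per-segment TsE values of the two classes
rng = np.random.default_rng(1)
tse_normal = rng.normal(-2.3e6, 1.3e6, size=200)   # e.g. normal class
tse_ictal = rng.normal(-1.2e8, 1.1e8, size=100)    # ictal class (Set E)

# Two-sample t test on the single TsE feature (Welch's variant assumed)
t_stat, p_value = stats.ttest_ind(tse_normal, tse_ictal, equal_var=False)
print(f"p = {p_value:.2e}")
```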
Table 4

Performance measures of the classifiers for type I

Classifier    Accuracy (%)
NBC           100
RBF           100
DT            100
KNN           100
Adaboost      100

Table 5

Performance measures of the classifiers for type II

Classifier    Accuracy (%)
NBC           97.67
RBF           97.33
DT            98
KNN           97.67
Adaboost      97

Table 6

Performance measures of the classifiers for type III

Classifier    Accuracy (%)
NBC           88.33
RBF           91
DT            92.67
KNN           89.33
Adaboost      89

Table 7

Performance measures of the classifiers for type IV

Classifier    Accuracy (%)
NBC           94
RBF           94.2
DT            95.8
KNN           92.8
Adaboost      95.2

TsE, being the single feature in our work, is given as input to the five different classifiers. A tenfold cross-validation technique was used for evaluating the classifiers. In this scheme, the whole dataset is partitioned into 10 sets, each having the same proportion of instances from each class. Nine sets are used for training, and the remaining set is used for testing and evaluating the performance of the classifier. This procedure is repeated ten times, with a new training and testing set each time. The average of the ten performance measures is taken as the final value. The performance measures thus obtained for each of the classifiers are shown in Tables 4, 5, 6 and 7.
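The evaluation protocol above (stratified tenfold cross-validation of a decision tree on the single TsE feature) can be sketched as follows, assuming scikit-learn is available. The synthetic feature values and sample counts are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.tree import DecisionTreeClassifier

# Synthetic 1-D TsE-like feature for a Type IV-style split (400 vs 100)
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, 400),
                    rng.normal(4.0, 1.0, 100)]).reshape(-1, 1)
y = np.array([0] * 400 + [1] * 100)

# Stratified folds keep the same class proportions in every fold
skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
fold_acc = []
for train_idx, test_idx in skf.split(X, y):
    clf = DecisionTreeClassifier(random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    fold_acc.append(clf.score(X[test_idx], y[test_idx]))

print(f"mean tenfold accuracy: {np.mean(fold_acc):.3f}")
```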

From Tables 4, 5, 6 and 7 we can observe the following. For Type I, all the classifiers considered gave 100% accuracy, which shows the robustness and versatility of the TsE feature. For Type II, the DT classifier gave the highest classification accuracy of 98%, and the other classifiers also gave accuracies above 97%; the TsE feature again yields consistent accuracies irrespective of the classifier used. For Type III, the DT classifier gave the highest accuracy of 92.67%, with the other classifiers above 88%. The reduction in accuracy for this type arises because we are classifying between the preictal and ictal classes; the preictal stage precedes the onset of an epileptic seizure and is a transition period from normal to ictal. Nevertheless, TsE as a single feature discriminates fairly well, with a highest accuracy of 92.67%. For Type IV, the decision tree classifier achieved the highest classification accuracy of 95.8%, with the other classifiers above 92%. This again shows that TsE as a single feature has good discrimination capacity across all classifiers. Overall, TsE as a single feature can classify all four types of problem with fairly good accuracy, and with its significantly low computation time it is well suited to building a real-time epilepsy detection system. The various accuracies obtained with different methods in the literature for these two-class problems are discussed in the following section.

4 Discussion

In this section, we present the various techniques used in the literature for classification of this EEG data. We have reviewed only those studies that used the Bonn University dataset for two-class classification, so that our results are comparable. A summary of previous studies on the various two-class problems, along with the accuracies obtained, is given in Table 8.
Table 8

Comparison of accuracy of previous works with two class classification

Authors (year)              Features and classifier                                               Problem    Accuracy (%)   Computation time (ms)
Nigam and Graham (2004)     Nonlinear preprocessing filter + neural network                       Type I     97.2           NA
Srinivasan et al. (2005)    Time–frequency features + recurrent neural networks                   Type I     99.6           NA
Kannathal et al. (2005)     Entropies (approximate, sample) + adaptive neuro-fuzzy inference      Type I     92.2           See Table 2
                            system
Polat and Gunes (2007)      Fourier features + decision tree                                      Type I     98.72          NA
Subasi (2007)               Wavelet features + expert system                                      Type I     95             NA
Tzallas et al. (2007)       Time–frequency analysis + neural networks                             Type I     100            NA
                                                                                                  Type IV    97.73          NA
Guo et al. (2010)           Multiple-wavelet approximate entropy features + neural networks       Type I     100            See Table 2
                                                                                                  Type IV    98.27          NA
Orhan et al. (2011)         Wavelet transforms + KNN + neural network                             Type I     100            NA
                                                                                                  Type IV    100            NA
Iscan et al. (2011)         Cross correlation + PSD + SVM                                         Type I     100            10.5 (cross correlation only)
Wang et al. (2011)          Wavelet transform + Shannon entropy + KNN                             Type I     100            24 (Shannon entropy only)
Xie and Krishnan (2013)     Wavelet variances + KNN                                               Type I     100            NA
                                                                                                  Type IV    100            NA
Chen (2014)                 Dual-tree complex wavelet + Fourier features                          Type I     100            6.7
                                                                                                  Type II    100            9.4
                                                                                                  Type III   100            5.7
                                                                                                  Type IV    100            14.4
This work                   Tsallis entropy + decision tree                                       Type I     100            0.9
                                                                                                  Type II    98             0.9
                                                                                                  Type III   92.67          0.9
                                                                                                  Type IV    95.8           0.9

Primarily, it can be seen from Table 8 that TsE as a single feature has not been used before, and we are the first to use it as a feature for the four different types of two-class problem on this database, as addressed earlier in various papers in the literature.

From an initial observation of Table 3, we can see that the TsE feature values for the four types of problem considered show a visible contrast between the two classes. It has also been found in the literature that entropy values change between ictal and normal EEG [1, 5]. This is confirmed in our study.

Earlier, permutation entropy (PE) was also used for epileptic detection [2]. In that work, the 23.6 s EEG recordings were split into non-overlapping windows of 1 s duration, and PE was calculated for each window with m = 3 and m = 4. The PE values were used as the feature for four different two-class classification problems with an SVM classifier, achieving an accuracy of 86.1%.

Recently, G. Chen [62] used the dual-tree complex wavelet with Fourier features and achieved 100% accuracy for these four types of two-class problem, with a computation time as low as 5.7 ms, as shown in Table 8. In this work, we have used only TsE as the feature, whose calculation time is significantly lower, at 0.9 ms.

In addition to the above, Table 8 lists different algorithms from the literature and the levels of accuracy they obtained. It can be seen that most authors have not reported their computation time. To enable some comparison with our method, we have measured the computation times of several features from the literature on the same computer configuration; these are shown in Tables 2 and 8. For the wavelet-based and Fourier features we have not measured the computation time, but we expect those values to be larger than that of TsE, because the TsE calculation involves only a summation whereas the wavelet and Fourier transforms are more complex; indeed, the minimum computation time reported in G. Chen's work is 5.7 ms. We have marked 'NA' in Table 8 where the corresponding authors have not given the computation time and we have not measured it on our computer; for methods involving wavelet and Fourier features, we generalize from G. Chen's work. From all of the above, it can be noted that our feature, TsE, has the least computation time. Although the accuracies achieved are not 100% for problems of Types II, III and IV, our computation time is significantly lower than that of other methods in the literature, thereby opening the window towards real-time detection. We plan to add more appropriate features to increase the accuracy without further increasing the computation time.

Our method has the following specific significance and novelty.
  (a) We are the first to use TsE for this dataset and for epileptic seizure detection.
  (b) The computation time for the TsE feature considered in this work is significantly lower (Table 1) than that of other features used in the literature (Table 2). It should also be noted that our method did not involve any preprocessing of the data.
  (c) We have used only a single feature, i.e., TsE, for the classification of four different types of two-class problem. We also used tenfold cross-validation for the accuracy calculation, which was not the case in the study of G. Chen.
  (d) Having the least computation time, our method opens the window towards real-time detection, which is one of the important needs in this area of research.

However, it should be noted that this technique has been applied only to a limited amount of data. The method needs to be checked with other large databases for consistent results. It should also be noted that the EEG data considered here for classification are noise- and artefact-free segments, not continuously running EEG. As stated by Acharya et al. [5], such methods need to be further tested and validated on large databases collected at various clinical trial centres for consistency of results before they can be implemented for practical social benefit.

5 Conclusion

EEG signals can be used to differentiate between two states, namely normal and ictal. In this work, we have proposed a novel single feature, TsE, along with five classifiers for the classification of four different types of two-class problem. Our work achieved highest accuracies of 92.67–100% for the different types of problem considered, as described earlier. More importantly, our method has the least computation time, 0.9 ms, in solving these four classification problems. The proposed method is thus simple yet fast, and can therefore be developed towards real-time detection. More selective new features need to be added to increase the accuracy without further increasing the computation time. The method can also form a software tool to be installed in epilepsy diagnostic centres; however, before such deployment, it has to be tested thoroughly with various databases for consistency of accuracy.

References

  1. Acharya, U.R., Molinari, F., Sree, S.V., Chattopadhyay, S., Ng, K.H., Suri, J.S.: Automated diagnosis of epileptic EEG using entropies. Biomed. Signal Process. Control 7(4), 401–408 (2012)
  2. Nicolaou, N., Georgiou, J.: Detection of epileptic electroencephalogram based on permutation entropy and support vector machines. Expert Syst. Appl. 39(1), 202–209 (2012)
  3. Andrzejak, R.G., Lehnertz, K., Mormann, F., Rieke, C., David, P., Elger, C.E.: Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: dependence on recording region and brain state. Phys. Rev. E 64(6), 061907 (2001)
  4. Callaway, E., Harris, P.R.: Coupling between cortical potentials from different areas. Science 183(4127), 873–875 (1974)
  5. Acharya, U.R., et al.: Automated EEG analysis of epilepsy: a review. Knowl. Based Syst. (2013). https://doi.org/10.1016/j.knosys.2013.02.014
  6. Murro, A.M., King, D.W., Smith, J.R., Gallagher, B.B., Flanigin, H.F., Meador, K.: Computerized seizure detection of complex partial seizures. Electroencephalogr. Clin. Neurophysiol. 79, 330–333 (1991)
  7. Harding, G.W.: An automated seizure monitoring system for patients with indwelling recording electrodes. Electroencephalogr. Clin. Neurophysiol. 86(6), 428–437 (1993)
  8. Yoo, C.S., Jung, D.C., Ahn, Y.M., Kim, Y.S., Kim, S.G., Yoon, H., Yi, S.H.: Automatic detection of seizure termination during electroconvulsive therapy using sample entropy of the electroencephalogram. Psychiatry Res. 195(1), 76–82 (2012)
  9. Majumdar, K.: Differential operator in seizure detection. Comput. Biol. Med. 42(1), 70–74 (2012)
  10. Acharya, U.R., Vinitha Sree, S., Chattopadhyay, S., Yu, W., Alvin, A.P.C.: Application of recurrence quantification analysis for the automated identification of epileptic EEG signals. Int. J. Neural Syst. 21(3), 199–211 (2011)
  11. Acharya, U.R., Vinitha Sree, S., Suri, J.S.: Automatic detection of epileptic EEG signals using higher order cumulant features. Int. J. Neural Syst. 21(5), 1–12 (2011)
  12. Acharya, U.R., Sree, S.V., Ang, P.C.A., Yanti, R., Suri, J.S.: Application of non-linear and wavelet based features for the automated identification of epileptic EEG signals. Int. J. Neural Syst. 22(02), 1250002 (2012)
  13. Sartoretto, F., Ermani, M.: Automatic detection of epileptiform activity by single-level analysis. Clin. Neurophysiol. 110, 239–249 (1999)
  14. Subasi, A.: Automatic detection of epileptic seizure using dynamic fuzzy neural networks. Expert Syst. Appl. 31, 320–328 (2006)
  15. Indiradevi, K.P., Elias, E., Sathidevi, P.S., Nayak, S.D., Radhakrishnan, K.: A multi-level wavelet approach for automatic detection of epileptic spikes in the electroencephalogram. Comput. Biol. Med. 38, 805–816 (2008)
  16. Sharma, A., Wilson, S.E., Roy, R.: EEG classification for estimating anesthetic depth during halothane anesthesia. In: Proceedings of the 14th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, New York, pp. 2409–2410. IEEE (1992)
  17. Gabor, A.J., Leach, R.R., Dowla, F.U.: Automated seizure detection using a self-organizing neural network. Electroencephalogr. Clin. Neurophysiol. 99(3), 257–266 (1996)
  18. Ubeyli, E.D., Guler, I.: Features extracted by eigenvector methods for detecting variability of EEG signals. Pattern Recognit. Lett. 28, 592–603 (2007)
  19. Ghosh-Dastidar, S., Adeli, H.: Improved spiking neural networks for EEG classification and epilepsy and seizure detection. Integr. Comput.-Aided Eng. 14(3), 187–212 (2007)
  20. Ghosh-Dastidar, S., Adeli, H.: Spiking neural networks. Int. J. Neural Syst. 19(4), 295–308 (2009)
  21. Ghosh-Dastidar, S., Adeli, H.: A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection. Neural Netw. 22, 1419–1431 (2009)
  22. Postnov, D.E., Ryazanova, L.S., Zhirin, R.A., Mosekilde, E., Sosnovtseva, O.V.: Noise controlled synchronization in potassium coupled neural networks. Int. J. Neural Syst. 17(2), 105–113 (2007)
  23. Chen, M., Jiang, C.S., Wu, Q.X., Chen, W.H.: Synchronization in arrays of uncertain delay neural networks by decentralized feedback control. Int. J. Neural Syst. 17(2), 115–122 (2007)
  24. Shoeb, A., Edwards, H., Connolly, J., Bourgeois, B., Treves, S.T., Guttag, J.: Patient-specific seizure onset detection. Epilepsy Behav. 5(4), 483–498 (2004)
  25. Guler, I., Ubeyli, E.D.: Multiclass support vector machines for EEG signals classification. IEEE Trans. Inf. Technol. Biomed. 11(2), 117–126 (2007)
  26. Qu, H., Gotman, J.: A patient-specific algorithm for the detection of seizure onset in long-term EEG monitoring: possible use as a warning device. IEEE Trans. Biomed. Eng. 44(2), 115–122 (1997)
  27. Saab, M.E., Gotman, J.: A system to detect the onset of epileptic seizures in scalp EEG. Clin. Neurophysiol. 116(2), 427–442 (2005)
  28. Elger, C.E., Lehnertz, K.: Seizure prediction by non-linear time series analysis of brain electrical activity. Eur. J. Neurosci. 10, 786–789 (1998)
  29. Martinerie, J., Adam, C., Le Van Quyen, M., et al.: Epileptic seizures can be anticipated by non-linear analysis. Nat. Med. 4, 1173–1176 (1998)
  30. Iasemidis, L.D., et al.: Phase space topography and the Lyapunov exponent of electrocorticograms in partial seizures. Brain Topogr. 2(3), 187–201 (1990)
  31. Van Drongelen, W., Nayak, S., Frim, D.M., et al.: Seizure anticipation in pediatric epilepsy: use of Kolmogorov entropy. Pediatr. Neurol. 29, 207–213 (2003)
  32. Khan, Y.U., Gotman, J.: Wavelet based automatic seizure detection in intracerebral electroencephalogram. Clin. Neurophysiol. 114, 898–908 (2003)
  33. Guler, I., Ubeyli, E.D.: Adaptive neuro-fuzzy inference system for classification of EEG signals using wavelet coefficients. J. Neurosci. Methods 148, 113–121 (2005)
  34. Hopfengartner, R., et al.: An efficient, robust and fast method for the offline detection of epileptic seizures in long-term scalp EEG recordings. Clin. Neurophysiol. 118, 2332–2343 (2007)
  35. Adeli, H., Ghosh-Dastidar, S., Dadmehr, N.: A wavelet-chaos methodology for analysis of EEGs and EEG sub-bands to detect seizure and epilepsy. IEEE Trans. Biomed. Eng. 54(2), 205–211 (2007)
  36. Ghosh-Dastidar, S., Adeli, H., Dadmehr, N.: Mixed-band wavelet-chaos-neural network methodology for epilepsy and epileptic seizure detection. IEEE Trans. Biomed. Eng. 54(9), 1545–1551 (2007)
  37. Adeli, H., Ghosh-Dastidar, S., Dadmehr, N.: A spatio-temporal wavelet-chaos methodology for EEG-based diagnosis of Alzheimer's disease. Neurosci. Lett. 444(2), 190–194 (2008)
  38. Bandt, C., Pompe, B.: Permutation entropy: a natural complexity measure for time series. Phys. Rev. Lett. 88(17), 174102 (2002)
  39. Ouyang, G., Li, X., Dang, C., Richards, D.A.: Deterministic dynamics of neural activity during absence seizures in rats. Phys. Rev. E 79(4), 041146 (2009)
  40. Ouyang, G., Dang, C., Richards, D.A., Li, X.: Ordinal pattern based similarity analysis for EEG recordings. Clin. Neurophysiol. 121, 694–703 (2010)
  41. Bruzzo, A.A., Gesierich, B., Santi, M., Tassinari, C., Birbaumer, N., Rubboli, G.: Permutation entropy to detect vigilance changes and preictal states from scalp EEG in epileptic patients: a preliminary study. Neurol. Sci. 29, 3–9 (2008)
  42. Rényi, A.: On measures of information and entropy. In: Proceedings of the Fourth Berkeley Symposium on Mathematics, Statistics and Probability 1960, pp. 547–561 (1961)
  43. Press, W.H., Teukolsky, S.A., Vetterling, W.T., Flannery, B.P.: Section 16.5. Support vector machines. In: Numerical Recipes: The Art of Scientific Computing, 3rd edn. Cambridge University Press, New York (2007). ISBN 978-0-521-88068-8
  44. Chen, G.: Automatic EEG seizure detection using dual-tree complex wavelet-Fourier features. Expert Syst. Appl. 41(5), 2391–2394 (2014)
  45. Guler, N.F., Ubeyli, E.D., Guler, I.: Recurrent neural networks employing Lyapunov exponents for EEG signals classification. Expert Syst. Appl. 29(3), 506–514 (2005)
  46. Ghosh-Dastidar, S., Adeli, H., Dadmehr, N.: Principal component analysis-enhanced cosine radial basis function neural network for robust epilepsy and seizure detection. IEEE Trans. Biomed. Eng. 55(2), 512–518 (2008)
  47. Faust, O., Acharya, U.R., Min, L.C., Sputh, B.H.: Automatic identification of epileptic and background EEG signals using frequency domain parameters. Int. J. Neural Syst. 20(2), 159–176 (2010)
  48. Chua, K.C., Chandran, V., Acharya, U.R., Lim, C.M.: Automatic identification of epileptic EEG signals using higher order spectra. J. Eng. Med. 223(4), 485–495 (2009)
  49. Chua, K.C., Chandran, V., Acharya, U.R., Lim, C.M.: Application of higher order spectra to identify epileptic EEG. J. Med. Syst. 35(6), 1563–1571 (2011)
  50. 50.
    Sharma, R., Pachori, R.B.: Classification of epileptic seizures in EEG signals based on phase space representation of intrinsic mode functions. Expert Syst. Appl. 42(3), 1106–1117 (2015)CrossRefGoogle Scholar
  51. 51.
    Acharya, U.R., Chua, K.C., Lim, T.C., Dorithy, J.S.Suri: Automatic identification of epileptic EEG signals using nonlinear parameters. J. Mech. Med. Biol. 9(4), 539–553 (2009)CrossRefGoogle Scholar
  52. 52.
    Kolmogorov, A.N.: Three approaches to the definition of the concept of quantity of information. Probl. Inf. Transm. 1(1), 1–7 (1965)MathSciNetGoogle Scholar
  53. 53.
    Lempel, A., Ziv, J.: On the complexity of finite sequence. IEEE Trans. Inf. Theory 22, 75–81 (1976)MathSciNetzbMATHCrossRefGoogle Scholar
  54. 54.
    Fu-Zen, Shaw, Ruei-Feng, Chen, Hen-Wai, Tsao, Chen-Tung, Yen: Algorithmic complexity as an index of cortical function in awake and pentobarbital-anesthetized rats. J. Neurosci. Methods 93, 101–110 (1999)CrossRefGoogle Scholar
  55. 55.
    Rapp, P.E., Zimmermean, I.D., Vining, E.P., Cohen, N., Albano, A.M., et al.: The algorithmic complexity of neural spicke trains increases during focal seizures. J. Neurosci. 14, 4731–4739 (1995)Google Scholar
  56. 56.
    Naterer, G.F., Camberos, J.A.: Entropy Based Design and Analysis of Fluids Engineering Systems. CRC Press, Boca Raton (2008)zbMATHCrossRefGoogle Scholar
  57. 57.
    Muller, I.: A History of Thermodynamics: The Doctrine of Energy and Entropy. Springer, New York (2007)zbMATHGoogle Scholar
  58. 58.
    Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27(3), 379–423 (1948)MathSciNetzbMATHCrossRefGoogle Scholar
  59. 59.
    Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52, 479–487 (1988)MathSciNetzbMATHCrossRefGoogle Scholar
  60. 60.
    Book Section 2001@ 978-3-540-42956-2 Discovery Science 2226 Lecture Notes in Computer Science E Jantke, Klaus P. Shinohara, Ayumi 10.1007/3-540-45650-3_9 Functional Trees, Springer, Berlin (2001)Google Scholar
  61. 61.
    Acharya, U.R., Sree, S.V., Alvin, A.P.C., Suri, J.S.: Use of principal component analysis for automatic classification of epileptic EEG activities in wavelet framework. Expert Syst. Appl. 39(10), 9072–9078 (2012)CrossRefGoogle Scholar
  62. 62.
    Chen, G.: Automatic EEG seizure detection using dual-tree complex wavelet-fourier features. Expert Syst. Appl. 41, 2391–2394 (2014)CrossRefGoogle Scholar

Copyright information

© The Author(s) 2018

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  • M. Thilagaraj (1)
  • M. Pallikonda Rajasekaran (2)
  • N. Arun Kumar (3)
  1. Department of Instrumentation and Control Engineering, Kalasalingam University, Krishnankoil, India
  2. Department of Electronics and Communication Engineering, Kalasalingam University, Krishnankoil, India
  3. Sastra University, Thanjavur, India