Support Vector Machine-Based Classifier for the Assessment of Finger Movement of Stroke Patients Undergoing Rehabilitation



Traditionally, clinical evaluation of motor paralysis following stroke has been of value to physicians and therapists because it allows for immediate pathophysiological assessment without the need for specialized tools. However, current clinical methods do not provide objective quantification of movement; therefore, they are of limited use to physicians and therapists when assessing responses to rehabilitation. The present study aimed to create a support vector machine (SVM)-based classifier to analyze and validate finger kinematics by using the Leap Motion Controller. Classifier results were compared with therapist assessments of 24 stroke patients.


A non-linear SVM was used to classify data according to the Brunnstrom recovery stages of finger movements by focusing on peak angle and peak velocity patterns during finger flexion and extension. One thousand bootstrap data values were generated by randomly drawing a series of sample data from the actual normalized kinematics-related data. The bootstrap data values were randomly divided into training (940) and testing (60) datasets. After an SVM classification model was established by training with the normalized kinematics-related parameters of peak angle and peak velocity, the testing dataset was used to predict the classification of paralytic movements.


High separation accuracy was obtained (mean 0.863; 95% confidence interval 0.857–0.869; p = 0.006).


This study highlights the ability of artificial intelligence to assist physicians and therapists in evaluating hand movement recovery of stroke patients.


The most common paresis following stroke is contralateral hemiplegia involving the upper extremities and fingers [1]. Stroke can result in motor dysfunction, such as paralysis of one side of the body [2]. The majority of stroke survivors experience ongoing limitations in their activities of daily living, resulting in impaired quality of life. Assessment of motor dysfunction is mainly dependent on subjective measures or patient descriptions, which have limited applicability in evaluating rehabilitation interventions such as hand dexterity training and spasticity-reducing treatments.

Objective measurement of motor dysfunction and finger activity may improve the quality of testing and evaluation. One drawback of most traditionally used clinical assessment methods is that while they report whether a person can implement a task or not (for example, lift and move a small object), they fail to quantify the process of the activity, amount of compensatory movements from other joints, time to peak velocity, or sequence of joint involvement. Measurement of these parameters may provide better insight into the underlying mechanisms of movement disorders [3, 4].

The recovery process of hand paralysis is typically evaluated by physicians and therapists before and after rehabilitation exercises [5]. The Brunnstrom Recovery Stages (BRSs) system is a commonly used tool for evaluating paralytic symptoms of hands and fingers and for estimating the degree of recovery with kinematics-related parameters (peak angle or final forms of finger flexion or extension) [6]. Traditionally, clinical evaluation of paralysis has been favored by physicians and therapists because it provides immediate evaluation without the need for special tools. However, this assessment does not capture movement time, that is, the interval between the start and end of movement of the paralytic hand. The status of recovery from paralysis may be more accurately assessed by considering the velocity as well as the range of voluntary movements [5, 7]. Furthermore, evaluation of the kinematics-related velocity of hand movement may provide greater insight into paralysis recovery even in cases involving an unchanged range of motion (ROM).

Various three-dimensional (3D) motion analyzers that provide temporal and spatial analysis of limb movement have been applied to assess rehabilitation progress [8,9,10]. However, all the described methods have limitations. The Crescent Vicon technology is space-occupying, requires markers that can be recognized by limb-sensor cameras, and involves considerable preparation time prior to measurement [11]. The Microsoft Kinect v2 device utilizes infrared sensors to perform high-accuracy 3D analysis of limb motion without the requirement of markers, but the software can only track the movement of 25 joints in the body. Furthermore, its two trackable positions of the hand tip (collective fingertips) and thumb have been reported to be unsuitable for the analysis of fine movements [12,13,14]. The Leap Motion Controller (LMC; United States Patent US 8,638,989 B2) uses three infrared irradiators and two infrared receivers, and it can be used to detect hand movements with a temporal resolution of up to 120 Hz and a spatial resolution of 1/100 mm without the need for markers. The device can also recognize hand gestures [15, 16].

Simultaneously recorded movement data for finger flexion and extension in patients with hemiplegia may help physicians and therapists evaluate the state of recovery of hand movement. However, technologies that use artificial intelligence, such as the classifier and calculation methodology, must be able to perform assessments equivalent to those of a medical practitioner to have clinical utility. The present study investigated the utility of artificial intelligence for assessing recovery of hand dysfunction following stroke through the analysis of finger kinematics obtained with the LMC by using a support vector machine (SVM).



A total of 24 patients were enrolled from the 154 stroke inpatients treated at Tokyo Dental College Ichikawa General Hospital between June 1, 2016 and November 30, 2016. Thirty patients initially satisfied the inclusion criteria (age 20 years or older and no previous diagnosis of stroke) and did not meet the exclusion criteria (one or more fingers missing or severely limited ROM, difficulty understanding verbal instructions due to dementia or aphasia, assignment to bedrest, difficulty remaining seated for 30 min, or difficulty maintaining limb position for measurements even with assistance). All 30 participants received written and oral explanations of the study and provided signed consent to participate. However, six patients (two females, four males) were excluded from the analysis because the infrared sensors of the LMC did not provide adequate data; the skin temperatures of these patients' fingers were probably low, and the infrared sensors might not have been able to distinguish them from the background temperature. Finally, data from the five females and 19 males who completed the study were analyzed. The demographic and stroke-related information of the participants is detailed in Table 1.

Table 1 Demographic and stroke-related information of participants

Ethical Considerations

In conformance with the ethics policy for medical research with human subjects, the design and protocol of the present study were approved by the Ethics Committee of Saitama Prefectural University (Approval no. 27083) and the Independent Ethics Committee of Tokyo Dental College Ichikawa General Hospital (I15-71).


Digital Measurements

The LMC uses an infrared sensor and obtains joint-centered coordinates via the Unity game engine. We created an original application to record finger movement (code name Fahrenheit; Patent no. 6375328; Saitama Prefectural University, Japan). The vector representing each presumed finger bone was derived from two of the joint coordinates representing finger joints. The joint angles and distances between joints for each set of assumed metacarpophalangeal, proximal interphalangeal, and distal interphalangeal joints were then derived by using the law of cosines in the application. The LMC with Fahrenheit was operated by using a laptop computer with a Windows 8 64-bit operating system (Microsoft, Kobe, Japan), and the resulting data were stored on the hard drive of the same computer.

The LMC with the Fahrenheit application provides a 3D representation of the finger position in the form of x, y, and z axis coordinates in millimeters within the frame of reference. For example, a fingertip position of (x, y, z) = (100, 100, −100) implies that x = 10 cm, y = 10 cm, and z = −10 cm. Given two vectors (x1, y1, z1) and (x2, y2, z2) in 3D space, the cosine formula was used in the Fahrenheit application to calculate finger ROM. The angular velocity was calculated by dividing the frame-to-frame change in finger ROM angle by the sampling interval at the 60 Hz temporal resolution.
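As a minimal sketch of the calculation described above (the function names are ours, not part of the Fahrenheit application), the joint angle can be obtained from three LMC coordinates with the cosine formula, and the angular velocity from successive angle samples at 60 Hz:

```python
import math

def joint_angle(p_prox, p_joint, p_dist):
    """Angle (degrees) at p_joint between the two presumed bone vectors,
    computed with the cosine formula from three (x, y, z) coordinates
    given in millimeters."""
    u = tuple(a - b for a, b in zip(p_prox, p_joint))
    v = tuple(a - b for a, b in zip(p_dist, p_joint))
    dot = sum(ua * va for ua, va in zip(u, v))
    norm = math.dist(u, (0, 0, 0)) * math.dist(v, (0, 0, 0))
    return math.degrees(math.acos(dot / norm))

def angular_velocities(angles_deg, fps=60):
    """Frame-to-frame angular velocity (deg/s): each angle change is
    divided by the sampling interval 1/fps (1/60 s at 60 Hz)."""
    return [(a2 - a1) * fps for a1, a2 in zip(angles_deg, angles_deg[1:])]

# A right angle at the joint: the two bones lie along the x and y axes.
print(joint_angle((100, 0, 0), (0, 0, 0), (0, 100, 0)))  # 90.0
```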

The measurement setup consisted of the LMC, a software package to obtain and record hand and wrist ROM from the Fahrenheit application, and a Unity-based graphical user interface that instructed participants to perform movements and analyzed the data.

Two movement parameters were measured in accordance with the BRS assessment of disjunctive finger movement: all-fingers flexion, during which all fingers were maximally flexed from the position of maximum extension; and all-fingers extension, during which all fingers were maximally extended from the position of maximum flexion. The two measurements were analyzed from the LMC coordinates by using an algorithm. The algorithm output measurement values ranging from 0 to 1 during voluntary ROM at a resolution of 0.001 mm and a sampling rate of 60 frames per second (fps). A value of 0 corresponds to maximal extension of the 14 finger joints of a hand (5 metacarpophalangeal, 4 proximal interphalangeal, and 4 distal interphalangeal joints, plus 1 interphalangeal joint; 0° in normal adults, converted to 0), and a value of 1 corresponds to maximal flexion (computed as 90° multiplied by 14 joints, i.e., 1260°, converted to 1).
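The 0-to-1 conversion described above can be sketched as follows (a minimal illustration under the stated assumption of 90° maximal flexion per joint; the function name is ours):

```python
def normalized_flexion(joint_angles_deg):
    """Convert the summed flexion of the 14 finger joints (5 MCP, 4 PIP,
    4 DIP, and 1 IP) to the 0-1 output scale: 0 deg total (maximal
    extension) maps to 0, and 90 deg x 14 joints = 1260 deg (maximal
    flexion) maps to 1."""
    assert len(joint_angles_deg) == 14
    max_total_deg = 90.0 * 14  # 1260 degrees
    return sum(joint_angles_deg) / max_total_deg

print(normalized_flexion([90.0] * 14))  # 1.0
print(normalized_flexion([0.0] * 14))   # 0.0
```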

Brunnstrom Recovery Stages

The BRSs represent six stages on an ordinal scale, from flaccid paralysis (stage I) to free disjunctive movement (stage VI). Determination of the stage for fingers was performed on the basis of flexion, extension, pinch, and grasp formation [6]. The BRS criteria can also be applied to the Fugl–Meyer standardized assessment of movement capability in stroke cases, and its correlation with the Stroke Impairment Assessment Set has been established [7].


Measurements were performed with participants seated in a chair or wheelchair, with the shoulder joint on the paretic side relaxed in a slightly abducted position, the elbow joint at approximately 90°, the forearm pronated, and the wrist joint in the neutral position (Fig. 1). First, the participants were asked to hold their hands over the LMC so that the tester could confirm recognition of the fingers by the device. The tester simultaneously gave the signal to begin measurement and pressed the measurement button. Second, the tester provided oral, action-based instructions for the starting limb position, confirmed assumption of the position, and instructed the participant to maintain the position for 5 s. At 5 s after beginning the measurement, the tester orally instructed the participant to begin the movement. The participant maintained the maximally attained limb position until the end of the measurement period (10 s after beginning the movement). After one or two practice attempts, an experimental measurement was performed once, with each participant demonstrating two movements of the paretic side: all-fingers extension and flexion. The analysis range for each movement was determined by a preliminary investigation; measurement data were calculated from 0 to 7 s after the tester's movement cue.
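Extracting the 0–7 s analysis window from such a recording is straightforward at a fixed frame rate. A sketch, assuming the cue occurs 5 s after the start of a 60 fps recording (the function and variable names are illustrative, not from the Fahrenheit application):

```python
FPS = 60  # LMC sampling rate used in this study

def analysis_window(samples, cue_time_s=5.0, window_s=7.0, fps=FPS):
    """Return the samples from 0 to 7 s after the movement cue.
    With the cue at 5 s and a 60 fps recording, the window covers
    frames 300..719 (7 s = 420 frames)."""
    start = int(cue_time_s * fps)
    stop = start + int(window_s * fps)
    return samples[start:stop]

frames = list(range(20 * FPS))             # a 20-s recording of frame indices
window = analysis_window(frames)
print(len(window), window[0], window[-1])  # 420 300 719
```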

Fig. 1

Experimental setup. a Leap motion controller (LMC) setup for positioning the arms of participants. b Animation of hand motion generated from data detected from the patient’s hand by using the LMC

Therapist Assessment of Finger Movement Capability

Finger movement assessment of each participant during the Brunnstrom recovery test of finger function was performed by occupational or physical therapists (n = 3; 6 ± 3 years of post-qualification experience; mean age, 28 ± 3 years) who had not previously met any of the participants. Assessment was performed by observation of video recordings of finger movements. The therapists assessed the movements in accordance with the BRS method for well-accomplished movements. All therapist assessments were performed independently, and each therapist was blinded to the results obtained by the other therapists and the SVM.

Data Analysis

The kinematics-related parameters evaluated were the peak angle and peak velocity of finger flexion and extension for each participant. To normalize the finger angles, the measured angle changes were first divided individually by the maximum angle. Second, to adjust all initial finger positions to 0, the initial angle was subtracted from all data. The peak angle was calculated from the recorded finger angle over the 7-s movement duration. The peak velocity was calculated from the maximum slope of the differentiated finger angle curve over the 7-s movement duration. Correlations between the kinematics-related parameters (peak angle and peak velocity of finger flexion and extension) and BRSs were examined by Spearman's rank correlation coefficient.
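The two normalization steps and the two derived parameters can be sketched as follows (a minimal illustration; the function names are ours, and the three-sample trace below is synthetic, not study data):

```python
def normalize_trace(angles):
    """Normalize a recorded angle trace: divide each sample by the
    maximum angle, then subtract the (scaled) initial angle so every
    trace starts at 0."""
    peak = max(angles)
    scaled = [a / peak for a in angles]
    return [a - scaled[0] for a in scaled]

def peak_angle(trace):
    """Largest normalized angle reached during the 7-s window."""
    return max(trace)

def peak_velocity(trace, fps=60):
    """Maximum slope of the differentiated angle curve (per second),
    taking frame-to-frame differences at the 60 Hz sampling rate."""
    return max((b - a) * fps for a, b in zip(trace, trace[1:]))

trace = normalize_trace([10.0, 20.0, 40.0])
print(trace)                 # [0.0, 0.25, 0.75]
print(peak_angle(trace))     # 0.75
print(peak_velocity(trace))  # 30.0
```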

Thereafter, 1000 bootstrap data values were generated by randomly drawing a series of sample data, with replacement, from the actual kinematics-related data. This bootstrap resampling method is widely used in demographic studies [17,18,19]. The bootstrap kinematics-related data were then input into a non-linear SVM [20]. This powerful classification method can discriminate non-linearly separable data by using kernel functions to map the data to a higher-dimensional space in which the data become more separable [20, 21]. Equation (1) shows the basic concept of the SVM model:

$$y = w^{T} x + b$$

where y is the output, wT is the transpose of the normal (weight) vector, x is the input, and b is the offset. Equation (2) shows the optimization problem that creates a discriminant function maximizing the margin:

$${\mathop {\hbox{min} }\limits_{w,b}} \frac{1}{2}{\vert\vert w\vert\vert}^{2}$$

subject to \(y_{i} \left( w^{T} x_{i} + b \right) \ge 1,\quad i = 1, \ldots, m\)

The corresponding Lagrange (undetermined) multiplier formulation is shown in Eq. (3):

$$\mathop {\hbox{min} }\limits_{\alpha } \frac{1}{2}\sum\limits_{i = 1, j = 1}^{n} {y_{i} y_{j} \alpha_{i} \alpha_{j} \kappa \left( {x_{i} , x_{j} } \right)} - \sum\limits_{i = 1}^{n} {\alpha_{i} }$$

subject to \(\sum\limits_{i = 1}^{l} {y_{i} \alpha_{i} } = 0,\;\;\alpha_{i} \ge 0,\quad i = 1, \ldots, l\). Equations (4) and (5) show a kernel method that constructs an optimal discriminant function. The optimal solution is α = (α1, …, αl)T. A support vector index j (i.e., with αj > 0) is chosen to calculate the threshold:

$$b = y_{j} - \sum\limits_{i = 1}^{l} {y_{i} \alpha_{i} \kappa \left( {x_{i} , x_{j} } \right)}$$

and construct the decision function:

$$f(x) = \text{sgn} \left( {\sum\limits_{i = 1}^{l} {y_{i} \alpha_{i} \kappa \left( {x_{i} , x} \right)} + b} \right)$$

Thus, the linear discriminant of Eq. (1) is extended to the non-linear case through Eqs. (2) to (5). In our case, the SVM focused on peak angle and peak velocity patterns and found the hypersurface that maximized the margin between the five distributions, classifying the data into five BRS classes: stages I–II, III, IV, V, and VI.

The bootstrap kinematics-related data were then randomly divided into 940 training data values and 60 testing data values. The training dataset was used to establish the SVM classification model. After training, the model was applied to the testing dataset to predictively assign each value to the appropriate BRS class. This validation procedure was repeated 10 times, and the accuracy rate was calculated as AR = (TP + TN)/N, where AR is the accuracy rate, TP the number of true positives, TN the number of true negatives, and N the sum of true positives, true negatives, false positives, and false negatives. This procedure verified that the trained SVM could generalize to peak angle and peak velocity patterns not seen during training [22]. All analyses were performed with R 3.4.0 software (R Foundation for Statistical Computing, Vienna, Austria).
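The bootstrap-and-classify pipeline can be sketched in Python with scikit-learn standing in for the R implementation used in the study. The synthetic "actual" records and their class centers below are illustrative assumptions, not the study's measurements:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Illustrative stand-in for the actual records: (peak angle, peak velocity)
# pairs for five BRS classes (I-II, III, IV, V, VI), 10 samples per class.
actual_X = np.vstack([rng.normal([c * 0.2, c * 0.15], 0.03, size=(10, 2))
                      for c in range(5)])
actual_y = np.repeat(np.arange(5), 10)

# Generate 1000 bootstrap values by resampling the actual data with replacement.
idx = rng.integers(0, len(actual_y), size=1000)
X, y = actual_X[idx], actual_y[idx]

# Ten repetitions of the 940/60 split with a non-linear (RBF-kernel) SVM;
# accuracy_score is the fraction of correct assignments, i.e. (TP + TN)/N.
rates = []
for rep in range(10):
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=940, test_size=60, random_state=rep)
    model = SVC(kernel="rbf").fit(X_tr, y_tr)
    rates.append(accuracy_score(y_te, model.predict(X_te)))
print(f"mean accuracy over 10 splits: {np.mean(rates):.3f}")
```

Because the synthetic class centers are well separated, the sketch yields a high mean accuracy; with real, overlapping patient data the rate would be lower, as in the reported 0.850–0.878 range.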


The BRS results for all participants were agreed upon by the 3 therapists on the day of the experiment (see the last line of Table 1). The time-series plots of normalized finger angles recorded during the 7-s movement duration for each BRS are illustrated for the 24 patients in Figs. 2 and 3. The plots reveal that ROM increased as the BRS increased. The time-series plots of normalized velocities during the 7-s movement duration for each BRS are shown in Fig. 4. The peak angles of finger flexion increased with the BRS stage, whereas those of finger extension decreased (Spearman's rank correlation coefficients; finger flexion, r = 0.681, p = 0.003; finger extension, r = −0.848, p < 0.0001) (Fig. 5a, b). The peak velocities of finger flexion also increased with the BRS stage, whereas the peak velocities of finger extension did not correlate with the BRS stage (Spearman's rank correlation coefficients; finger flexion, r = 0.776, p = 0.0002; finger extension, r = 0.285, p = 0.267) (Fig. 5c, d). The bootstrap data values closely matched the actual kinematics-related data (Table 2). After the SVM classification model was established by training based on 940 kinematics-related bootstrap data values of peak angle and peak velocity, the 60 testing bootstrap data values were predictively assigned to BRSs. The accuracy rate of the 10 SVM classifications ranged from 0.850 to 0.878 (mean 0.863; 95% confidence interval 0.857–0.869; p = 0.006), which indicated high separation accuracy and low variability (Fig. 6).

Fig. 2

Plots of finger angles recorded from the 24 participants. Plots of a finger extension and b flexion trials. The transverse axis indicates the time course; the bold line during 0–7 s indicates the data used for analysis. The vertical axis indicates the voluntary range of motion of patients with stroke; 1 indicates full flexion and 0 full extension of all finger joints. Lines indicate individual changes in finger angles among the 24 patients

Fig. 3

Plots of finger angles recorded from the 24 participants with Brunnstrom recovery stages 1–6, as evaluated by therapists. The solid lines and gray zones indicate the means and standard errors of the mean, respectively. Graphs show results of a–e finger flexion and f–j finger extension trials. The time point of 0 s represents the movement start cue. Range of motion increases at higher stages. Abbreviations: BRSs, Brunnstrom recovery stages

Fig. 4

Plots of finger velocities. The solid lines and gray zones indicate the means and standard errors of the mean, respectively. Graphs show results of a–e finger flexion and f–j finger extension trials. The time point of 0 s represents the movement start cue. The velocities observed at stage 4 and above are higher than those at stage 3 and below. BRSs Brunnstrom recovery stages

Fig. 5

Bar graphs of mean peak angles and velocities. Graphs show mean peak angles for a finger flexion and b finger extension trials, and mean peak velocities for c finger flexion and d finger extension trials. The error bars represent the standard error of the mean. Peak angles of finger flexion increase and those of extension decrease as BRS increases; higher BRSs exhibit higher peak velocities. BRSs Brunnstrom recovery stages

Table 2 Kinematics-related actual and bootstrap data
Fig. 6

Accuracy rates of SVM classifications. The solid line and thick gray zone represent mean and 95% confidence interval of support vector machine classifications

Discussion and Conclusions

This study analyzed the kinematics involved in finger flexion and extension in stroke patients by using machine-learning technology and bootstrap data generated from actual kinematics-related data. It investigated the potential to train artificial intelligence to predictively assign the BRS finger movement classification. According to previous reports, accuracy rates of SVM-based classifiers were 67–97% [23] and 86–88% [24]. Our results demonstrated that high separation accuracy was obtained by using an established SVM classification model based on normalized kinematics-related parameters for peak angle and peak velocity. This suggests that such a classifier could be used to distinguish the state of recovery of the fingers by assessing their peak angle and velocity during flexion and extension. Artificial intelligence based on the bootstrap method may be able to judge the state of finger paralysis more accurately if an increased amount of actual finger flexion and extension data were recorded to enhance the training of the classifier. This presents the possibility that artificial intelligence can improve prognostic assessment as well as prediction-based therapeutic approaches and enhance the accuracy of results achieved by human assessment [25,26,27].

The status of hand paralysis recovery is typically evaluated by physicians and therapists by examining the voluntary ROM [28]. Few assessment techniques consider the velocity of finger movement, even though it is thought to be a useful kinematic index of voluntary movement. The development of tools that evaluate the recovery status of paralysis is ongoing. For example, 3D motion capture systems, such as the LMC, have recently been developed to record the kinematics of human fingers [29]. Developments in the field of robotic rehabilitation are therefore beginning to provide precise tools for the evaluation of human motion [30,31,32]. Currently, it is possible to measure the kinematics of finger flexion and extension during a short time by using a motion capture system to collect data. However, this form of artificial intelligence is not currently suitable for clinical use because of the limited amount of patient data.

The present study had some limitations. Only two parameters of paralysis (finger flexion and extension) could be separated by using artificial intelligence and generated bootstrap data values because of the very small sample size available. The accuracy rate of SVM classification in this study was only 85–88%. Because the bootstrap method was used, the variance of the generated training data was smaller than that of the actual kinematics-related data, which might have affected the accuracy of the SVM classifier. The actual angles and velocities of finger flexion correlated with BRSs, whereas the actual velocities of finger extension did not. Even though the results of the correlation analysis between actual BRSs and kinematics-related data validated the bootstrap data, the generation of bootstrap data from small amounts of actual data in each BRS category is a limitation of this study.

A detailed separation of recovery stages would require greater numbers of highly variable actual samples to improve the training of the classifier. In addition, an SVM-based classifier was used instead of vector quantization (VQ) in this study. The SVM had unique advantages because it can recognize patterns in small training samples using joint peak angle and velocity. VQ is the process of replacing a vector-represented dataset with a finite number of representative patterns with minimal information loss [33, 34]. VQ has also been applied in other scientific data-processing contexts [35, 36], and it might be used for high-quality motion capture in future research.

Comparisons of the different measurement techniques of finger ROM have been performed [37], and flexion and extension of all fingers were assessed with the LMC and a goniometer (the standard technique). The analogue goniometer measured the dorsal side of the hand and wrist (the center of rotation was exterior to the joint), whereas the LMC estimated the center of rotation as being interior to the joint. The LMC estimated rather than measured the joint angles during occlusion. This may explain the smaller standard deviations for LMC results compared to goniometer-acquired data [38]. The issues with measurements during occlusion can be solved by using multiple LMCs. Placidi et al. used two LMCs to track the position of the hand in 3D and reported a reduction in occlusions without incurring further complications [31].

As new motion sensor and capture techniques are developed, future studies focusing on the use of these in combination with human diagnosis or predictive prognosis for enhancing hand and finger rehabilitation are required. As an example of this approach, a deep learning architecture for video-based person reidentification has been developed [39]. The kinematic indices of deviations in arm motion have been presented; these quantify the degree to which an individual’s joint-angle curves deviate from the mean curves of a normal population. However, the indices do not describe the manner of deviation [40].

In conclusion, the present study demonstrated a method using artificial intelligence to evaluate the recovery of hand movements of stroke patients. This method has the potential for enhancing the diagnosis and prognosis of patients undergoing rehabilitation. Further studies are required to enhance the feature learning model and optimize motion capture.


References

1. Coupar, F., Pollock, A., Rowe, P., Weir, C., & Langhorne, P. (2012). Predictors of upper limb recovery after stroke: A systematic review and meta-analysis. Clinical Rehabilitation, 26(4), 291–313.

2. Hou, L., Du, X., Chen, L., Li, J., Yan, P., Zhou, M., et al. (2018). Exercise and quality of life after first-ever ischaemic stroke: A two-year follow-up study. International Journal of Neuroscience, 128(6), 540–548.

3. DeJong, S. L., Birkenmeier, R. L., & Lang, C. E. (2012). Person-specific changes in motor performance accompany upper extremity functional gains after stroke. Journal of Applied Biomechanics, 28(3), 304–316.

4. McCrea, P. H., Eng, J. J., & Hodgson, A. J. (2002). Biomechanics of reaching: Clinical implications for individuals with acquired brain injury. Disability and Rehabilitation, 24(10), 534–541.

5. Fugl-Meyer, A. R., Jaasko, L., Leyman, I., Olsson, S., & Steglind, S. (1975). The post-stroke hemiplegic patient. 1. A method for evaluation of physical performance. Scandinavian Journal of Rehabilitation Medicine, 7(1), 13–31.

6. Brunnstrom, S. (1966). Motor testing procedures in hemiplegia: Based on sequential recovery stages. Physical Therapy, 46(4), 357–375.

7. Duncan, P. W., Propst, M., & Nelson, S. G. (1983). Reliability of the Fugl–Meyer assessment of sensorimotor recovery following cerebrovascular accident. Physical Therapy, 63(10), 1606–1610.

8. Vakanski, A., Jun, H. P., Paul, D., & Baker, R. (2018). A data set of human body movements for physical rehabilitation exercises. Data (Basel).

9. Kim, K., Song, W. K., Lee, J., Lee, H. Y., Park, D. S., Ko, B. W., et al. (2014). Kinematic analysis of upper extremity movement during drinking in hemiplegic subjects. Clinical Biomechanics (Bristol, Avon), 29(3), 248–256.

10. Klotz, M. C., Kost, L., Braatz, F., Ewerbeck, V., Heitzmann, D., Gantz, S., et al. (2013). Motion capture of the upper extremity during activities of daily living in patients with spastic hemiplegic cerebral palsy. Gait & Posture, 38(1), 148–152.

11. Chou, S. W., Wong, A. M., Leong, C. P., Hong, W. S., Tang, F. T., & Lin, T. H. (2003). Postural control during sit-to-stand and gait in stroke patients. American Journal of Physical Medicine and Rehabilitation, 82(1), 42–47.

12. Tanaka, R., Takimoto, H., Yamasaki, T., & Higashi, A. (2018). Validity of time series kinematical data as measured by a markerless motion capture system on a flatland for gait assessment. Journal of Biomechanics, 71, 281–285.

13. Timmi, A., Coates, G., Fortin, K., Ackland, D., Bryant, A. L., Gordon, I., et al. (2018). Accuracy of a novel marker tracking approach based on the low-cost Microsoft Kinect v2 sensor. Medical Engineering & Physics, 59, 63–69.

14. Otte, K., Kayser, B., Mansow-Model, S., Verrel, J., Paul, F., Brandt, A. U., et al. (2016). Accuracy and reliability of the Kinect version 2 for clinical measurement of motor function. PLoS ONE, 11(11), e0166532.

15. Niechwiej-Szwedo, E., Gonzalez, D., Nouredanesh, M., & Tung, J. (2018). Evaluation of the leap motion controller during the performance of visually-guided upper limb movements. PLoS ONE, 13(3), e0193639.

16. Smeragliuolo, A. H., Hill, N. J., Disla, L., & Putrino, D. (2016). Validation of the leap motion controller using markered motion capture technology. Journal of Biomechanics, 49(9), 1742–1750.

17. Kim, S. H., Park, E. Y., Joo, J., & Chung, J. (2018). The De Ritis and neutrophil-to-lymphocyte ratios may aid in the risk assessment of patients with metastatic renal cell carcinoma. Journal of Oncology, 2018, 1953571.

18. Chiu, C. T., Hayward, M. D., Chan, A., & Matchar, D. B. (2019). Educational differences in the compression of disability incidence in the United States. SSM Population Health, 7, 100347.

19. Schaffar, R., Belot, A., Rachet, B., & Woods, L. (2019). On the use of flexible excess hazard regression models for describing long-term breast cancer survival: A case-study using population-based cancer registry data. BMC Cancer, 19(1), 107.

20. Davatzikos, C., Ruparel, K., Fan, Y., Shen, D. G., Acharyya, M., Loughead, J. W., et al. (2005). Classifying spatial patterns of brain activity with machine learning methods: Application to lie detection. Neuroimage, 28(3), 663–668.

21. Mohammadpoory, Z., Nasrolahzadeh, M., & Haddadnia, J. (2017). Epileptic seizure detection in EEGs signals based on the weighted visibility graph entropy. Seizure, 50, 202–208.

22. Kloppel, S., Stonnington, C. M., Chu, C., Draganski, B., Scahill, R. I., Rohrer, J. D., et al. (2008). Automatic classification of MR scans in Alzheimer's disease. Brain, 131(Pt 3), 681–689.

23. Cavallo, F., Moschetti, A., Esposito, D., Maremmani, C., & Rovini, E. (2019). Upper limb motor pre-clinical assessment in Parkinson's disease using machine learning. Parkinsonism & Related Disorders.

24. Zhang, Y., Yu, J., Xia, C., Yang, K., Cao, H., & Wu, Q. (2019). Research on GA-SVM based head-motion classification via mechanomyography feature analysis. Sensors (Basel).

25. Jager, M., Mayer, C., Hefter, H., Siebler, M., & Kecskemethy, A. (2018). Big data and deep learning in preventive and rehabilitation medicine. Orthopade, 47(10), 826–833.

26. Barry, D. T. (2018). Adaptation, artificial intelligence, and physical medicine and rehabilitation. Physical Medicine and Rehabilitation, 10(9S2), S131–S143.

27. Gibert, K., Garcia-Rudolph, A., Garcia-Molina, A., Roig-Rovira, T., Bernabeu, M., & Tormos, J. M. (2008). Response to traumatic brain injury neurorehabilitation through an artificial intelligence and statistics hybrid knowledge discovery from databases methodology. Medical Archives, 62(3), 132–135.

28. Broetz, D., Del Grosso, N. A., Rea, M., Ramos-Murguialday, A., Soekadar, S. R., & Birbaumer, N. (2014). A new hand assessment instrument for severely affected stroke patients. NeuroRehabilitation, 34(3), 409–427.

29. Hsieh, Y. W., Lin, K. C., Wu, C. Y., Lien, H. Y., Chen, J. L., Chen, C. C., et al. (2014). Predicting clinically significant changes in motor and functional outcomes after robot-assisted stroke rehabilitation. Archives of Physical Medicine and Rehabilitation, 95(2), 316–321.

30. Stansfield, B., Rooney, S., Brown, L., Kay, M., Spoettl, L., & Shanmugam, S. (2018). Distal upper limb kinematics during functional everyday tasks. Gait & Posture, 61, 135–140.

31. Placidi, G., Cinque, L., Polsinelli, M., & Spezialetti, M. (2018). Measurements by a leap-based virtual glove for the hand rehabilitation. Sensors (Basel).

32. Johansson, G. M., Grip, H., Levin, M. F., & Hager, C. K. (2017). The added value of kinematic evaluation of the timed finger-to-nose test in persons post-stroke. Journal of NeuroEngineering and Rehabilitation, 14(1), 11.

33. Lee, Y. Y., & Woods, J. W. (1995). Motion vector quantization for video coding. IEEE Transactions on Image Processing, 4(3), 378–382.

34. Lee, J. (1998). Joint optimization of block size and quantization for quadtree-based motion estimation. IEEE Transactions on Image Processing, 7(6), 909–912.

35. Zito, T., Wilbert, N., Wiskott, L., & Berkes, P. (2008). Modular toolkit for data processing (MDP): A Python data processing framework. Frontiers in Neuroinformatics, 2, 8.

36. Liu, T., & Liang, Z. Q. (2017). Lightweight biometric sensing for walker classification using narrowband RF links. Sensors (Basel).

    Article  Google Scholar 

  37. 37.

    Nizamis, K., Rijken, N. H. M., Mendes, A., Janssen, M., Bergsma, A., & Koopman, B. (2018). A novel setup and protocol to measure the range of motion of the wrist and the hand. Sensors (Basel).

    Article  Google Scholar 

  38. 38.

    Guna, J., Jakus, G., Pogacnik, M., Tomazic, S., & Sodnik, J. (2014). An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking. Sensors (Basel),14(2), 3702–3720.

    Article  Google Scholar 

  39. 39.

    Sun, R., Huang, Q., Xia, M., & Zhang, J. (2018). Video-based person re-identification by an end-to-end learning architecture with hybrid deep appearance-temporal feature. Sensors (Basel).

    Article  Google Scholar 

  40. 40.

    Riad, J., Coleman, S., Lundh, D., & Brostrom, E. (2011). Arm posture score and arm movement during walking: A comprehensive assessment in spastic hemiplegic cerebral palsy. Gait Posture,33(1), 48–53.

    Article  Google Scholar 



Acknowledgements

We thank the staff of the Department of Rehabilitation, Tokyo Dental College Ichikawa General Hospital, for their contributions. We especially thank Sho Yoshimura for the operational approval to conduct the study, and Editage for English language editing. This study was supported by a grant-in-aid from Saitama Prefectural University (2015–2017) and technical support from Takei Scientific Instruments Co., Ltd.

Author information



Corresponding author

Correspondence to Toyohiro Hamaguchi.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Hamaguchi, T., Saito, T., Suzuki, M. et al. Support Vector Machine-Based Classifier for the Assessment of Finger Movement of Stroke Patients Undergoing Rehabilitation. J. Med. Biol. Eng. 40, 91–100 (2020).



Keywords

  • Hemiparesis
  • Finger movement
  • Rehabilitation
  • Stroke