
Journal of Medical Systems, 41:10

A Serious Games Platform for Cognitive Rehabilitation with Preliminary Evaluation

  • Paula Alexandra Rego
  • Rui Rocha
  • Brígida Mónica Faria
  • Luís Paulo Reis
  • Pedro Miguel Moreira
Patient Facing Systems
Part of the following topical collections:
  1. Health Information Systems & Technologies

Abstract

In recent years, Serious Games have evolved substantially and are being applied to solve problems in diverse areas. In Cognitive Rehabilitation in particular, Serious Games play a relevant role. Traditional cognitive therapies are often considered repetitive and discouraging for patients, and Serious Games can be used to create more dynamic rehabilitation processes, holding patients’ attention throughout the process and motivating them on their road to recovery. This paper reviews Serious Games and user interfaces in the rehabilitation area and details a Serious Games platform for Cognitive Rehabilitation that includes a set of features such as natural and multimodal user interfaces and social features (competition, collaboration, and handicapping), which can help increase patients’ motivation during the rehabilitation process. The web platform was tested with healthy subjects. Results of this preliminary evaluation show the participants’ motivation and interest in playing the games.
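
The handicapping feature is only described at a high level in the abstract. Purely as an illustration of how such score balancing between players of different ability might be realised (an assumption, not the paper's implementation; the names PlayerProfile and applyHandicap are hypothetical), a minimal TypeScript sketch could look like this:

    // Hypothetical sketch: balance a competitive cognitive game between two
    // players of different ability by scaling the stronger player's raw score.
    interface PlayerProfile {
      id: string;
      skillLevel: number; // assumed to be estimated from past sessions, in [0, 1]
    }

    function applyHandicap(rawScore: number, player: PlayerProfile, opponent: PlayerProfile): number {
      // A positive gap means this player is the stronger one and gets scaled down.
      const gap = player.skillLevel - opponent.skillLevel;
      const factor = 1 - Math.max(0, gap) * 0.5; // illustrative weighting only
      return Math.round(rawScore * factor);
    }

    // Usage: a therapist-configured session would compare adjusted scores instead of raw ones.
    const strongPlayer: PlayerProfile = { id: "p1", skillLevel: 0.8 };
    const weakPlayer: PlayerProfile = { id: "p2", skillLevel: 0.4 };
    console.log(applyHandicap(120, strongPlayer, weakPlayer)); // 96: stronger player's score is reduced
    console.log(applyHandicap(80, weakPlayer, strongPlayer));  // 80: weaker player's score is unchanged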

Keywords

Serious games · Rehabilitation · Cognitive rehabilitation · Natural user interfaces · Games · Health informatics

Acknowledgments

This work has been supported by FCT - Fundação para a Ciência e Tecnologia in the scope of the projects PEst-UID/CEC/00319/2015 and PEst-UID/CEC/00027/2015. The authors would also like to thank all the volunteers who participated in the study.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. ESTG/IPVC - Escola Superior de Tecnologia e Gestão, Instituto Politécnico de Viana do Castelo, Viana do Castelo, Portugal
  2. LIACC - Laboratório de Inteligência Artificial e Ciência de Computadores, Porto, Portugal
  3. DSI/EEUM - Departamento de Sistemas de Informação, Escola de Engenharia da Universidade do Minho, Guimarães, Portugal
  4. ESS/PP - Escola Superior de Saúde, Politécnico do Porto, Porto, Portugal
  5. INESC-TEC - Instituto de Engenharia de Sistemas e Computadores, Tecnologia e Ciência, Porto, Portugal
  6. Centro ALGORITMI, Universidade do Minho, Guimarães, Portugal