Abstract
Numerous scientific studies have measured cognitive abilities through administered tests. Most of these traditional tests are inappropriate for several reasons, such as cost-ineffectiveness, tiresomeness, and invasiveness. There is therefore a need for cost-effective, exciting, and plausibly engaging techniques that can non-invasively measure cognitive abilities. This paper presents LAS, an intelligent technique that utilizes non-invasively collected game analytics to automatically evaluate a cluster of three cognitive abilities (i.e., Visual Long-term Memory (VLTM), Analytical Capability (AC), and Visual Short-term Memory in the Change Detection Paradigm (VSTMiCDP)). An experimental group-based cross-generational cognitive evaluation in the game-based scenario established that the potential of the proposed technique is twofold: 1) it successfully categorizes the cluster of three targeted cognitive abilities within the 5-point evaluation sphere (i.e., 0–1 = very bad, 1–2 = bad, 2–3 = fair, 3–4 = good, 4–5 = excellent), and 2) it correlates highly with the results of a control group measured with three traditional cognitive tests.
1 Introduction
Significant research interest has emerged in investigating the positive influence of video games on cognitive abilities [25, 26, 28, 43, 55, 61, 69, 81, 87]. In this regard, most of the literature investigated whether video game players (VGPs) outperform non-video game players (NVGPs) in terms of contrast sensitivity [52], temporal processing [24], change detection [14, 27], selective visual attention [9, 45], multiple objects apprehending [39], periphery and central vision [40], spatial resolution [41], attention skills [29], cognitive flexibility [17, 38], executive control [80], multiple object tracking, cognitive control [67], encoding speed [86], processing speed, visual short-term memory [62], general speeding of perceptual reaction [30], working memory [11, 18, 82], and even probabilistic inference [42]. Another significant body of literature considered video games as a training instrument for the improvement of various cognitive abilities, such as executive control [5], task switching, multiple object tracking [23], backward masking [53], visual skills [1], selective visual attention [70], visualization [54], attention [64], executive functioning [64, 68], cognitive functioning [13, 59], intellectual ability [10], navigation skills [19], physical functioning [59], and even contrast sensitivity [52]. The remaining literature considered video games as a therapeutic tool for cerebral palsy [58], cognitive impairment [65], short-term memory [60], and schizophrenia [20].
To measure the targeted cognitive abilities in the literature mentioned above, various observational [5, 39, 59, 82, 86], paper-pencil-based [5, 42, 59, 82], and computer-aided [1, 5, 9, 14, 18,19,20, 23, 24, 27, 29, 30, 39, 40, 45, 53, 59, 62, 67, 68, 70, 80, 82, 86] tests were administered (see Section 2). Most of these traditional tests are inappropriate for the following reasons. First and foremost, they can only be administered by a psychologist in a clinical setting, which makes the cognitive assessment process cost-ineffective due to the assessment fee, tiresome due to its boring and repetitive nature, and invasive due to its subjective nature for the subjects. Second, they do not consider time as an important factor in estimating their respective cognitive abilities, which makes them incapable of tracking minor changes over frequent assessments [51]. Therefore, there is a present need for cost-effective, exciting, and plausibly engaging techniques that can non-invasively measure cognitive abilities.
Since video games have now become a popular medium of leisure among all generations [4, 7, 31, 46, 50, 63, 71,72,73, 75, 83], this paper performs construct and concurrent validation of the following hypothesis: H: A game-based intelligent technique can substitute traditional cognitive assessment tests to non-invasively measure cognitive abilities. Accordingly, this article presents LAS, an intelligent technique that utilizes non-invasively collected game analytics to automatically evaluate a cluster of three cognitive abilities (i.e., VLTM, AC, and VSTMiCDP). A major motivation behind the selection of this cluster of cognitive abilities was its importance in human cognition [22, 33, 77, 78]. LAS employs BrainStorm as its first valuable component for the non-invasive collection of game analytics, which include accuracy (i.e., the number of correct and incorrect attempts) and efficiency (i.e., the time of each correct and incorrect attempt) [2, 3, 34, 35]. BrainStorm is a cross-generational game suite that consists of three brain games (BGs) (i.e., Picture Puzzle, Letter and Number, and Find the Difference), each dedicatedly designed in relation to the theoretical operationalization of a particular cognitive ability (i.e., VLTM, AC, and VSTMiCDP, respectively). The second major component of LAS is its kernel, which consists of three statistically driven models that interpret the nominative game analytics in a meaningful manner for evaluating the cluster of three targeted cognitive abilities. In the construct validation of LAS, an experimental group-based leave-one-out cross-validation (LOOCV) successfully categorized the targeted cognitive abilities of the cross-generational participants within the 5-point evaluation sphere (i.e., 0–1 = very bad, 1–2 = bad, 2–3 = fair, 3–4 = good, 4–5 = excellent). Furthermore, in concurrent validation, a control group-based assessment was highly correlated with the experimental group-based cognitive evaluation results.
The remaining paper is organized as follows. Section 2 discusses the existing literature. Section 3 simultaneously explains both the components of LAS that include BrainStorm and its statistically driven models for the evaluation of each targeted cognitive ability (i.e., VLTM, AC, and VSTMiCDP). Section 4 states the research methodology including complete detail about the quantitative studies, participants’ recruitment, experimentation, data collection, and analysis. Section 5 demonstrates the results. Section 6 provides conclusive remarks.
2 Literature limitations and gaps
The domain under discussion is rich in terms of literature; however, most of the available literature is based on traditional test studies. In this section, the most relevant literature is incorporated to understand the area and background of the problem. This section is further divided into two paragraphs. The first paragraph encompasses the limitations and impracticality of traditional test studies, whereas the second paragraph highlights the lack of validation in game-based test studies.
As demonstrated in Table 1, scientists employed multiple observational, paper-pencil-based, and computer-aided tests for the assessment of a single cognitive ability. For example, reading span, paper unfolding, and figure reconstruction tests were administered in a clinical setting to assess visual working memory [82]. Simultaneously, as also demonstrated in Table 1, a single traditional test was utilized to assess different cognitive abilities. For example, the N-back task was administered in a clinical setting to assess processing speed, visual short-term memory [62], working memory [18, 20], general speeding of perceptual reaction [30], executive control [5], task switching, and multiple object tracking [23]. The key reason for utilizing multiple and overlapping traditional tests to assess a single cognitive ability is the partial and overlapping association between the theoretical operationalization of cognitive abilities and the traditional tests. Given the inability of a single traditional test to comprehensively assess a targeted cognitive ability, as well as the infeasibility of administering multiple traditional tests due to their cost-ineffective, tiresome, and invasive nature, there was a need to build cost-effective, exciting, and plausibly engaging techniques that can non-invasively measure cognitive abilities.
The game-based test studies partially follow the footsteps of traditional test studies, where the theoretical operationalization of the targeted cognitive ability is considered as a test bed for the selection of suitable test(s); they translate the theoretical operationalization of the targeted cognitive ability into their game design and aesthetics to offer a cost-effective, exciting, and plausibly engaging solution that can operate non-invasively [8, 12, 21, 37, 47,48,49, 84]. Like [3], these studies recorded game analytics that primarily include task completion time, accuracy, and count to project the performance of a subject against a particular cognitive ability. It is worth mentioning that none of these studies proposed a technique to interpret the nominative game analytics comprehensively. Unlike the presented research, these studies lack both construct validation (i.e., categorizing the targeted cognitive ability within a certain evaluation sphere) and concurrent validation (i.e., reporting a relationship between the experimental and control group-based results to potentially substitute the traditional tests).
3 LAS
The proposed intelligent technique, named LAS (see Fig. 1), employs BrainStorm as its first valuable component for the non-invasive collection of game analytics [2, 3, 34, 35]. BrainStorm is a cross-generational game suite that consists of three BGs (i.e., Picture Puzzle, Letter and Number, and Find the Difference) (see Fig. 2) dedicatedly designed in relation to the theoretical operationalization of each cognitive ability (i.e., respectively as VLTM, AC, and VSTMiCDP). The second major component of LAS is its kernel which consists of three statistically driven models (see Fig. 3) to interpret the nominative game analytics in a meaningful manner for evaluating the cluster of three targeted cognitive abilities.
Both components of LAS are simultaneously demonstrated and explained for each of the targeted cognitive abilities in the following subsections. Each subsection explains the theoretical operationalization of a particular cognitive ability. Subsequently, it narrates the corresponding game scenario as well as its alignment with the preceding operationalization and the collected game analytics. Finally, it utilizes the equations that are comprehensively described in Table 2 to propose a model. For more clarity, the complete formation of each proposed model is also demonstrated in Fig. 3.
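All three proposed models share one downstream step: mapping a model score onto the 5-point evaluation sphere defined above. As an illustration only, a minimal Python sketch of this mapping (the band boundaries are taken from the sphere definition; the clamping behavior at the edges is an assumption) might look like the following:

```python
def categorize(score: float) -> str:
    """Map a model score onto the paper's 5-point evaluation sphere:
    0-1 = very bad, 1-2 = bad, 2-3 = fair, 3-4 = good, 4-5 = excellent."""
    bands = [(1.0, "very bad"), (2.0, "bad"), (3.0, "fair"),
             (4.0, "good"), (5.0, "excellent")]
    score = max(0.0, min(score, 5.0))  # confine to the sphere (assumed)
    for upper_bound, label in bands:
        if score <= upper_bound:
            return label
    return "excellent"

print(categorize(2.62))  # a mean VLTM score in the fair region
```

For example, the estimated mean VLTM level of children reported later (2.62) falls in the fair band, while their mean VSTMiCDP level (1.86) falls in the bad band.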
3.1 VLTM
3.1.1 Theoretical operationalization
VLTM is a two-step process that executes sequentially when visual objects are presented to the cognitive system, and both of its outputs may provide the basis on which objects get stored and retrieved. In the first step, signals from the retina are analyzed to extract visual features such as orientation, colors, and so forth. These assist in creating a detailed representation of independent features that are closely related to the physical properties of the visual objects. In the second step, the representation of independent features is integrated into the representation of coherent objects, which leads to the phenomenal experience of a visual scene that is segregated into coherent objects [74, 79]. One possible method of evaluating an individual's VLTM performance is to look into the adequacy of retrieving the stored information (i.e., accuracy) in relation to retrieval time (i.e., efficiency) against the data received through visual stimuli. Thus, the Picture Puzzle BG was employed to non-invasively collect the nominative analytics for VLTM evaluation.
3.1.2 Game scenario
The Picture Puzzle BG has 15 images of renowned places and personalities that appear in a sequence. Each time, the player must select the name correctly from the provided choices (see Fig. 2a). The core mechanism of this BG compels the player’s attention to receive the data from their visual source and pass it to the active memory. Active memory then proceeds and fetches its precise information from declarative long-term memory, established previously in the memorization session.
A clue cards-based memorization session, ranging from 5 to 10 minutes, was organized for each participant prior to the gaming activity to learn the names of distinguished places and personalities that could be asked in the Picture Puzzle BG; however, it was noticed that, on average, only 10% of the places and personalities were already known to the participants.
3.1.3 Proposed model
The above-described theoretical operationalization of VLTM in relation to Picture Puzzle analytics is formed by utilizing the equations (see Table 2) into a proposed model as follows (see Fig. 3).
Here, c1 = 0.91 is an adjustable constant, defined after trials, that confines the score of MVLTM to the 5-point evaluation sphere.
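The exact MVLTM equations are defined in Table 2 and Fig. 3 (not reproduced here). As an illustration only, a minimal Python sketch of one plausible accuracy-times-efficiency formulation is given below; the per-attempt time budget T_LIMIT and the specific efficiency weighting are assumptions for the sketch, not the paper's definitions:

```python
# Illustrative sketch only: the exact M_VLTM model lives in Table 2 / Fig. 3.
C1 = 0.91        # adjustable constant from Section 3.1.3
N_TOTAL = 15     # Picture Puzzle presents a fixed total of 15 images
T_LIMIT = 20.0   # assumed per-attempt time budget in seconds (hypothetical)

def m_vltm(n_correct: int, attempt_times: list) -> float:
    """Hypothetical VLTM score: retrieval accuracy weighted by a
    mean-time efficiency factor, confined to the 5-point sphere by C1."""
    accuracy = n_correct / N_TOTAL
    mean_time = sum(attempt_times) / len(attempt_times)
    efficiency = max(0.0, 1.0 - mean_time / T_LIMIT)   # faster -> closer to 1
    return min(5.0, C1 * 5.0 * accuracy * (0.5 + 0.5 * efficiency))
```

Under this sketch, a player who answers all 15 items correctly and instantly would score C1 × 5 = 4.55, which is why an adjustable constant is needed to confine scores to the sphere.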
3.2 AC
3.2.1 Theoretical operationalization
AC is a four-step process that sequentially includes visualization, information gathering, articulation, and analysis to solve complex or uncomplicated problems by rational decision-making based on one’s understanding [44]. A basic insight about an individual’s AC can be deduced by looking into any of its sub-domain’s problem-solving accuracy in relation to time (i.e., efficiency). Thus, Letter and Number BG was employed to non-invasively collect the nominative analytics for AC evaluation.
3.2.2 Game scenario
The Letter and Number BG has 10 incomplete series of numbers or letters that appear in a sequence. Each time, the player must identify its pattern to complete the series by selecting the correct letter or number from the provided options (see Fig. 2b). The core mechanism of this BG compels the player to sequentially execute information visualization, articulation, analysis, and decision-making based on their perception.
3.2.3 Proposed model
The above-described theoretical operationalization of AC in relation to Letter and Number analytics is formed by utilizing the equations (see Table 2) into a proposed model as follows (see Fig. 3).
Like c1, c2 = 0.83 is an adjustable constant, defined after trials, that confines the score of MAC to the 5-point evaluation sphere.
3.3 VSTMiCDP
3.3.1 Theoretical operationalization
VSTMiCDP refers to the cognitive ability that stores visual information for a few seconds so that it can be used to compare a difference between the memory and the test array. VSTMiCDP plays an important role in maintaining continuity across visual interruptions, such as eye movement and blinking [56]. One of the possible methods to evaluate the performance of VSTMiCDP of an individual is by looking into their visual search accuracy between the two similar images (i.e., for change detection) in relation to time (i.e., efficiency). Thus, Find the Difference BG was employed to non-invasively collect the nominative analytics for VSTMiCDP evaluation.
3.3.2 Game scenario
The Find the Difference BG has 3 different pairs of similar images of renowned places that appear in a sequence. Each time, the player must discover exactly 6 differences between each pair of images, displayed one after another (see Fig. 2c). The core mechanism of this BG compels the player to hold visual information of the sample image (i.e., displayed for a few seconds on one side of the screen) in its active memory to match with the test image (i.e., later displayed on the other side of the screen) for identifying differences between the sample frame and the test frame.
3.3.3 Proposed model
The above-described theoretical operationalization of VSTMiCDP in relation to Find the Difference analytics is formed by utilizing the equations (see Table 2) into a proposed model as follows (see Fig. 3).
Unlike MVLTM and MAC, the addition of R_player_incorrect_attempt in MVSTMiCDP is due to the variability of N_player_incorrect_attempt in the Find the Difference BG. This can be explained as follows: in the first two proposed models, the value of N_player_total_attempt remains constant (i.e., 15 and 10, respectively); thus, with knowledge of N_player_correct_attempt, N_player_incorrect_attempt was deducible. Conversely, in MVSTMiCDP, N_player_incorrect_attempt is variable, which also makes N_player_total_attempt a variable entity. Thus, to fully monitor performance, it is equally important to explicitly consider N_player_incorrect_attempt along with N_player_correct_attempt.
Moreover, like c1 and c2, c3 = 0.65 is an adjustable constant, defined after trials, that confines the score of MVSTMiCDP to the 5-point evaluation sphere.
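The exact MVSTMiCDP equations are likewise defined in Table 2 and Fig. 3. As an illustration only, a minimal Python sketch of one plausible formulation is given below, showing why the incorrect-attempt rate must appear explicitly when the total number of attempts is variable; the time budget and the specific combination of terms are assumptions for the sketch, not the paper's definitions:

```python
# Illustrative sketch only: the exact M_VSTMiCDP model lives in Table 2 / Fig. 3.
C3 = 0.65            # adjustable constant from Section 3.3.3
N_DIFFERENCES = 18   # 3 image pairs x 6 differences each

def m_vstmicdp(n_correct: int, n_incorrect: int, total_time: float,
               time_budget: float = 180.0) -> float:
    """Hypothetical VSTMiCDP score. Unlike M_VLTM and M_AC, the incorrect
    attempts enter explicitly because total attempts vary per player."""
    n_total = n_correct + n_incorrect                 # variable, not fixed
    hit_rate = n_correct / N_DIFFERENCES
    error_rate = n_incorrect / n_total if n_total else 0.0
    efficiency = max(0.0, 1.0 - total_time / time_budget)
    raw = hit_rate * (1.0 - error_rate) * (0.5 + 0.5 * efficiency)
    return min(5.0, C3 * 5.0 * raw)
```

Here a player who finds all 18 differences with no incorrect taps scores at most C3 × 5 = 3.25, while every wrong tap both inflates the attempt total and lowers the score, which mirrors the model's need to track N_player_incorrect_attempt explicitly.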
4 Research methodology
4.1 Research design
The empirical research consists of three independent three-fold quantitative studies (QSs: QS1, QS2, and QS3). Each QS was undertaken by a distinct group of participants who simultaneously acted as members of an experimental group (i.e., in the first and second folds) as well as a control group (i.e., in the third fold). In the first fold, each participant went through the memorization phase for the Picture Puzzle BG. Later, in the second fold, each participant played all three BGs of the BrainStorm game suite in single-player gaming mode. Finally, in the third fold, traditional tests were administered to each participant to measure the cluster of their three targeted cognitive abilities. These tests include the names-faces subtest of the memory assessment test (MAT) [85], the numerical sequences subtest of the differential aptitude test (DAT) [6], and spot the difference in cognitive decline (SDCD) [66], respectively, for VLTM, AC, and VSTMiCDP.
4.2 Participants recruitment
To dynamically examine the potential applicability of LAS, participants of three different age groups were recruited (see Table 3). Specifically, 23 children (13 male and 10 female, aged 10 to 15 years) were recruited for QS1, 20 younger adults (14 male and 6 female, aged 22 to 27 years) for QS2, and 20 older adults (7 male and 13 female, aged 40 to 45 years) for QS3, all on a volunteer basis. The recruitment process was carefully carried out based on the criterion of sufficient gaming experience (i.e., habitual gameplay once a week or more). The reason behind recruiting volunteers with sufficient gaming experience was to avoid the learning curve and its consequences on gaming performance during the QSs. An agreement on non-invasive game analytics collection was signed by the parents of the children, as well as by the younger and older adults themselves, at the time of recruitment.
4.3 Experimental and control settings
Every QS intervention lasted 1 day for 8 hours, comprising experimental and control settings. First, in the experimental setting, computer-aided BG play activities were carried out in parallel under the observation of psychologists in a quiet room. On average, each participant took 20 minutes to complete their one-time gaming activity. 19.5-inch touch screens were utilized for BG play so that each participant could interact with the BGs with better visibility. The touch screens were fixed horizontally on the table so that participants could use them comfortably, like tablets. Later, in the control setting, the three traditional tests were administered side by side by the psychologists in a quiet room. On average, each participant took 35 minutes to complete their one-time traditional test assessment. The first traditional test, MAT, requires the participant to learn the names of individuals portrayed in photographs. Following the delayed recall session of the learning trials, the participant is presented with photographs and asked to recognize the correct name from a brief list of alternatives. The second traditional test, DAT, requires the participant to identify the number sequence and solve the problem by identifying the missing numbers from the sequence. The third traditional test, SDCD, requires the participant to memorize the details of the first picture for 30 seconds, after which the first picture is taken away and the second picture is shown. The participant is then asked to identify as many differences as possible between the first and second pictures, which were presented sequentially. To prevent the risk of bias and the potential influence of any unknown variable in the QSs, participants who had completed the activity were not allowed to interact with participants who were waiting to start the activity.
4.4 Data collection and analysis
The game analytics of each participant are noninvasively recorded and compiled separately for each BG activity to train and test the statistically driven models during the experimental group-based cross-generational evaluation. Similarly, the control group-based assessment results of each participant are compiled separately for their three targeted cognitive abilities.
Statistical analysis is performed on the experimental and control group-based results compiled for each targeted cognitive ability to determine trends within the QSs. During the initial phase of the statistical analysis, the mean, standard deviation, and standard error are calculated. In the second phase, the Pearson correlation [15] is applied to the experimental and control group-based results of each targeted cognitive ability in every QS to determine their degree of correlation, where the correlation coefficient r indicates the direction and effect size of the correlation. According to [15, 16], the effect size is low if r lies between ±0.1 and ±0.3, medium if it lies between ±0.3 and ±0.5, and large if it lies between ±0.5 and ±1.0. Furthermore, the p-value is calculated in this phase to demonstrate the significance of the findings [15]. It is well established that correlation does not imply causation, yet this method has been used across a wide range of literature, including the research study StudentLife [76]. In a real-world situation, it is almost impossible to discover the element(s) that have a causal relationship with another element, as there always exist unknown factor(s) that alter the causality between related elements. Hence, the motivation behind employing the correlation technique is not to discover a causal relationship but to comprehend the influence of one element in relation to the other(s), while conceding that the influence is not causal.
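The second phase of the analysis can be sketched directly from the definitions above. The following minimal Python sketch computes the Pearson correlation coefficient and classifies its effect size using the thresholds of [15, 16]; the sample data are hypothetical, not the study's results:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

def effect_size(r):
    """Classify |r| using the thresholds of Cohen [15, 16]."""
    r = abs(r)
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    if r >= 0.1:
        return "low"
    return "negligible"

# Hypothetical scores for one cognitive ability (not the study's data):
game_scores = [2.1, 3.4, 1.8, 4.0, 2.9]   # experimental group (LAS)
test_scores = [2.0, 3.6, 1.5, 3.8, 3.1]   # control group (traditional test)
r = pearson_r(game_scores, test_scores)
print(r, effect_size(r))
```

For instance, the correlations reported later in Section 5 (all r ≥ 0.71) fall in the large effect-size band under this classification.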
5 Results
A complete summary of the experimental group-based evaluation results of each QS is demonstrated in Table 4 and Fig. 4a-c. The column charts in Fig. 4a-c demonstrate the potential of the proposed models to successfully categorize the cluster of three targeted cognitive abilities of the cross-generational participants within the 5-point evaluation sphere. The line charts in Fig. 4a-c, in turn, demonstrate the potential of the proposed models to successfully estimate the mean level of the targeted cognitive abilities of the three age groups. These delineate the estimated mean level of younger adults' targeted cognitive abilities as higher than that of children (i.e., VLTM: 16.2%, AC: 6.2%, and VSTMiCDP: 10.6%) as well as that of older adults (i.e., VLTM: 10.8%, AC: 5.4%, and VSTMiCDP: 2.2%). Furthermore, to estimate the significance of the experimental group-based evaluation in relation to the control group-based assessment, the results of the statistical analysis are given in Table 4.
The experimental group-based evaluation results of QS1 illustrate the estimated mean level of VLTM (\( \mu = 2.62 \), \( \sigma = 0.91 \), and \( \sigma_{\overline{x}} = 0.19 \)) (i.e., above 60.9% and below 39.1% of the children’s evaluation scores), AC (\( \mu = 2.72 \), \( \sigma = 0.79 \), and \( \sigma_{\overline{x}} = 0.17 \)) (i.e., above 47.8% and below 52.2% of the children’s evaluation scores), and VSTMiCDP (\( \mu = 1.86 \), \( \sigma = 1.19 \), and \( \sigma_{\overline{x}} = 0.25 \)) (i.e., above 60.9% and below 39.1% of the children’s evaluation scores) of the children (see Table 4 and Fig. 4a). This specifies that the estimated mean levels of children’s VLTM and AC fall within the fair region of the 5-point evaluation sphere, whereas their VSTMiCDP falls in the upper bad region of the 5-point evaluation sphere. Statistical analysis is performed to estimate the significance of the experimental group-based cognitive evaluation in relation to the control group-based assessment. This revealed a significant correlation, with a large effect size, between the experimental and control group-based results of VLTM (r = 0.81, p < 0.001), AC (r = 0.76, p < 0.001), and VSTMiCDP (r = 0.84, p < 0.001).
The experimental group-based evaluation results of QS2 illustrate the estimated mean level of VLTM (\( \mu = 3.43 \), \( \sigma = 0.71 \), and \( \sigma_{\overline{x}} = 0.16 \)) (i.e., above 45.0% and below 55.0% of the younger adults’ evaluation scores), AC (\( \mu = 3.03 \), \( \sigma = 0.78 \), and \( \sigma_{\overline{x}} = 0.18 \)) (i.e., 50.0% above as well as below the younger adults’ evaluation scores), and VSTMiCDP (\( \mu = 2.39 \), \( \sigma = 1.35 \), and \( \sigma_{\overline{x}} = 0.30 \)) (i.e., above 65.0% and below 35.0% of the younger adults’ evaluation scores) of the younger adults (see Table 4 and Fig. 4b). This specifies that the estimated mean levels of younger adults’ VLTM and AC fall within the good region of the 5-point evaluation sphere, whereas their VSTMiCDP falls in the middle fair region of the 5-point evaluation sphere. Statistical analysis is performed to estimate the significance of the experimental group-based cognitive evaluation in relation to the control group-based assessment. This revealed a significant correlation, with a large effect size, between the experimental and control group-based results of VLTM (r = 0.78, p < 0.001), AC (r = 0.72, p < 0.001), and VSTMiCDP (r = 0.86, p < 0.001).
The experimental group-based evaluation results of QS3 illustrate the estimated mean level of VLTM (\( \mu = 2.89 \), \( \sigma = 0.95 \), and \( \sigma_{\overline{x}} = 0.21 \)) (i.e., above 60.0% and below 40.0% of the older adults’ evaluation scores), AC (\( \mu = 2.76 \), \( \sigma = 0.89 \), and \( \sigma_{\overline{x}} = 0.20 \)) (i.e., 50.0% above as well as below the older adults’ evaluation scores), and VSTMiCDP (\( \mu = 2.28 \), \( \sigma = 1.24 \), and \( \sigma_{\overline{x}} = 0.28 \)) (i.e., above 65.0% and below 35.0% of the older adults’ evaluation scores) of the older adults (see Table 4 and Fig. 4c). This specifies that the estimated mean levels of older adults’ targeted cognitive abilities fall within the fair region of the 5-point evaluation sphere. Statistical analysis is performed to estimate the significance of the experimental group-based cognitive evaluation in relation to the control group-based assessment. This revealed a significant correlation, with a large effect size, between the experimental and control group-based results of VLTM (r = 0.85, p < 0.001), AC (r = 0.78, p < 0.001), and VSTMiCDP (r = 0.71, p < 0.001).
6 Conclusion
The feasibility of substituting traditional cognitive assessment with a game-based approach was previously unclear. In this regard, this paper introduced LAS, an intelligent technique that utilizes non-invasively collected game analytics to automatically evaluate a cluster of three cognitive abilities (i.e., VLTM, AC, and VSTMiCDP). LOOCV is employed to demonstrate the construct validity of LAS, which successfully categorizes the targeted cognitive abilities within the 5-point evaluation sphere. Furthermore, the correlation technique is employed to demonstrate the concurrent validity of LAS, which reports a significant relationship between the experimental and control group-based results that could potentially substitute the appointed administered tests.
It is observed that the estimated mean level of children’s VSTMiCDP falls in the upper bad region of the 5-point evaluation sphere. This finding was alarming until the estimated mean levels of younger and older adults’ VSTMiCDP were found to lie within the fair region of the 5-point evaluation sphere. Therefore, it is assumed that, in contrast to the other targeted cognitive abilities, VSTMiCDP takes relatively longer to excel among most children, and by the time a child matures into a young adult, this ability improves naturally. However, an independent investigation could be conducted to further examine the validity of this assumption.
The design principle of various commercially available BG solutions, such as Elevate [32], Lumosity [57], and Fit Brains [36], is somewhat similar to the theoretical operationalization approach of BrainStorm; however, unlike the demonstrated potential of LAS to independently evaluate a cluster of cognitive abilities, these products only provide a training facility to periodically improve and monitor the performance of weaker brain aspects such as memory, brevity, speed, problem-solving, attention, and even flexibility. This limitation of commercially available BG solutions makes LAS a first-of-its-kind technique.
Data Availability
The datasets generated and analyzed during the current study are not publicly available due to privacy reasons but are available from the corresponding author on reasonable request.
References
Achtman RL et al (2008) Video games as a tool to train visual skills. Restor Neurol Neurosci 26(4):435–446
Ahmad F, Luo Z, Ahmed Z, Muneeb S (2020) Behavioral profiling: a generationwide study of players' experiences during brain games play, Interact Learn Environ https://doi.org/10.1080/10494820.2020.1827440
Ahmad F, Ahmed Z, Muneeb S (2021) Effect of Gaming Mode Upon the Players’ Cognitive Performance During Brain Games Play: An Exploratory Research. Int J Game-Based Learn (IJGBL) 11(1):67–76. https://doi.org/10.4018/IJGBL.2021010105
Aison C et al (2002) Appeal and interest of video game use among the elderly. The Harvard Graduate School of Education
Basak C, Boot WR, Voss MW, Kramer AF (2008) Can training in a real-time strategy video game attenuate cognitive decline in older adults? Psychol Aging 23(4):765–777
Bennett GK, Seashore HG, Wesman AG (1947) Differential aptitude tests
Brown DJ et al (2009) Game on: accessible serious games for offenders and those at risk of offending. J Assist Technol 3(2):13–25
Byun S, Park C (2011) Serious game for cognitive testing of elderly. In international conference on human-computer interaction (pp 354-357). Springer, Berlin, Heidelberg
Castel D et al (2005) The effects of action video game experience on the time course of inhibition of return and the efficiency of visual search. Acta Psychol 119:217–230
Cecilia SL et al (2014) “logical blocks” multimedia game development for students with intellectual disabilities. HCI international posters part II. Commun Comput Inf Sci 435:371–375
Chang C et al (2011) Leisure activities for the elderly–the influence of visual working memory on mahjong and its video game version. HCI international posters part I. Commun Comput Inf Sci 173:358–362
Chen J, Wang G, Zhang K, Wang G, Liu L (2019) A pilot study on evaluating children with autism spectrum disorder using computer games. Comput Human Behavi 90:204–214, ISSN 0747-5632. https://doi.org/10.1016/j.chb.2018.08.057
Chuang T, Chen W (2007) Effect of computer-based video games on children: an experimental study. IEEE Int Workshop Digit Game Intell Toy Enhanced Learn, pp 114–118
Clark K, Fleck MS, Mitroff SR (2011) Enhanced change detection performance reveals improved strategy use in avid action video game players. Acta Psychol 136:67–72
Cohen J (1988) Statistical power analysis for the behavioral sciences. Routledge
Cohen J (1992) A power primer. Psychol Bull 112(1):155–159. https://doi.org/10.1037/0033-2909.112.1.155
Colzato LS et al (2010) DOOM’d to switch: Superior cognitive flexibility in players of first person shooter games. Front Psychol, pp 1–8
Colzato LS, van den Wildenberg WPM, Zmigrod S, Hommel B (2013) Action video gaming and cognitive control: playing first person shooter games is associated with improvement in working memory but not action inhibition. Psychol Res 77(2):234–239
Connors C et al (2014) Virtual environments for the transfer of navigation skills in the blind: a comparison of directed instruction vs. video game based learning approaches. Front Human Neurosci 8:223
Dang J, Zhang J, Guo Z, Lu W, Cai J, Shi Z, Zhang C (2014) A pilot study of iPad-assisted cognitive training for schizophrenia. Arch Psychiatr Nurs 28(3):197–199
de la Guía E, Lozano MD, Penichet VMR (2015) Digital educational games to improve cognitive abilities. Br J Educ Technol 46:664–678. https://doi.org/10.1111/bjet.12165
Diamond A (2013) Executive functions. Annu Rev Psychol 64:135–168. https://doi.org/10.1146/annurev-psych-113011-143750
Dobrowolski P, Hanusz K, Sobczyk B, Skorko M, Wiatrow A (2015) Cognitive enhancement in video game players: the role of video game genre. Comput Hum Behav 44:59–63
Donohue SE, Woldorff MG, Mitroff SR (2010) Video game players show more precise multisensory temporal processing abilities. Atten Percept Psychophys 72(4):1120–1129
Dorval M, Pepin M (1986) Effect of playing a video game on a measure of spatial visualization. Percept Mot Skills 62(1):159–162
Drew B, Waters J (1986) Video games: utilization of a novel strategy to improve perceptual motor skills and cognitive functioning in the noninstitutionalized elderly. Cogn Rehabil 4(2):26–31
Durlach PJ, Kring JP, Bowens LD (2009) Effects of action video game experience on change detection. Mil Psychol 21(1):24–39
Dustman RE et al (1992) The effects of videogame playing on neuropsychological performance of elderly individuals. J Gerontol 47(3):168–171
Dye MWG, Green CS, Bavelier D (2009) The development of attention skills in action video game players. Neuropsychologia 47(8–9):1780–1789
Dye MWG et al (2009a) Increasing speed of processing with action video games. Curr Dir Psychol Sci 18(6):321–326
Eggermont S, Vandebosch H, Steyaert S (2006) Towards the desired future of the elderly and ICT: policy recommendations based on a dialogue with senior citizens. Poiesis Prax 4(3):199–217
ELEVATE: Your personal brain trainer (2016) Retrieved April 19, 2016, from https://www.elevateapp.com/#/
Esteban-Cornejo I, Tejero-Gonzalez CM, Sallis JF, Veiga OL (2015) Physical activity and cognition in adolescents: a systematic review. J Sci Med Sport 18(5):534–539. https://doi.org/10.1016/j.jsams.2014.07.007
Faizan A, Yiqiang C, Shuangquan W, Zhenyu C, Jianfei S, Lisha H, Jindong W (2016) A Study of Players’ Experiences During Brain Games Play. In: Booth R, Zhang ML (eds) PRICAI 2016: Trends in Artificial Intelligence. PRICAI 2016. Lecture notes in computer science, vol 9810. Springer, Cham
Faizan A, Yiqiang C, Lisha H, Shuangquan W, Jindong W, Zhenyu C, Xinlong J, Jianfei S (2017) BrainStorm: a psychosocial game suite design for non-invasive cross-generational cognitive capabilities data collection. J Exp Theor Artif Intell. https://doi.org/10.1080/0952813X.2017.1354079
Fit brains: Rosetta Stone* (2016) Retrieved April 23, 2016, from http://www.fitbrains.com/
Gaggi O, Palazzi CE, Ciman M, Galiazzo G, Franceschini S, Ruffino M, Gori S, Facoetti A (2017) Serious games for early identification of developmental dyslexia. Comput Entertain 15(2), Article 4, 24 pages. https://doi.org/10.1145/2629558
Glass D et al (2013) Real-time strategy game training: emergence of a cognitive flexibility trait. PLoS One 8:e70350
Green S, Bavelier D (2006) Enumeration versus multiple object tracking: the case of action video game players. Cognition 101(1):217–245
Green S, Bavelier D (2006a) Effect of action video games on the spatial distribution of visuospatial attention. J Exp Psychol Hum Percept Perform 32(6):1465–1478
Green S, Bavelier D (2007) Action-video-game experience alters the spatial resolution of vision. Psychol Sci 18(1):88–94
Green CS, Pouget A, Bavelier D (2010) Improved probabilistic inference as a general learning mechanism with action video games. Current Biol CB 20(17):1573–1579
Greenfield PM et al (1996) Action video games and informal education: effects on strategies for dividing visual attention. Interacti Vid 11:187–205
Heuer RJ Jr (1999) Psychology of intelligence analysis. Center for the Study of Intelligence. ISBN 1-929667-00-0
Hubert-Wallander B, Green CS, Bavelier D (2011) Stretching the limits of visual attention: the case of action video games. Wiley Interdiscip Rev Cogn Sci 2:222–230
Ijsselsteijn W et al (2007) Digital game design for elderly users. In: Proceedings of the Conference on Future Play, pp 17–22
Jeon J, Yoon D, Yang S, Kim K (2017) Extracting gamers' cognitive psychological features and improving performance of churn prediction from mobile games, 2017 IEEE Conference on Computational Intelligence and Games (CIG), pp 150–153, https://doi.org/10.1109/CIG.2017.8080428
Joshi V, Wallace B, Shaddy A, Knoefel F, Goubran R, Lord C (2016) Metrics to monitor performance of patients with mild cognitive impairment using computer based games, 2016 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), pp 521–524, https://doi.org/10.1109/BHI.2016.7455949
Khowaja K, Salim SS (2019) Serious game for children with autism to learn vocabulary: an experimental evaluation. Int J Human–Comput Interact 35(1):1–26. https://doi.org/10.1080/10447318.2017.1420006
Koops MC (2008) Digital adventure game-based learning. A research project of Dr. M. C. Koops together with centre for science and mathematics education
Kueider AM et al (2012) Computerized cognitive training with older adults: a systematic review. PLoS One 7(7)
Li R, Polat U, Makous W, Bavelier D (2009) Enhancing the contrast sensitivity function through action video game training. Nat Neurosci 12(5):549–551
Li R, Polat U, Scalzo F, Bavelier D (2010) Reducing backward masking through action game training. J Vis 10(14):1–13
Linehan C et al (2014) Designing games for the rehabilitation of functional vision for children with cerebral visual impairment. In: CHI Extended Abstracts on Human Factors in Computing Systems, pp 1207–1212. ISBN 978-1-4503-2474-8
Lowery BR, Knirk FG (1982) Micro-computer video games and spatial visualization acquisition. J Educ Technol Syst 11(2):155–166
Luck SJ (2007) Scholarpedia 2(6):3328. https://doi.org/10.4249/scholarpedia.3328
Lumosity (2016) Retrieved April 21, 2016, from http://www.lumosity.com/
Luna-Oliva L, Ortiz-Gutiérrez RM, Cano-de la Cuerda R, Piédrola RM, Alguacil-Diego IM, Sánchez-Camarero C, Martínez Culebras MC (2013) Kinect Xbox 360 as a therapeutic modality for children with cerebral palsy in a school environment: a preliminary study. NeuroRehabilitation 33(4):513–521
Maillot P, Perrot A, Hartley A (2012) Effects of interactive physical-activity video-game training on physical and cognitive function in older adults. Psychol Aging, Am Psychol Assoc 27(3):589–600
Matsushima et al (2014) Touch Screen Rehabilitation System Prototype Based on Cognitive Exercise Therapy. HCI Int Commun Comput Inf Sci 435:361–365
McClurg PA, Chaille C (1987) Computer games: environments for developing spatial cognition? J Educ Comput Res 3(1):95–111
McDermott AF, Bavelier D, Green CS (2014) Memory abilities in action video game players. Comput Hum Behav 34:69–78
Melenhorst AS (2002) Adopting communication technology in later life: the decisive role of benefits. PhD dissertation, Eindhoven: Technische Universiteit Eindhoven
Montani V et al (2014) A new adaptive video game for training attention and executive functions: design principles and initial validation. Front Psychol 5:409
Navarro J et al (2013) Game Based Monitoring and Cognitive Therapy for Elderly. Workshop Proceedings of the 9th International Conference on Intelligent Environments 17:116–127
Nishiguchi S, Yamada M, Fukutani N, Adachi D, Tashiro Y, Hotta T, Morino S, Aoyama T, Tsuboyama T (2015) Spot the difference for cognitive decline: a quick memory and attention test for screening cognitive decline. J Clin Gerontol Geriatr 6(1):9–14
Oei AC, Patterson MD (2013) Enhancing cognition with video games: a multiple game training study. PLoS One 8(3)
Oei AC, Patterson MD (2014) Playing a puzzle video game with changing requirements improves executive functions. Comput Hum Behav 37:216–228
Okagaki L, Frensch PA (1994) Effects of video game playing on measures of spatial performance: gender effects in late adolescence. J Appl Dev Psychol 15:33–58
Patrícia B et al (2013) Video game training to improve selective visual attention in older adults. Comput Hum Behav 29(4):1318–1324
Pearce C (2008) The truth about baby boomer gamers: a study of over-forty computer game players. Games Cult 3:142–174
Plowman L, Luckin R (2004) Interactivity, Interface, and smart toys. IEEE Computer 37(2):98–100
Prensky M (2001) Digital natives, digital immigrants. On the Horizon NCB University Press 9 (5)
Riesenhuber M, Poggio T (1999) Hierarchical models of object recognition in cortex. Nat Neurosci 2:1019–1025. https://doi.org/10.1038/14819
Roschelle JM, Pea RD, Hoadley CM, Gordin DN, Means BM (2000) Changing how and what children learn in school with computer-based technologies. Futur Child 10(2):76–100
Rui W, Fanglin C, Zhenyu C, Tianxing L, Gabriella H, Stefanie T, Xia Z, Dror B, Andrew TC (2014) StudentLife: assessing mental health, academic performance and behavioral trends of college students using smartphones. Proceedings of the ACM conference on ubiquitous computing, 3–14
Ruiz JR, Ortega FB, Castillo R, Martín-Matillas M, Kwak L, Vicente-Rodríguez G et al (2010) Physical activity, fitness, weight status, and cognitive performance in adolescents. J Pediatr 157(6):917–922.e5. https://doi.org/10.1016/j.jpeds.2010.06.026
Ruiz-Ariza A, Grao-Cruces A, Loureiro NEM, Martínez-Lopez EJ (2017) Influence of physical fitness on cognitive and academic performance in adolescents: a systematic review from 2005–2015. Int Rev Sport Exerc Psychol 10(1):108–133. https://doi.org/10.1080/1750984X.2016.1184699
Serences JT, Yantis S (2006) Selective visual attention and perceptual coherence. Trends Cogn Sci 10:38–45. https://doi.org/10.1016/j.tics.2005.11.008
Strobach T, Frensch PA, Schubert T (2012) Video game practice optimizes executive control skills in dual-task and task switching situations. Acta Psychol 140:13–24
Subrahmanyam K, Greenfield PM (1994) Effect of video game practice on spatial skills in girls and boys. J Appl Dev Psychol 15:13–32
Tanabe, Osaka N (2009) Picture span test: Measuring visual working memory capacity involved in remembering and comprehension. Behav Res Methods 41(2):309–317
Tarling A (2005) Older people’s social and leisure time, hobbies and games. Master's Thesis. University of Sussex
Tong T, Chignell M (2014) Developing a serious game for cognitive assessment: choosing settings and measuring performance. In proceedings of the second international symposium of Chinese CHI (Chinese CHI '14). Association for Computing Machinery, New York, NY, USA, pp 70–79. https://doi.org/10.1145/2592235.2592246
Williams JM (1991) Memory assessment scales. Psychological Assessment Resources, Odessa, FL
Wilms IL, Petersen A, Vangkilde S (2013) Intensive video gaming improves encoding speed to visual short-term memory in young male adults. Acta Psychol 142:108–118
Yuji H (1996) Computer games and information-processing skills. Percept Mot Skills 83(2):643–647
Ethics declarations
Ethical approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Ahmad, F., Ahmed, Z., Shaheen, M. et al. A pilot study on the evaluation of cognitive abilities’ cluster through game-based intelligent technique. Multimed Tools Appl 82, 41323–41341 (2023). https://doi.org/10.1007/s11042-023-15100-x